[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15330 1726882250.20706: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15330 1726882250.21338: Added group all to inventory
15330 1726882250.21340: Added group ungrouped to inventory
15330 1726882250.21343: Group all now contains ungrouped
15330 1726882250.21346: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
15330 1726882250.48978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15330 1726882250.49040: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15330 1726882250.49067: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15330 1726882250.49133: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15330 1726882250.49217: Loaded config def from plugin (inventory/script)
15330 1726882250.49219: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15330 1726882250.49260: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15330 1726882250.49354: Loaded config def from plugin (inventory/yaml)
15330 1726882250.49356: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15330 1726882250.49448: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15330 1726882250.49902: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15330 1726882250.49905: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15330 1726882250.49909: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15330 1726882250.49915: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15330 1726882250.49924: Loading data from /tmp/network-Kc3/inventory.yml
15330 1726882250.49999: /tmp/network-Kc3/inventory.yml was not parsable by auto
15330 1726882250.50070: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15330 1726882250.50111: Loading data from /tmp/network-Kc3/inventory.yml
15330 1726882250.50203: group all already in inventory
15330 1726882250.50210: set inventory_file for managed_node1
15330 1726882250.50214: set inventory_dir for managed_node1
15330 1726882250.50216: Added host managed_node1 to inventory
15330 1726882250.50218: Added host managed_node1 to group all
15330 1726882250.50219: set ansible_host for managed_node1
15330 1726882250.50220: set ansible_ssh_extra_args for managed_node1
15330 1726882250.50223: set inventory_file for managed_node2
15330 1726882250.50225: set inventory_dir for managed_node2
15330 1726882250.50226: Added host managed_node2 to inventory
15330 1726882250.50228: Added host managed_node2 to group all
15330 1726882250.50229: set ansible_host for managed_node2
15330 1726882250.50229: set ansible_ssh_extra_args for managed_node2
15330
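The deprecation warning at the top of this run points at the plural ANSIBLE_COLLECTIONS_PATHS environment variable, and the run found no config file ("No config file found; using defaults"). A minimal sketch of the fix: switch to the singular ANSIBLE_COLLECTIONS_PATH, or persist the same settings in an ansible.cfg. The path below is the collection location reported in the log; whether it should persist beyond this test run is an assumption.

```ini
; ansible.cfg -- minimal sketch, assuming the collection location from this run
[defaults]
; replaces the deprecated ANSIBLE_COLLECTIONS_PATHS / collections_paths form
collections_path = /tmp/collections-spT
; silences [DEPRECATION WARNING] messages like the one above
deprecation_warnings = False
```

Equivalently, `export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-spT` (singular) in the environment avoids the warning without a config file.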
1726882250.50232: set inventory_file for managed_node3
15330 1726882250.50234: set inventory_dir for managed_node3
15330 1726882250.50235: Added host managed_node3 to inventory
15330 1726882250.50237: Added host managed_node3 to group all
15330 1726882250.50237: set ansible_host for managed_node3
15330 1726882250.50238: set ansible_ssh_extra_args for managed_node3
15330 1726882250.50241: Reconcile groups and hosts in inventory.
15330 1726882250.50249: Group ungrouped now contains managed_node1
15330 1726882250.50251: Group ungrouped now contains managed_node2
15330 1726882250.50253: Group ungrouped now contains managed_node3
15330 1726882250.50333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
15330 1726882250.50468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
15330 1726882250.50521: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
15330 1726882250.50548: Loaded config def from plugin (vars/host_group_vars)
15330 1726882250.50551: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
15330 1726882250.50557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
15330 1726882250.50565: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
15330 1726882250.50615: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
15330 1726882250.51473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882250.51704: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
15330 1726882250.51863: Loaded config def from plugin (connection/local)
15330 1726882250.51866: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
15330 1726882250.52796: Loaded config def from plugin (connection/paramiko_ssh)
15330 1726882250.52799: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
15330 1726882250.53732: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15330 1726882250.53771: Loaded config def from plugin (connection/psrp)
15330 1726882250.53774: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
15330 1726882250.54532: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15330 1726882250.54578: Loaded config def from plugin (connection/ssh)
15330 1726882250.54581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
15330 1726882250.56637: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15330 1726882250.56685: Loaded config def from plugin (connection/winrm)
15330 1726882250.56689: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
15330 1726882250.56725:
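From the inventory events above one can infer the rough shape of /tmp/network-Kc3/inventory.yml as parsed by the yaml inventory plugin: three hosts under `all` (which end up in `ungrouped`), each carrying `ansible_host` and `ansible_ssh_extra_args` host vars. A hypothetical reconstruction follows; the actual values do not appear in this portion of the log (only managed_node3's address, 10.31.10.229, shows up later in the SSH debug output), so they are left as placeholders.

```yaml
# /tmp/network-Kc3/inventory.yml -- hypothetical reconstruction; values elided
all:
  hosts:
    managed_node1:
      ansible_host: "..."            # placeholder, not shown in the log
      ansible_ssh_extra_args: "..."  # placeholder
    managed_node2:
      ansible_host: "..."
      ansible_ssh_extra_args: "..."
    managed_node3:
      ansible_host: "..."
      ansible_ssh_extra_args: "..."
```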
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
15330 1726882250.56799: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
15330 1726882250.56876: Loaded config def from plugin (shell/cmd)
15330 1726882250.56879: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
15330 1726882250.56907: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
15330 1726882250.56981: Loaded config def from plugin (shell/powershell)
15330 1726882250.56983: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
15330 1726882250.57038: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
15330 1726882250.57231: Loaded config def from plugin (shell/sh)
15330 1726882250.57234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
15330 1726882250.57268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
15330 1726882250.57389: Loaded config def from plugin (become/runas)
15330 1726882250.57391: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
15330 1726882250.57594: Loaded config def from plugin (become/su)
15330 1726882250.57597: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
15330 1726882250.57770: Loaded config def from plugin (become/sudo)
15330 1726882250.57773: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
15330 1726882250.57809: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15330 1726882250.58162: in VariableManager get_vars()
15330 1726882250.58191: done with get_vars()
15330 1726882250.58330: trying /usr/local/lib/python3.12/site-packages/ansible/modules
15330 1726882250.61572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
15330 1726882250.61695: in VariableManager get_vars()
15330 1726882250.61700: done with get_vars()
15330 1726882250.61819: variable 'playbook_dir' from source: magic vars
15330 1726882250.61820: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.61821: variable 'ansible_config_file' from source: magic vars
15330 1726882250.61822: variable 'groups' from source: magic vars
15330 1726882250.61823: variable 'omit' from source: magic vars
15330 1726882250.61823: variable 'ansible_version' from source: magic vars
15330 1726882250.61824: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.61825: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.61826: variable 'ansible_forks' from source: magic vars
15330 1726882250.61826: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.61827: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.61828: variable 'ansible_limit' from source: magic vars
15330 1726882250.61828: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.61829: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.61864: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
15330 1726882250.62552: in VariableManager get_vars()
15330 1726882250.62569: done with get_vars()
15330 1726882250.62652: in VariableManager get_vars()
15330 1726882250.62665: done with get_vars()
15330 1726882250.62703: in VariableManager get_vars()
15330 1726882250.62715: done with get_vars()
15330 1726882250.62785: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15330 1726882250.63006: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15330 1726882250.63147: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15330 1726882250.63936: in VariableManager get_vars()
15330 1726882250.63956: done with get_vars()
15330 1726882250.64452: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
15330 1726882250.64591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15330 1726882250.66120: in VariableManager get_vars()
15330 1726882250.66124: done with get_vars()
15330 1726882250.66126: variable 'playbook_dir' from source: magic vars
15330 1726882250.66127: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.66128: variable 'ansible_config_file' from source: magic vars
15330 1726882250.66129: variable 'groups' from source: magic vars
15330 1726882250.66129: variable 'omit' from source: magic vars
15330 1726882250.66130: variable 'ansible_version' from source: magic vars
15330 1726882250.66131: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.66132: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.66133: variable 'ansible_forks' from source: magic vars
15330 1726882250.66133: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.66134: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.66135: variable 'ansible_limit' from source: magic vars
15330 1726882250.66135: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.66136: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.66169: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15330 1726882250.66271: in VariableManager get_vars()
15330 1726882250.66284: done with get_vars()
15330 1726882250.66327: in VariableManager get_vars()
15330 1726882250.66329: done with get_vars()
15330 1726882250.66332: variable 'playbook_dir' from source: magic vars
15330 1726882250.66333: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.66333: variable 'ansible_config_file' from source: magic vars
15330 1726882250.66334: variable 'groups' from source: magic vars
15330 1726882250.66335: variable 'omit' from source: magic vars
15330 1726882250.66335: variable 'ansible_version' from source: magic vars
15330 1726882250.66336: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.66337: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.66338: variable 'ansible_forks' from source: magic vars
15330 1726882250.66338: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.66339: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.66340: variable 'ansible_limit' from source: magic vars
15330 1726882250.66340: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.66341: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.66371: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15330 1726882250.66443: in VariableManager get_vars()
15330 1726882250.66456: done with get_vars()
15330 1726882250.66513: in VariableManager get_vars()
15330 1726882250.66516: done with get_vars()
15330 1726882250.66518: variable 'playbook_dir' from source: magic vars
15330 1726882250.66519: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.66519: variable 'ansible_config_file' from source: magic vars
15330 1726882250.66520: variable 'groups' from source: magic vars
15330 1726882250.66521: variable 'omit' from source: magic vars
15330 1726882250.66521: variable 'ansible_version' from source: magic vars
15330 1726882250.66522: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.66523: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.66523: variable 'ansible_forks' from source: magic vars
15330 1726882250.66529: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.66530: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.66530: variable 'ansible_limit' from source: magic vars
15330 1726882250.66531: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.66532: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.66561: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
15330 1726882250.66673: in VariableManager get_vars()
15330 1726882250.66676: done with get_vars()
15330 1726882250.66678: variable 'playbook_dir' from source: magic vars
15330 1726882250.66679: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.66680: variable 'ansible_config_file' from source: magic vars
15330 1726882250.66680: variable 'groups' from source: magic vars
15330 1726882250.66681: variable 'omit' from source: magic vars
15330 1726882250.66682: variable 'ansible_version' from source: magic vars
15330 1726882250.66682: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.66683: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.66683: variable 'ansible_forks' from source: magic vars
15330 1726882250.66684: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.66685: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.66685: variable 'ansible_limit' from source: magic vars
15330 1726882250.66686: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.66687: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.66717: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
15330 1726882250.66791: in VariableManager get_vars()
15330 1726882250.66804: done with get_vars()
15330 1726882250.66855: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15330 1726882250.66976: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15330 1726882250.67113: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15330 1726882250.67547: in VariableManager get_vars()
15330 1726882250.67566: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15330 1726882250.69268: in VariableManager get_vars()
15330 1726882250.69282: done with get_vars()
15330 1726882250.69525: in VariableManager get_vars()
15330 1726882250.69529: done with get_vars()
15330 1726882250.69531: variable 'playbook_dir' from source: magic vars
15330 1726882250.69532: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.69533: variable 'ansible_config_file' from source: magic vars
15330 1726882250.69533: variable 'groups' from source: magic vars
15330 1726882250.69534: variable 'omit' from source: magic vars
15330 1726882250.69535: variable 'ansible_version' from source: magic vars
15330 1726882250.69536: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.69536: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.69537: variable 'ansible_forks' from source: magic vars
15330 1726882250.69538: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.69538: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.69539: variable 'ansible_limit' from source: magic vars
15330 1726882250.69540: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.69540: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.69572: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
15330 1726882250.69758: in VariableManager get_vars()
15330 1726882250.69769: done with get_vars()
15330 1726882250.69913: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15330 1726882250.70154: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15330 1726882250.70256: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15330 1726882250.73536: in VariableManager get_vars()
15330 1726882250.73560: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15330 1726882250.75538: in VariableManager get_vars()
15330 1726882250.75542: done with get_vars()
15330 1726882250.75545: variable 'playbook_dir' from source: magic vars
15330 1726882250.75546: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.75546: variable 'ansible_config_file' from source: magic vars
15330 1726882250.75547: variable 'groups' from source: magic vars
15330 1726882250.75548: variable 'omit' from source: magic vars
15330 1726882250.75548: variable 'ansible_version' from source: magic vars
15330 1726882250.75549: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.75550: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.75550: variable 'ansible_forks' from source: magic vars
15330 1726882250.75551: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.75552: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.75553: variable 'ansible_limit' from source: magic vars
15330 1726882250.75605: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.75606: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.75642: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15330 1726882250.75792: in VariableManager get_vars()
15330 1726882250.75909: done with get_vars()
15330 1726882250.75953: in VariableManager get_vars()
15330 1726882250.75956: done with get_vars()
15330 1726882250.75958: variable 'playbook_dir' from source: magic vars
15330 1726882250.75959: variable 'ansible_playbook_python' from source: magic vars
15330 1726882250.75960: variable 'ansible_config_file' from source: magic vars
15330 1726882250.75961: variable 'groups' from source: magic vars
15330 1726882250.75962: variable 'omit' from source: magic vars
15330 1726882250.75963: variable 'ansible_version' from source: magic vars
15330 1726882250.75964: variable 'ansible_check_mode' from source: magic vars
15330 1726882250.75964: variable 'ansible_diff_mode' from source: magic vars
15330 1726882250.75965: variable 'ansible_forks' from source: magic vars
15330 1726882250.75966: variable 'ansible_inventory_sources' from source: magic vars
15330 1726882250.75967: variable 'ansible_skip_tags' from source: magic vars
15330 1726882250.75967: variable 'ansible_limit' from source: magic vars
15330 1726882250.75968: variable 'ansible_run_tags' from source: magic vars
15330 1726882250.75969: variable 'ansible_verbosity' from source: magic vars
15330 1726882250.76005: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15330 1726882250.76081:
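Each numbered trace line in this log follows the pattern `<pid> <epoch-seconds>: <message>` (pid 15330 throughout this run); the fractional epoch values line up with the human-readable timestamps the profile_tasks callback prints later. A small illustrative parser, assuming only that observed format (`parse_entry` is a hypothetical helper, not part of Ansible):

```python
from datetime import datetime, timezone

def parse_entry(line: str):
    """Split an ansible -vvvv trace line into (pid, epoch, message).

    Format observed in this log: '<pid> <unix-epoch-seconds>: <message>'.
    """
    head, _, message = line.partition(": ")
    pid, epoch = head.split()
    return int(pid), float(epoch), message

pid, epoch, msg = parse_entry("15330 1726882250.20706: starting run")
# 1726882250 is 2024-09-21 01:30:50 UTC, i.e. 20 Sep 2024 21:30:50 -0400,
# matching the profile_tasks timestamp shown under "TASK [Gathering Facts]".
when = datetime.fromtimestamp(epoch, tz=timezone.utc)
print(pid, when.isoformat(), msg)
```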
in VariableManager get_vars()
15330 1726882250.76317: done with get_vars()
15330 1726882250.76388: in VariableManager get_vars()
15330 1726882250.76402: done with get_vars()
15330 1726882250.76704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
15330 1726882250.76717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
15330 1726882250.77215: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
15330 1726882250.77420: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
15330 1726882250.77429: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
15330 1726882250.77459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
15330 1726882250.77486: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
15330 1726882250.77655: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
15330 1726882250.77713: Loaded config def from plugin (callback/default)
15330 1726882250.77716: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15330 1726882250.78920: Loaded config def from plugin (callback/junit)
15330 1726882250.78928: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15330 1726882250.78971: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
15330 1726882250.79048: Loaded config def from plugin (callback/minimal)
15330 1726882250.79051: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15330 1726882250.79174: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15330 1726882250.79236: Loaded config def from plugin (callback/tree)
15330 1726882250.79238: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15330 1726882250.79491: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15330 1726882250.79498: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15330 1726882250.79525: in VariableManager get_vars()
15330 1726882250.79539: done with get_vars()
15330 1726882250.79544: in VariableManager get_vars()
15330 1726882250.79551: done with get_vars()
15330 1726882250.79554: variable 'omit' from source: magic vars
15330 1726882250.79599: in VariableManager get_vars()
15330 1726882250.79615: done with get_vars()
15330 1726882250.79637: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
15330 1726882250.80222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15330 1726882250.80305: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15330 1726882250.80335: getting the remaining hosts for this loop
15330 1726882250.80342: done getting the remaining hosts for this loop
15330 1726882250.80349: getting the next task for host managed_node3
15330 1726882250.80353: done getting next task for host managed_node3
15330 1726882250.80355: ^ task is: TASK: Gathering Facts
15330 1726882250.80356: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882250.80358: getting variables
15330 1726882250.80359: in VariableManager get_vars()
15330 1726882250.80369: Calling all_inventory to load vars for managed_node3
15330 1726882250.80371: Calling groups_inventory to load vars for managed_node3
15330 1726882250.80373: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882250.80385: Calling all_plugins_play to load vars for managed_node3
15330 1726882250.80400: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882250.80405: Calling groups_plugins_play to load vars for managed_node3
15330 1726882250.80476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882250.80533: done with get_vars()
15330 1726882250.80540: done getting variables
15330 1726882250.80615: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Friday 20 September 2024 21:30:50 -0400 (0:00:00.012) 0:00:00.012 ******
15330 1726882250.80636: entering _queue_task() for managed_node3/gather_facts
15330 1726882250.80637: Creating lock for gather_facts
15330 1726882250.81145: worker is 1 (out of 1 available)
15330 1726882250.81154: exiting _queue_task() for managed_node3/gather_facts
15330 1726882250.81167: done queuing things up, now waiting for results queue to drain
15330 1726882250.81169: waiting for pending results...
15330 1726882250.81342: running TaskExecutor() for managed_node3/TASK: Gathering Facts
15330 1726882250.81501: in run() - task 12673a56-9f93-e4fe-1358-00000000007e
15330 1726882250.81506: variable 'ansible_search_path' from source: unknown
15330 1726882250.81509: calling self._execute()
15330 1726882250.81573: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882250.81584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882250.81601: variable 'omit' from source: magic vars
15330 1726882250.81731: variable 'omit' from source: magic vars
15330 1726882250.81747: variable 'omit' from source: magic vars
15330 1726882250.81801: variable 'omit' from source: magic vars
15330 1726882250.81874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15330 1726882250.81914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15330 1726882250.81949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15330 1726882250.81983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882250.82096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882250.82100: variable 'inventory_hostname' from source: host vars for 'managed_node3'
15330 1726882250.82103: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882250.82105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882250.82181: Set connection var ansible_pipelining to False
15330 1726882250.82212: Set connection var ansible_timeout to 10
15330 1726882250.82225: Set connection var ansible_connection to ssh
15330 1726882250.82232: Set connection var ansible_shell_type to sh
15330 1726882250.82242: Set connection var ansible_shell_executable to /bin/sh
15330 1726882250.82250: Set connection var ansible_module_compression to ZIP_DEFLATED
15330 1726882250.82332: variable 'ansible_shell_executable' from source: unknown
15330 1726882250.82340: variable 'ansible_connection' from source: unknown
15330 1726882250.82354: variable 'ansible_module_compression' from source: unknown
15330 1726882250.82422: variable 'ansible_shell_type' from source: unknown
15330 1726882250.82425: variable 'ansible_shell_executable' from source: unknown
15330 1726882250.82428: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882250.82430: variable 'ansible_pipelining' from source: unknown
15330 1726882250.82433: variable 'ansible_timeout' from source: unknown
15330 1726882250.82435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882250.82640: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15330 1726882250.82644: variable 'omit' from source: magic vars
15330 1726882250.82646: starting attempt loop
15330 1726882250.82648: running the handler
15330 1726882250.82653: variable 'ansible_facts' from source: unknown
15330 1726882250.82700: _low_level_execute_command(): starting
15330 1726882250.82703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15330 1726882250.83458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15330 1726882250.83474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15330 1726882250.83490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15330 1726882250.83521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15330 1726882250.83578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882250.83644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<<
15330 1726882250.83674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15330 1726882250.83690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15330 1726882250.83776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15330 1726882250.85470: stdout chunk (state=3): >>>/root <<<
15330 1726882250.85705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15330 1726882250.85709: stdout chunk (state=3): >>><<<
15330 1726882250.85711: stderr chunk (state=3): >>><<<
15330 1726882250.85714: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882250.85717: _low_level_execute_command(): starting 15330 1726882250.85720: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867 `" && echo ansible-tmp-1726882250.856447-15360-196240918800867="` echo /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867 `" ) && sleep 0' 15330 1726882250.86387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882250.86402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882250.86506: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882250.86526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882250.86547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882250.86623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882250.88506: stdout chunk (state=3): >>>ansible-tmp-1726882250.856447-15360-196240918800867=/root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867 <<< 15330 1726882250.88649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882250.88669: stdout chunk (state=3): >>><<< 15330 1726882250.88680: stderr chunk (state=3): >>><<< 15330 1726882250.88704: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882250.856447-15360-196240918800867=/root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882250.88899: variable 'ansible_module_compression' from source: unknown 15330 1726882250.88902: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15330 1726882250.88904: ANSIBALLZ: Acquiring lock 15330 1726882250.88906: ANSIBALLZ: Lock acquired: 140238209361168 15330 1726882250.88908: ANSIBALLZ: Creating module 15330 1726882251.16013: ANSIBALLZ: Writing module into payload 15330 1726882251.16164: ANSIBALLZ: Writing module 15330 1726882251.16190: ANSIBALLZ: Renaming module 15330 1726882251.16207: ANSIBALLZ: Done creating module 15330 1726882251.16246: variable 'ansible_facts' from source: unknown 15330 1726882251.16257: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882251.16270: _low_level_execute_command(): starting 15330 1726882251.16350: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15330 1726882251.16891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882251.16913: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 15330 1726882251.16927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882251.16944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882251.16960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882251.16971: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882251.16984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882251.17015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882251.17027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882251.17046: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882251.17062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882251.17211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882251.17329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882251.17423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882251.17649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15330 1726882251.19931: stdout chunk (state=3): >>>PLATFORM <<< 15330 1726882251.20111: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 <<< 15330 1726882251.20114: stdout chunk (state=3): 
>>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15330 1726882251.20317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882251.20320: stdout chunk (state=3): >>><<< 15330 1726882251.20323: stderr chunk (state=3): >>><<< 15330 1726882251.20325: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15330 1726882251.20330 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15330 1726882251.20405: _low_level_execute_command(): starting 15330 1726882251.20408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15330 1726882251.20724: Sending initial data 15330 1726882251.20727: Sent initial data (1181 bytes) 15330 
1726882251.21562: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882251.21581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882251.21900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882251.21950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882251.21982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882251.22141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15330 1726882251.27106: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat 
Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15330 1726882251.27854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882251.27859: stdout chunk (state=3): >>><<< 15330 1726882251.27861: stderr chunk (state=3): >>><<< 15330 1726882251.27864: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15330 1726882251.27918: variable 'ansible_facts' from source: unknown 15330 1726882251.27927: variable 'ansible_facts' from source: unknown 15330 1726882251.27944: variable 'ansible_module_compression' from source: unknown 15330 1726882251.28000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882251.28035: variable 'ansible_facts' from source: unknown 15330 1726882251.28377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py 15330 1726882251.28576: Sending initial data 15330 1726882251.28579: Sent initial data (153 bytes) 15330 1726882251.29414: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882251.29568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882251.29582: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15330 1726882251.29810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882251.31864: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882251.32025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882251.32094: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp84qsuzb0 /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py <<< 15330 1726882251.32098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py" <<< 15330 1726882251.32168: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp84qsuzb0" to remote "/root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py" <<< 15330 1726882251.35604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882251.35615: stdout chunk (state=3): >>><<< 15330 1726882251.35636: stderr chunk (state=3): >>><<< 15330 1726882251.35708: done transferring module to remote 15330 1726882251.35885: _low_level_execute_command(): starting 15330 1726882251.35889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/ /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py && sleep 0' 15330 1726882251.36581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882251.36598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882251.36614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882251.36669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882251.36738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882251.36753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882251.36781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882251.36986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882251.39200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882251.39221: stdout chunk (state=3): >>><<< 15330 1726882251.39224: stderr chunk (state=3): >>><<< 15330 1726882251.39239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882251.39247: _low_level_execute_command(): starting 15330 1726882251.39261: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/AnsiballZ_setup.py && sleep 0' 15330 1726882251.40614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882251.40670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882251.40691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882251.40709: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15330 1726882251.40828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882251.43968: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15330 1726882251.44019: stdout chunk (state=3): >>>import '_io' # <<< 15330 1726882251.44155: stdout chunk (state=3): >>>import 'marshal' # <<< 15330 1726882251.44159: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 15330 1726882251.44215: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882251.44249: stdout chunk (state=3): >>>import '_codecs' # <<< 15330 1726882251.44358: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15330 1726882251.44461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf3604d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf32fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf362a50> import '_signal' # import '_abc' # import 'abc' # <<< 15330 1726882251.44483: stdout chunk (state=3): >>>import 'io' # <<< 15330 1726882251.44570: 
stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15330 1726882251.44654: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15330 1726882251.44690: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15330 1726882251.44822: stdout chunk (state=3): >>>import 'os' # <<< 15330 1726882251.44826: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 15330 1726882251.44829: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 15330 1726882251.45032: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf371130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf371fa0> <<< 15330 1726882251.45036: stdout chunk (state=3): >>>import 'site' # <<< 15330 1726882251.45049: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15330 1726882251.45775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15330 1726882251.45800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15330 1726882251.45826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15330 1726882251.45855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15330 1726882251.45998: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14fda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14ffb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15330 1726882251.46150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from 
'/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf187770> <<< 15330 1726882251.46192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15330 1726882251.46216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf187e00> import '_collections' # <<< 15330 1726882251.46289: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf167a40> <<< 15330 1726882251.46323: stdout chunk (state=3): >>>import '_functools' # <<< 15330 1726882251.46410: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf165160> <<< 15330 1726882251.46470: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14cf50> <<< 15330 1726882251.46543: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 15330 1726882251.46644: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15330 1726882251.46883: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1a76b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fa8bf1a62d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf166030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1a4b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dc6b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14c1d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.46886: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1dcb60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dca10> <<< 15330 1726882251.47198: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1dcdd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14acf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15330 1726882251.47202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dd4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dd190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15330 1726882251.47232: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1de3c0> import 'importlib.util' # <<< 15330 1726882251.47363: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1f85c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1f9d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15330 
1726882251.47383: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15330 1726882251.47407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1faba0> <<< 15330 1726882251.47457: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1fb200> <<< 15330 1726882251.47476: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1fa0f0> <<< 15330 1726882251.47508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15330 1726882251.47517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15330 1726882251.47571: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1fbc80> <<< 15330 1726882251.47668: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1fb3b0> <<< 15330 1726882251.47671: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1de330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15330 1726882251.47718: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15330 1726882251.47775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15330 1726882251.47799: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8beeebbf0> <<< 15330 1726882251.47880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15330 1726882251.47956: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef146e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef14440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef14710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15330 1726882251.48024: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.48213: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef14fe0> <<< 15330 1726882251.48363: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.48392: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef159d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef14890> <<< 15330 1726882251.48434: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8beee9d90> <<< 15330 1726882251.48438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15330 1726882251.48495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15330 1726882251.48499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15330 1726882251.48515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15330 1726882251.48662: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef16db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef15af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1deae0> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15330 1726882251.48684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882251.48696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15330 1726882251.48738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15330 1726882251.48769: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef3f110> <<< 15330 1726882251.48848: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15330 1726882251.48870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882251.48898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15330 1726882251.48916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15330 1726882251.48979: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef63470> <<< 15330 1726882251.48999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15330 1726882251.49063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15330 1726882251.49145: stdout chunk (state=3): >>>import 'ntpath' # <<< 15330 1726882251.49175: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8befc4290> <<< 15330 1726882251.49211: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15330 1726882251.49242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15330 1726882251.49275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15330 1726882251.49460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8befc69f0> <<< 15330 1726882251.49572: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8befc43b0> <<< 15330 1726882251.49614: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef91280> <<< 15330 1726882251.49659: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bedc93d0> <<< 15330 1726882251.49690: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef62270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef17ce0> <<< 15330 1726882251.49959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15330 1726882251.49985: stdout chunk (state=3): >>>import 
'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa8bef62870> <<< 15330 1726882251.50617: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_kapb56nn/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15330 1726882251.50705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15330 1726882251.50734: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee2f170> <<< 15330 1726882251.50761: stdout chunk (state=3): >>>import '_typing' # <<< 15330 1726882251.51037: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee0e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee0d1f0> <<< 15330 1726882251.51052: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.51083: stdout chunk (state=3): >>>import 'ansible' # <<< 15330 1726882251.51106: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.51253: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15330 1726882251.53397: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.55242: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee2d010> <<< 15330 1726882251.55274: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882251.55310: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15330 1726882251.55341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15330 1726882251.55365: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15330 1726882251.55391: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.55413: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bee5eb40> <<< 15330 1726882251.55461: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5e900> <<< 15330 1726882251.55578: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5e210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15330 1726882251.55604: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5e930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee2fb90> <<< 15330 1726882251.55755: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bee5f890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bee5fad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15330 1726882251.55821: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5ffb0> <<< 15330 1726882251.55840: stdout chunk (state=3): >>>import 'pwd' # <<< 15330 1726882251.55859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15330 1726882251.55896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15330 1726882251.55988: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be729c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' 
executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.56029: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be72b3e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15330 1726882251.56046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15330 1726882251.56096: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72c290> <<< 15330 1726882251.56132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15330 1726882251.56146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15330 1726882251.56169: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72d3d0> <<< 15330 1726882251.56192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15330 1726882251.56244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15330 1726882251.56585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72fec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf14ade0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72e180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15330 1726882251.56909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15330 1726882251.56986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be737ef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7369c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be736720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be736c90> <<< 15330 1726882251.57043: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72e690> <<< 15330 1726882251.57221: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be77bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be77dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77d9d0> <<< 15330 1726882251.57258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15330 1726882251.57291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15330 1726882251.57480: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fa8be780200> <<< 15330 1726882251.57547: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7839e0> <<< 15330 1726882251.57726: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7803b0> <<< 15330 1726882251.57805: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be7847a0> <<< 15330 1726882251.57845: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be784a10> <<< 15330 1726882251.57902: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be784da0> <<< 15330 1726882251.57920: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77c320> <<< 15330 1726882251.57973: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15330 1726882251.58008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15330 1726882251.58077: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.58235: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6103e0> <<< 15330 1726882251.58405: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.58411: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be786b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be787f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7867b0> <<< 15330 1726882251.58435: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.58438: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15330 1726882251.58463: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.58810: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15330 1726882251.58959: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.59140: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.60025: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.60924: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15330 1726882251.60956: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15330 1726882251.60999: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15330 1726882251.61028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882251.61066: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6156d0> <<< 15330 1726882251.61233: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15330 1726882251.61281: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be616540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be611610> <<< 15330 1726882251.61328: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 15330 1726882251.61332: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.61343: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 15330 1726882251.61547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.61590: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.62045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be616c30> # zipimport: zlib available <<< 15330 1726882251.62603: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.63329: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.63431: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.63545: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15330 1726882251.63557: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.63606: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 15330 1726882251.63841: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882251.63889: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15330 1726882251.63904: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.63924: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 15330 1726882251.63946: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.64002: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.64046: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15330 1726882251.64068: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.64434: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.64933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15330 1726882251.65023: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6176e0> <<< 15330 1726882251.65027: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65127: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65222: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 15330 1726882251.65255: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 15330 1726882251.65274: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65331: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65383: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.locale' # <<< 15330 1726882251.65409: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65464: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65521: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65598: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.65857: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6222d0> <<< 15330 1726882251.65924: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be61d0a0> <<< 15330 1726882251.65965: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15330 1726882251.65969: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.66060: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.66154: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.66184: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.66240: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882251.66322: 
stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15330 1726882251.66479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15330 1726882251.66537: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be70ac30> <<< 15330 1726882251.66609: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7fe900> <<< 15330 1726882251.66719: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be622060> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15330 1726882251.66738: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.66857: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15330 1726882251.66892: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15330 1726882251.66933: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.66947: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15330 1726882251.67032: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.67179: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 15330 1726882251.67209: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.67233: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.67280: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.67459: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15330 1726882251.67827: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15330 1726882251.68127: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.68531: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15330 1726882251.68535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15330 1726882251.68569: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b6360> <<< 15330 1726882251.68584: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15330 1726882251.68644: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15330 1726882251.68783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3102f0> <<< 15330 1726882251.68954: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be310560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be69c8f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b6f00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b4a40> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b5430> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15330 1726882251.69032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15330 1726882251.69065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 15330 1726882251.69121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15330 1726882251.69159: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be313680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be312f30> <<< 15330 1726882251.69231: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be313110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be312360> <<< 15330 1726882251.69248: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15330 1726882251.69419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15330 1726882251.69460: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be313830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15330 1726882251.69498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15330 1726882251.69540: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be372360> <<< 15330 1726882251.69664: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be370380> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b47a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 15330 1726882251.69718: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.69791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15330 1726882251.69810: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.69883: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.69965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 15330 1726882251.69978: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15330 1726882251.70043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15330 1726882251.70165: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 15330 1726882251.70349: stdout chunk (state=3): 
>>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 15330 1726882251.70364: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70445: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70527: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70609: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.70681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 15330 1726882251.70705: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.71488: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 15330 1726882251.72065: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72319: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 15330 1726882251.72323: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15330 1726882251.72387: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72423: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 15330 1726882251.72458: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72552: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib 
available <<< 15330 1726882251.72613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.72692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15330 1726882251.72773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3725d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15330 1726882251.72949: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be373230> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15330 1726882251.72988: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.73041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15330 1726882251.73308: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15330 1726882251.73314: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.73409: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.73504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15330 1726882251.73599: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882251.73641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15330 1726882251.73706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15330 1726882251.73796: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.73887: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be3b2780> <<< 15330 1726882251.74192: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3a23f0> <<< 15330 1726882251.74204: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 15330 1726882251.74214: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.74299: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.74373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15330 1726882251.74459: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.74647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.74664: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.74852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.75091: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 15330 1726882251.75109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 15330 1726882251.75133: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.75192: stdout chunk (state=3): >>># zipimport: zlib available<<< 15330 1726882251.75212: stdout chunk (state=3): >>> <<< 15330 1726882251.75250: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15330 1726882251.75277: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.75331: stdout chunk (state=3): >>># zipimport: zlib available<<< 15330 1726882251.75430: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15330 1726882251.75523: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882251.75529: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be3c5d00> <<< 15330 1726882251.75582: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3a32c0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882251.75589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 15330 1726882251.75622: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.75636: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.75665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 15330 1726882251.75684: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.75874: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76060: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15330 1726882251.76064: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76161: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76264: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76305: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15330 1726882251.76353: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 15330 1726882251.76382: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76385: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76422: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.76702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15330 1726882251.76708: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.77020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # <<< 15330 1726882251.77024: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.77077: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.77132: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.78049: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.78857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15330 1726882251.78863: stdout chunk (state=3): >>> <<< 15330 1726882251.78899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available<<< 15330 1726882251.78902: stdout chunk (state=3): >>> <<< 15330 1726882251.79212: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15330 1726882251.79224: stdout chunk (state=3): >>> <<< 15330 1726882251.79237: stdout chunk (state=3): >>># zipimport: zlib available<<< 15330 1726882251.79254: stdout chunk (state=3): >>> <<< 15330 1726882251.79407: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.79514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: 
zlib available <<< 15330 1726882251.79824: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.79855: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15330 1726882251.79882: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.79914: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.79956: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15330 1726882251.80072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.80165: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.80352: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.80547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 15330 1726882251.80569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 15330 1726882251.80608: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.80639: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 15330 1726882251.80870: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15330 1726882251.80925: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 15330 1726882251.80958: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15330 
1726882251.81070: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81144: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15330 1726882251.81392: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15330 1726882251.81674: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81715: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15330 1726882251.81806: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81821: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81848: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15330 1726882251.81864: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81905: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.81930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 15330 1726882251.81963: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82008: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15330 1726882251.82025: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82085: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15330 1726882251.82184: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82209: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15330 1726882251.82244: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15330 1726882251.82289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 15330 1726882251.82322: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82340: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82384: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82427: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82499: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 15330 1726882251.82602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15330 1726882251.82635: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.82691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15330 1726882251.82965: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882251.83089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15330 1726882251.83224: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882251.83268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15330 1726882251.83729: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 15330 1726882251.84052: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15330 1726882251.84079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15330 1726882251.84150: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be1cf3b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be1cd9a0> <<< 15330 1726882251.84179: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be1ccc20> <<< 15330 1726882251.96589: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be216000> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be2149e0> # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be216150> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be215b80> <<< 15330 1726882251.96756: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15330 1726882252.22347: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": 
"ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2975, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 556, "free": 2975}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_fact<<< 15330 1726882252.22361: stdout chunk (state=3): >>>or": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 559, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805019136, "block_size": 4096, "block_total": 65519099, "block_available": 63917241, "block_used": 1601858, "inode_total": 131070960, "inode_available": 131029130, "inode_used": 41830, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": 
"loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl<<< 15330 1726882252.22374: stdout chunk (state=3): >>>_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_ch<<< 15330 1726882252.22579: stdout chunk (state=3): >>>ecksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": 
"38", "day": "20", "hour": "21", "minute": "30", "second": "52", "epoch": "1726882252", "epoch_int": "1726882252", "date": "2024-09-20", "time": "21:30:52", "iso8601_micro": "2024-09-21T01:30:52.218203Z", "iso8601": "2024-09-21T01:30:52Z", "iso8601_basic": "20240920T213052218203", "iso8601_basic_short": "20240920T213052", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_loadavg": {"1m": 1.04052734375, "5m": 0.45654296875, "15m": 0.21044921875}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:<<< 15330 1726882252.22586: stdout chunk (state=3): >>>/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882252.23338: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] 
removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools <<< 15330 1726882252.23389: stdout chunk (state=3): >>># cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing 
warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno <<< 15330 1726882252.23398: stdout chunk (state=3): >>># cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing 
platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 15330 1726882252.23553: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy 
ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local <<< 15330 1726882252.23562: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux <<< 15330 1726882252.23644: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd 
# destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15330 1726882252.23917: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15330 1726882252.23946: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 15330 1726882252.24016: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib <<< 15330 1726882252.24050: stdout chunk (state=3): >>># destroy zipfile._path.glob # destroy ipaddress <<< 15330 1726882252.24090: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy 
systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 15330 1726882252.24110: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15330 1726882252.24235: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 15330 1726882252.24252: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 15330 1726882252.24327: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 15330 1726882252.24331: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 15330 1726882252.24342: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 15330 1726882252.24397: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 15330 1726882252.24478: stdout chunk (state=3): >>># cleanup[3] 
wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon <<< 15330 1726882252.24484: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 15330 1726882252.24561: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 15330 1726882252.24590: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 15330 1726882252.24610: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15330 1726882252.24772: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15330 1726882252.24777: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 15330 1726882252.24810: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15330 1726882252.24859: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15330 1726882252.24880: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15330 1726882252.24902: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15330 1726882252.25003: stdout chunk (state=3): >>># 
destroy codecs # destroy encodings.aliases <<< 15330 1726882252.25029: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 15330 1726882252.25072: stdout chunk (state=3): >>># destroy _hashlib <<< 15330 1726882252.25110: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15330 1726882252.25512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882252.25516: stdout chunk (state=3): >>><<< 15330 1726882252.25518: stderr chunk (state=3): >>><<< 15330 1726882252.25701: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf3604d0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf32fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf362a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf371130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf371fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14fda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14ffb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf187770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf187e00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf167a40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf165160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14cf50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1a76b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1a62d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf166030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1a4b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dc6b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14c1d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1dcb60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dca10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1dcdd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf14acf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dd4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1dd190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1de3c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1f85c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1f9d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1faba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1fb200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1fa0f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf1fbc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1fb3b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1de330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8beeebbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef146e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef14440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef14710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef14fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bef159d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef14890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8beee9d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef16db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef15af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bf1deae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef3f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef63470> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8befc4290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8befc69f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8befc43b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef91280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8bedc93d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef62270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bef17ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa8bef62870> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_kapb56nn/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee2f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee0e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee0d1f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee2d010> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bee5eb40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5e900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5e210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5e930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee2fb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bee5f890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bee5fad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8bee5ffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be729c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be72b3e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72c290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72d3d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72fec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8bf14ade0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72e180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be737ef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7369c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be736720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be736c90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be72e690> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be77bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be77dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be780200> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7839e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7803b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be7847a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be784a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be784da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be77c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6103e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be786b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be787f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7867b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6156d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be616540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be611610> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be616c30> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6176e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be6222d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be61d0a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be70ac30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be7fe900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be622060> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b6360> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3102f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be310560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be69c8f0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b6f00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b4a40> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b5430> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be313680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be312f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be313110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be312360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be313830> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be372360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be370380> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be6b47a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3725d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be373230> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be3b2780> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3a23f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be3c5d00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be3a32c0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8be1cf3b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be1cd9a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be1ccc20> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be216000> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be2149e0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be216150> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8be215b80> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2975, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 556, "free": 2975}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 559, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805019136, "block_size": 4096, "block_total": 65519099, "block_available": 63917241, "block_used": 1601858, "inode_total": 131070960, "inode_available": 131029130, "inode_used": 41830, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "52", "epoch": "1726882252", "epoch_int": "1726882252", "date": "2024-09-20", "time": "21:30:52", "iso8601_micro": "2024-09-21T01:30:52.218203Z", "iso8601": "2024-09-21T01:30:52Z", "iso8601_basic": "20240920T213052218203", "iso8601_basic_short": "20240920T213052", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_loadavg": {"1m": 1.04052734375, "5m": 0.45654296875, "15m": 0.21044921875}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing 
systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref 
# destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
[WARNING]: Module invocation had junk after the JSON data
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
15330 1726882252.26870: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882252.26889: _low_level_execute_command(): starting 15330 1726882252.26892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882250.856447-15360-196240918800867/ > /dev/null 2>&1 && sleep 0' 15330 1726882252.27346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882252.27349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882252.27352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882252.27356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882252.27358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882252.27360: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882252.27410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882252.27449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882252.27496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 1 <<< 15330 1726882252.29755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882252.29776: stderr chunk (state=3): >>><<< 15330 1726882252.29780: stdout chunk (state=3): >>><<< 15330 1726882252.29795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 1 debug2: Received exit status from master 0 15330 1726882252.29804: handler run complete 15330 1726882252.29878: variable 'ansible_facts' from source: unknown 15330 1726882252.29942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.30136: variable 'ansible_facts' from source: unknown 15330 1726882252.30188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.30273: attempt loop complete, returning result 15330 1726882252.30276: _execute() done 15330 1726882252.30278: dumping result to json 15330 1726882252.30289: done dumping result, returning 15330 1726882252.30301: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-00000000007e] 15330 1726882252.30303: sending task result for task 12673a56-9f93-e4fe-1358-00000000007e ok: [managed_node3] 15330 1726882252.31198: no more pending results, returning what we have 15330 1726882252.31212: results queue empty 15330 1726882252.31214: checking for any_errors_fatal 15330 1726882252.31215: done checking for any_errors_fatal 15330 1726882252.31216: checking for max_fail_percentage 15330 1726882252.31218: done checking for max_fail_percentage 15330 1726882252.31218: checking to see if all hosts have failed and the running result is not ok 15330 1726882252.31219: done checking to see if all hosts have failed 15330 1726882252.31221: getting the remaining hosts for this loop 15330 1726882252.31223: done getting the remaining hosts for this loop 15330 1726882252.31226: getting the next task for host managed_node3 15330 1726882252.31233: done getting next task for host managed_node3 15330 1726882252.31234: ^ task is: TASK: meta (flush_handlers) 15330 1726882252.31237: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882252.31240: getting variables 15330 1726882252.31242: in VariableManager get_vars() 15330 1726882252.31265: Calling all_inventory to load vars for managed_node3 15330 1726882252.31268: Calling groups_inventory to load vars for managed_node3 15330 1726882252.31271: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882252.31278: done sending task result for task 12673a56-9f93-e4fe-1358-00000000007e 15330 1726882252.31281: WORKER PROCESS EXITING 15330 1726882252.31291: Calling all_plugins_play to load vars for managed_node3 15330 1726882252.31295: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882252.31299: Calling groups_plugins_play to load vars for managed_node3 15330 1726882252.31505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.31741: done with get_vars() 15330 1726882252.31762: done getting variables 15330 1726882252.31836: in VariableManager get_vars() 15330 1726882252.31846: Calling all_inventory to load vars for managed_node3 15330 1726882252.31849: Calling groups_inventory to load vars for managed_node3 15330 1726882252.31851: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882252.31866: Calling all_plugins_play to load vars for managed_node3 15330 1726882252.31869: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882252.31873: Calling groups_plugins_play to load vars for managed_node3 15330 1726882252.32027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.32220: done with get_vars() 15330 1726882252.32232: done queuing things up, now waiting for results queue to drain 15330 1726882252.32234: results queue empty 15330 
1726882252.32235: checking for any_errors_fatal 15330 1726882252.32237: done checking for any_errors_fatal 15330 1726882252.32242: checking for max_fail_percentage 15330 1726882252.32243: done checking for max_fail_percentage 15330 1726882252.32244: checking to see if all hosts have failed and the running result is not ok 15330 1726882252.32245: done checking to see if all hosts have failed 15330 1726882252.32245: getting the remaining hosts for this loop 15330 1726882252.32246: done getting the remaining hosts for this loop 15330 1726882252.32248: getting the next task for host managed_node3 15330 1726882252.32252: done getting next task for host managed_node3 15330 1726882252.32254: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15330 1726882252.32256: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882252.32258: getting variables 15330 1726882252.32259: in VariableManager get_vars() 15330 1726882252.32266: Calling all_inventory to load vars for managed_node3 15330 1726882252.32268: Calling groups_inventory to load vars for managed_node3 15330 1726882252.32270: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882252.32274: Calling all_plugins_play to load vars for managed_node3 15330 1726882252.32276: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882252.32278: Calling groups_plugins_play to load vars for managed_node3 15330 1726882252.32419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.32902: done with get_vars() 15330 1726882252.32910: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Friday 20 September 2024 21:30:52 -0400 (0:00:01.524) 0:00:01.537 ****** 15330 1726882252.33115: entering _queue_task() for managed_node3/include_tasks 15330 1726882252.33117: Creating lock for include_tasks 15330 1726882252.33907: worker is 1 (out of 1 available) 15330 1726882252.33918: exiting _queue_task() for managed_node3/include_tasks 15330 1726882252.33930: done queuing things up, now waiting for results queue to drain 15330 1726882252.33932: waiting for pending results... 
15330 1726882252.34317: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 15330 1726882252.34503: in run() - task 12673a56-9f93-e4fe-1358-000000000006 15330 1726882252.34507: variable 'ansible_search_path' from source: unknown 15330 1726882252.34539: calling self._execute() 15330 1726882252.34635: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882252.34700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882252.34714: variable 'omit' from source: magic vars 15330 1726882252.34783: _execute() done 15330 1726882252.34792: dumping result to json 15330 1726882252.34803: done dumping result, returning 15330 1726882252.34826: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-e4fe-1358-000000000006] 15330 1726882252.34841: sending task result for task 12673a56-9f93-e4fe-1358-000000000006 15330 1726882252.35074: done sending task result for task 12673a56-9f93-e4fe-1358-000000000006 15330 1726882252.35077: WORKER PROCESS EXITING 15330 1726882252.35159: no more pending results, returning what we have 15330 1726882252.35166: in VariableManager get_vars() 15330 1726882252.35203: Calling all_inventory to load vars for managed_node3 15330 1726882252.35205: Calling groups_inventory to load vars for managed_node3 15330 1726882252.35209: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882252.35223: Calling all_plugins_play to load vars for managed_node3 15330 1726882252.35226: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882252.35228: Calling groups_plugins_play to load vars for managed_node3 15330 1726882252.35582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.35783: done with get_vars() 15330 1726882252.35790: variable 'ansible_search_path' from source: unknown 15330 1726882252.35805: we have 
included files to process 15330 1726882252.35807: generating all_blocks data 15330 1726882252.35808: done generating all_blocks data 15330 1726882252.35809: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15330 1726882252.35810: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15330 1726882252.35813: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15330 1726882252.37264: in VariableManager get_vars() 15330 1726882252.37280: done with get_vars() 15330 1726882252.37292: done processing included file 15330 1726882252.37509: iterating over new_blocks loaded from include file 15330 1726882252.37512: in VariableManager get_vars() 15330 1726882252.37526: done with get_vars() 15330 1726882252.37528: filtering new block on tags 15330 1726882252.37696: done filtering new block on tags 15330 1726882252.37700: in VariableManager get_vars() 15330 1726882252.37711: done with get_vars() 15330 1726882252.37712: filtering new block on tags 15330 1726882252.37734: done filtering new block on tags 15330 1726882252.37741: in VariableManager get_vars() 15330 1726882252.37752: done with get_vars() 15330 1726882252.37754: filtering new block on tags 15330 1726882252.37767: done filtering new block on tags 15330 1726882252.37769: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 15330 1726882252.37775: extending task lists for all hosts with included blocks 15330 1726882252.37823: done extending task lists 15330 1726882252.37824: done processing included files 15330 1726882252.37825: results queue empty 15330 1726882252.37826: checking for any_errors_fatal 15330 1726882252.37827: done checking for any_errors_fatal 15330 
1726882252.37828: checking for max_fail_percentage 15330 1726882252.37829: done checking for max_fail_percentage 15330 1726882252.37830: checking to see if all hosts have failed and the running result is not ok 15330 1726882252.38030: done checking to see if all hosts have failed 15330 1726882252.38032: getting the remaining hosts for this loop 15330 1726882252.38033: done getting the remaining hosts for this loop 15330 1726882252.38036: getting the next task for host managed_node3 15330 1726882252.38040: done getting next task for host managed_node3 15330 1726882252.38043: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15330 1726882252.38045: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882252.38048: getting variables 15330 1726882252.38048: in VariableManager get_vars() 15330 1726882252.38056: Calling all_inventory to load vars for managed_node3 15330 1726882252.38059: Calling groups_inventory to load vars for managed_node3 15330 1726882252.38061: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882252.38071: Calling all_plugins_play to load vars for managed_node3 15330 1726882252.38073: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882252.38076: Calling groups_plugins_play to load vars for managed_node3 15330 1726882252.38351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882252.38777: done with get_vars() 15330 1726882252.38786: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:30:52 -0400 (0:00:00.058) 0:00:01.595 ****** 15330 1726882252.38964: entering _queue_task() for managed_node3/setup 15330 1726882252.39699: worker is 1 (out of 1 available) 15330 1726882252.39711: exiting _queue_task() for managed_node3/setup 15330 1726882252.39722: done queuing things up, now waiting for results queue to drain 15330 1726882252.39724: waiting for pending results... 
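The task queued here later evaluates the conditional `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts` as True, i.e. the `setup` module runs only when the facts already gathered do not cover the required subset. A hedged Python rendering of that Jinja expression (the helper name is invented for illustration):

```python
# Hedged sketch of the conditional the log evaluates: gather facts only
# when the keys already present in ansible_facts do not cover the required
# list. Jinja's intersect filter keeps items from the left operand that
# also appear in the right one; this mirrors that with a comprehension.
def needs_fact_gathering(ansible_facts, required_facts):
    gathered = [k for k in ansible_facts if k in required_facts]
    return gathered != list(required_facts)

# On the first task of the play nothing is cached yet, so setup must run:
first_run = needs_fact_gathering({}, ["distribution", "os_family"])
```

Once the minimal subset has been gathered and cached, the same expression flips to False and subsequent includes skip the gather step.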
15330 1726882252.40812: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 15330 1726882252.40817: in run() - task 12673a56-9f93-e4fe-1358-00000000008f 15330 1726882252.40821: variable 'ansible_search_path' from source: unknown 15330 1726882252.40824: variable 'ansible_search_path' from source: unknown 15330 1726882252.40828: calling self._execute() 15330 1726882252.40860: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882252.40872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882252.40885: variable 'omit' from source: magic vars 15330 1726882252.42202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882252.46112: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882252.46182: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882252.46439: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882252.46490: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882252.46525: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882252.47000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882252.47007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882252.47011: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882252.47013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882252.47015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882252.47278: variable 'ansible_facts' from source: unknown 15330 1726882252.47354: variable 'network_test_required_facts' from source: task vars 15330 1726882252.47398: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15330 1726882252.47800: variable 'omit' from source: magic vars 15330 1726882252.47804: variable 'omit' from source: magic vars 15330 1726882252.47807: variable 'omit' from source: magic vars 15330 1726882252.47810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882252.47813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882252.47816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882252.47819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882252.47822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882252.48005: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882252.48014: variable 'ansible_host' from source: host vars for 
'managed_node3' 15330 1726882252.48022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882252.48112: Set connection var ansible_pipelining to False 15330 1726882252.48128: Set connection var ansible_timeout to 10 15330 1726882252.48133: Set connection var ansible_connection to ssh 15330 1726882252.48138: Set connection var ansible_shell_type to sh 15330 1726882252.48146: Set connection var ansible_shell_executable to /bin/sh 15330 1726882252.48155: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882252.48180: variable 'ansible_shell_executable' from source: unknown 15330 1726882252.48187: variable 'ansible_connection' from source: unknown 15330 1726882252.48198: variable 'ansible_module_compression' from source: unknown 15330 1726882252.48206: variable 'ansible_shell_type' from source: unknown 15330 1726882252.48599: variable 'ansible_shell_executable' from source: unknown 15330 1726882252.48602: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882252.48605: variable 'ansible_pipelining' from source: unknown 15330 1726882252.48607: variable 'ansible_timeout' from source: unknown 15330 1726882252.48609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882252.48611: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882252.48614: variable 'omit' from source: magic vars 15330 1726882252.48616: starting attempt loop 15330 1726882252.48618: running the handler 15330 1726882252.48620: _low_level_execute_command(): starting 15330 1726882252.48622: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882252.50388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882252.50399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882252.50549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882252.50582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882252.50684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882252.50758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882252.52423: stdout chunk (state=3): >>>/root <<< 15330 1726882252.52611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882252.52677: stderr chunk (state=3): >>><<< 15330 1726882252.52681: stdout chunk (state=3): >>><<< 15330 1726882252.52719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882252.52731: _low_level_execute_command(): starting 15330 1726882252.52740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794 `" && echo ansible-tmp-1726882252.5271947-15437-173364677889794="` echo /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794 `" ) && sleep 0' 15330 1726882252.54454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882252.54469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 
1726882252.54480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882252.54751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882252.54833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882252.55112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882252.56720: stdout chunk (state=3): >>>ansible-tmp-1726882252.5271947-15437-173364677889794=/root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794 <<< 15330 1726882252.56827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882252.56866: stderr chunk (state=3): >>><<< 15330 1726882252.56869: stdout chunk (state=3): >>><<< 15330 1726882252.56927: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882252.5271947-15437-173364677889794=/root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882252.56960: variable 'ansible_module_compression' from source: unknown 15330 1726882252.57165: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882252.57278: variable 'ansible_facts' from source: unknown 15330 1726882252.57586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py 15330 1726882252.58017: Sending initial data 15330 1726882252.58156: Sent initial data (154 bytes) 15330 1726882252.59861: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882252.60019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882252.60101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882252.62398: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882252.62500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882252.62552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpiqvom_7v /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py <<< 15330 1726882252.62579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py" <<< 15330 1726882252.62611: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpiqvom_7v" to remote "/root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py" <<< 15330 1726882252.66704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882252.66708: stderr chunk (state=3): >>><<< 15330 1726882252.66711: stdout chunk (state=3): >>><<< 15330 1726882252.66713: done transferring module to remote 15330 1726882252.66715: _low_level_execute_command(): starting 15330 1726882252.66717: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/ /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py && sleep 0' 15330 1726882252.68012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882252.68127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882252.68216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15330 1726882252.70785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882252.70788: stdout chunk (state=3): >>><<< 15330 1726882252.70791: stderr chunk (state=3): >>><<< 15330 1726882252.70811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15330 1726882252.70821: _low_level_execute_command(): starting 15330 1726882252.70830: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/AnsiballZ_setup.py && sleep 0' 15330 1726882252.72417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882252.72450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882252.72474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882252.72514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882252.72784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882252.75927: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # 
import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15330 1726882252.75933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.75951: stdout chunk (state=3): >>>import '_codecs' # <<< 15330 1726882252.75964: stdout chunk (state=3): >>>import 'codecs' # <<< 15330 1726882252.76211: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47875104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47874dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 15330 1726882252.76223: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15330 1726882252.76328: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15330 1726882252.76355: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15330 1726882252.76389: stdout chunk (state=3): >>>import 'os' # <<< 15330 1726882252.76412: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15330 1726882252.76510: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: 
'/usr/lib64/python3.12/site-packages' <<< 15330 1726882252.76840: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15330 1726882252.77515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15330 1726882252.77540: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872ffe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fff20> <<< 15330 1726882252.77553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15330 1726882252.77588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15330 1726882252.77615: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15330 1726882252.77687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.77692: stdout chunk (state=3): >>>import 'itertools' # <<< 15330 1726882252.77787: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787337890> <<< 15330 1726882252.77849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787337f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787317b30> import '_functools' # <<< 15330 1726882252.77877: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787315250> <<< 15330 1726882252.78038: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15330 1726882252.78220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15330 1726882252.78420: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787357800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787356450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787316120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787354cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478738cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882252.78435: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478738cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fadb0> <<< 15330 1726882252.78490: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15330 1726882252.78526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15330 1726882252.78660: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738e5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15330 1726882252.78702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15330 1726882252.78859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47873a5e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a6d20> <<< 15330 1726882252.78887: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47873a7320> <<< 15330 1726882252.78924: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a6270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15330 1726882252.78983: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47873a7da0> <<< 15330 1726882252.79001: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a74d0> <<< 15330 1726882252.79046: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738e510> <<< 15330 1726882252.79067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15330 1726882252.79104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15330 1726882252.79124: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15330 1726882252.79149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15330 1726882252.79357: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478709bbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c4740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c44a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c4680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15330 1726882252.79382: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882252.79572: stdout chunk (state=3): >>># extension module '_hashlib' executed from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c4fe0> <<< 15330 1726882252.79845: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c5910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c48c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787099d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15330 1726882252.79849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15330 1726882252.79854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c6d20> <<< 15330 1726882252.79869: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c5a60> <<< 15330 1726882252.79899: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738e750> <<< 15330 1726882252.79920: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15330 1726882252.80007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.80024: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15330 1726882252.80349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870ef080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787113440> <<< 15330 1726882252.80367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15330 1726882252.80427: stdout chunk (state=3): >>>import 'ntpath' # <<< 15330 1726882252.80466: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 15330 1726882252.80529: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787174230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15330 1726882252.80551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15330 1726882252.80680: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15330 1726882252.80725: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787176990> <<< 15330 1726882252.80823: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787174350> <<< 15330 1726882252.80911: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787141250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786f7d310> <<< 15330 1726882252.80939: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787112240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c7c50> <<< 15330 1726882252.81200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15330 1726882252.81528: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4786f7d5b0> <<< 15330 1726882252.81573: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_pwwz2nr7/ansible_setup_payload.zip' <<< 15330 1726882252.81597: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.81810: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.81871: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15330 1726882252.81888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15330 1726882252.81936: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15330 1726882252.82051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15330 1726882252.82092: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 15330 1726882252.82109: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fe6fc0> <<< 15330 1726882252.82126: stdout chunk (state=3): >>>import '_typing' # <<< 15330 1726882252.82422: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fc5eb0> <<< 15330 1726882252.82426: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fc50a0> <<< 15330 1726882252.82435: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.82461: stdout chunk (state=3): >>>import 'ansible' # <<< 15330 1726882252.82495: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.82516: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 15330 1726882252.82746: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.84698: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.86456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 15330 1726882252.86488: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fe52b0> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.86518: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15330 1726882252.86522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15330 1726882252.86548: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15330 1726882252.86584: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4787016990> <<< 15330 1726882252.86627: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787016720> <<< 15330 1726882252.86670: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787016030> <<< 15330 1726882252.86687: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15330 1726882252.86715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15330 1726882252.86755: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787016480> <<< 15330 1726882252.86768: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f4786fe7c50> import 'atexit' # <<< 15330 1726882252.86790: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882252.86796: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4787017710> <<< 15330 1726882252.86828: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4787017920> <<< 15330 1726882252.86842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15330 1726882252.86913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15330 1726882252.86916: stdout chunk (state=3): >>>import '_locale' # <<< 15330 1726882252.87076: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787017e60> <<< 15330 1726882252.87100: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15330 1726882252.87129: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786929c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f478692b890> <<< 15330 1726882252.87142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15330 1726882252.87210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692c230> <<< 15330 1726882252.87214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15330 1726882252.87259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15330 1726882252.87262: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692d3a0> <<< 15330 1726882252.87338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15330 1726882252.87360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 15330 1726882252.87374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15330 1726882252.87425: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692fe00> <<< 15330 1726882252.87471: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47872faea0> <<< 
15330 1726882252.87527: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15330 1726882252.87557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15330 1726882252.87590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15330 1726882252.87741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15330 1726882252.87778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786937e30> <<< 15330 1726882252.87860: stdout chunk (state=3): >>>import '_tokenize' # <<< 15330 1726882252.87883: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786936900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786936660> <<< 15330 1726882252.87909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15330 1726882252.88012: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786936bd0> <<< 15330 1726882252.88049: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f478692e5d0> <<< 15330 1726882252.88119: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882252.88123: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478697bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697c1d0> <<< 15330 1726882252.88140: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15330 1726882252.88170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15330 1726882252.88215: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478697dc70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697da30> <<< 15330 1726882252.88259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from 
'/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15330 1726882252.88324: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786980230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697e360> <<< 15330 1726882252.88351: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15330 1726882252.88422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15330 1726882252.88502: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47869839e0> <<< 15330 1726882252.88676: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47869803e0> <<< 15330 1726882252.88754: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786984ad0> <<< 15330 1726882252.88792: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786984bc0> <<< 15330 1726882252.88848: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786984c50> <<< 15330 1726882252.88913: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15330 1726882252.88931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15330 1726882252.88979: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478680c2f0> <<< 15330 1726882252.89213: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478680d100> <<< 
15330 1726882252.89269: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786986ae0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882252.89272: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786987e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47869866f0> <<< 15330 1726882252.89285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.89311: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15330 1726882252.89428: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.89575: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 15330 1726882252.89602: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15330 1726882252.89645: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.89791: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.89961: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.90849: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.91724: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15330 1726882252.91761: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.91852: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786815370> <<< 15330 1726882252.91926: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 15330 1726882252.92038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786816210> <<< 15330 1726882252.92065: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478680ff20> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 15330 1726882252.92080: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.92296: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.92526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15330 1726882252.92560: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786816900> # zipimport: zlib available <<< 15330 1726882252.93350: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.93970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.94067: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15330 1726882252.94173: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 15330 1726882252.94227: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.94277: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15330 1726882252.94279: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.94406: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.94513: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.94541: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15330 1726882252.94637: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 15330 1726882252.94648: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.94999: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.95362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15330 1726882252.95440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15330 1726882252.95556: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868174a0> <<< 15330 1726882252.95574: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.95661: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.95783: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15330 1726882252.95787: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 15330 1726882252.95814: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 15330 1726882252.95867: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.96201: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15330 1726882252.96213: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.96241: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.96365: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786821f10> <<< 15330 1726882252.96458: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478681d700> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15330 1726882252.96544: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.96661: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.96715: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.96746: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15330 1726882252.96770: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15330 1726882252.96846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15330 1726882252.96876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15330 1726882252.96890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15330 1726882252.96914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15330 1726882252.96990: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478690a8d0> <<< 15330 1726882252.97045: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870425a0> <<< 15330 1726882252.97149: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786822030> <<< 15330 1726882252.97170: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786986b10> # destroy ansible.module_utils.distro <<< 15330 1726882252.97204: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.97242: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15330 1726882252.97322: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 15330 1726882252.97355: stdout chunk (state=3): >>># zipimport: zlib available 
import 'ansible.modules' # # zipimport: zlib available <<< 15330 1726882252.97431: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97537: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.97557: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97655: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97670: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97711: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15330 1726882252.97849: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97869: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.97978: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.98006: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.98046: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15330 1726882252.98324: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.98628: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.98698: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882252.98737: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15330 1726882252.98788: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15330 1726882252.98805: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b1e80> <<< 15330 1726882252.98866: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15330 1726882252.98915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15330 1726882252.98958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15330 1726882252.98968: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864c7fe0> <<< 15330 1726882252.99026: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47864cc350> <<< 15330 1726882252.99098: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868987a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b29c0> <<< 15330 1726882252.99128: stdout chunk 
(state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b05f0> <<< 15330 1726882252.99179: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b01a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15330 1726882252.99460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47864cf290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864ceb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47864cecf0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864cdf70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15330 1726882252.99519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 
15330 1726882252.99539: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864cf200> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15330 1726882252.99574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15330 1726882252.99627: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478652dd90> <<< 15330 1726882252.99647: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864cfd70> <<< 15330 1726882252.99686: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b0200> import 'ansible.module_utils.facts.timeout' # <<< 15330 1726882252.99713: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15330 1726882252.99743: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15330 1726882252.99817: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882252.99902: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15330 1726882252.99906: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882252.99978: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.00050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15330 
1726882253.00057: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.00102: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15330 1726882253.00149: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 15330 1726882253.00172: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.00215: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.00269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 15330 1726882253.00602: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.00623: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.00710: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 15330 1726882253.00745: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.01508: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15330 1726882253.02225: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02360: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.02404: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02459: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 15330 1726882253.02500: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02542: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.env' # <<< 15330 1726882253.02565: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02643: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15330 1726882253.02884: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.02898: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15330 1726882253.02935: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.03047: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.03184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15330 1726882253.03202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15330 1726882253.03235: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478652f050> <<< 15330 1726882253.03267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15330 1726882253.03331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15330 1726882253.03523: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478652e780> <<< 15330 1726882253.03526: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 15330 1726882253.03543: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.03631: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.03728: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15330 1726882253.03748: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.03875: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.04158: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.04229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15330 1726882253.04233: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.04281: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.04362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15330 1726882253.04435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15330 1726882253.04508: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.04608: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.04619: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786569ee0> <<< 15330 1726882253.04917: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786559be0> <<< 15330 1726882253.04942: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 15330 1726882253.04946: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05027: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15330 1726882253.05132: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15330 1726882253.05262: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05401: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05567: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05779: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 15330 1726882253.05783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 15330 1726882253.05812: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05871: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.05944: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 15330 1726882253.05996: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.06157: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478657d940> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478655a720> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.06165: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 15330 1726882253.06186: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.06239: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.06300: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.base' # <<< 15330 1726882253.06310: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.06556: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.06799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15330 1726882253.06823: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.07022: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.07125: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.07256: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15330 1726882253.07280: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.07335: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.07525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.07945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.08116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15330 1726882253.08127: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.08179: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.08228: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.09095: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.09954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15330 1726882253.10085: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.10232: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15330 1726882253.10284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.10403: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.10541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15330 1726882253.10643: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.10802: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15330 1726882253.11067: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11073: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 15330 1726882253.11111: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11177: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 15330 1726882253.11258: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11408: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11644: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.11868: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 15330 1726882253.12224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 15330 1726882253.12227: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12267: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12321: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15330 
1726882253.12338: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12378: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12416: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 15330 1726882253.12430: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 15330 1726882253.12873: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.12960: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.13044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15330 1726882253.13063: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.13502: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.13867: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15330 1726882253.13886: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.13973: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15330 1726882253.14072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14121: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15330 1726882253.14185: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14235: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 15330 1726882253.14280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15330 1726882253.14303: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14353: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15330 1726882253.14415: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14534: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15330 1726882253.14679: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14696: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 15330 1726882253.14728: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14797: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 15330 1726882253.14878: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14907: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.14937: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.15042: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.15087: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.15196: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.15351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 15330 1726882253.15354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib 
available <<< 15330 1726882253.15432: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.15502: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 15330 1726882253.15519: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.15864: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15330 1726882253.16145: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16244: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15330 1726882253.16289: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16357: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15330 1726882253.16438: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16646: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16674: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 15330 1726882253.16691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 15330 1726882253.16712: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16845: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.16980: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 15330 1726882253.16991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 15330 1726882253.16999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 15330 1726882253.17109: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.17816: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15330 1726882253.17832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15330 1726882253.17858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 15330 1726882253.17949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478637a270> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786378b30> <<< 15330 1726882253.18006: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786373b90> <<< 15330 1726882253.19153: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-e<<< 15330 1726882253.19171: stdout chunk (state=3): >>>ast-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "53", "epoch": "1726882253", "epoch_int": "1726882253", "date": "2024-09-20", "time": "21:30:53", "iso8601_micro": "2024-09-21T01:30:53.188458Z", "iso8601": "2024-09-21T01:30:53Z", "iso8601_basic": "20240920T213053188458", "iso8601_basic_short": "20240920T213053", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": 
"/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882253.20174: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # 
destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # 
cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 15330 1726882253.20192: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # 
destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing 
selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing<<< 15330 1726882253.20227: stdout chunk (state=3): >>> ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai <<< 15330 1726882253.20237: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system<<< 15330 1726882253.20265: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns <<< 15330 1726882253.20274: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 15330 1726882253.20298: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr <<< 15330 1726882253.20302: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd <<< 15330 1726882253.20333: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn <<< 15330 1726882253.20349: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos <<< 15330 1726882253.20362: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base <<< 15330 1726882253.20368: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15330 1726882253.20810: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15330 1726882253.20825: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15330 1726882253.20869: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 15330 1726882253.20878: stdout chunk (state=3): >>># destroy _lzma <<< 15330 1726882253.20892: stdout chunk (state=3): >>># destroy _blake2 <<< 15330 1726882253.20920: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 15330 1726882253.20926: stdout chunk (state=3): >>># destroy zipfile._path <<< 15330 1726882253.21054: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 15330 1726882253.21063: stdout chunk (state=3): >>># destroy _posixsubprocess <<< 15330 1726882253.21066: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 15330 1726882253.21128: stdout chunk (state=3): >>># destroy selinux <<< 15330 1726882253.21130: stdout chunk (state=3): >>># destroy shutil <<< 15330 1726882253.21157: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 15330 1726882253.21162: stdout chunk (state=3): >>># destroy 
argparse # destroy logging <<< 15330 1726882253.21216: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 15330 1726882253.21234: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection <<< 15330 1726882253.21239: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy signal <<< 15330 1726882253.21263: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context <<< 15330 1726882253.21267: stdout chunk (state=3): >>># destroy array <<< 15330 1726882253.21284: stdout chunk (state=3): >>># destroy _compat_pickle <<< 15330 1726882253.21564: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 15330 1726882253.21568: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 15330 1726882253.21588: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 15330 1726882253.21599: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 15330 1726882253.21617: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 15330 1726882253.21644: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 15330 1726882253.21656: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections <<< 15330 1726882253.21670: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 15330 1726882253.21690: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig <<< 15330 1726882253.21705: stdout chunk (state=3): >>># cleanup[3] wiping os <<< 15330 1726882253.21724: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath <<< 15330 1726882253.21730: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io<<< 15330 1726882253.21757: stdout chunk (state=3): >>> # 
destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 15330 1726882253.21764: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 15330 1726882253.21788: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 15330 1726882253.21804: stdout chunk (state=3): >>># cleanup[3] wiping marshal <<< 15330 1726882253.21820: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 15330 1726882253.21824: stdout chunk (state=3): >>># cleanup[3] wiping sys <<< 15330 1726882253.21850: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 15330 1726882253.21859: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon <<< 15330 1726882253.21868: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15330 1726882253.22034: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15330 1726882253.22048: stdout chunk (state=3): >>># destroy _socket <<< 15330 1726882253.22066: stdout chunk (state=3): >>># destroy _collections <<< 15330 1726882253.22107: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 15330 1726882253.22112: stdout chunk (state=3): >>># destroy stat # destroy genericpath <<< 15330 1726882253.22131: stdout chunk (state=3): >>># destroy re._parser <<< 15330 1726882253.22136: stdout chunk (state=3): >>># destroy tokenize <<< 15330 1726882253.22348: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 15330 1726882253.22375: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 15330 1726882253.22409: stdout chunk (state=3): >>># destroy _hashlib <<< 15330 1726882253.22423: stdout chunk (state=3): >>># destroy _operator <<< 15330 1726882253.22431: stdout chunk (state=3): >>># destroy _sre <<< 15330 1726882253.22444: stdout chunk (state=3): >>># destroy _string # destroy re <<< 15330 1726882253.22461: stdout chunk (state=3): >>># destroy itertools <<< 15330 1726882253.22474: stdout chunk (state=3): >>># destroy _abc <<< 15330 1726882253.22486: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins <<< 15330 1726882253.22496: stdout chunk (state=3): >>># destroy _thread <<< 15330 1726882253.22511: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15330 1726882253.22958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882253.22996: stderr chunk (state=3): >>><<< 15330 1726882253.22999: stdout chunk (state=3): >>><<< 15330 1726882253.23107: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47875104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47874dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872ffe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787337890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787337f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787317b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787315250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787357800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787356450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787316120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787354cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478738cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478738cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47872fadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738e5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47873a5e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a6d20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47873a7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a6270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47873a7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47873a74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478709bbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c4740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c44a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c4680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c4fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47870c5910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c48c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787099d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c6d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c5a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478738e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870ef080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787113440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787174230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787176990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787174350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787141250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786f7d310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787112240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870c7c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4786f7d5b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_pwwz2nr7/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4786fe6fc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fc5eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fc50a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fe52b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4787016990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787016720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787016030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787016480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786fe7c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4787017710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4787017920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4787017e60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786929c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478692b890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f478692c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692d3a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47872faea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f4786937e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786936900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786936660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786936bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478692e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478697bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f478697dc70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697da30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786980230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47869839e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47869803e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786984ad0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786984bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786984c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478697c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478680c2f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478680d100> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786986ae0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786987e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47869866f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786815370> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786816210> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478680ff20> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786816900> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868174a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786821f10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478681d700> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478690a8d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47870425a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786822030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786986b10> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b1e80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864c7fe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47864cc350> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868987a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b29c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b05f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b01a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47864cf290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864ceb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f47864cecf0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864cdf70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864cf200> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478652dd90> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47864cfd70> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f47868b0200> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478652f050> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478652e780> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4786569ee0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786559be0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478657d940> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f478655a720> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f478637a270> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786378b30> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4786373b90> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": 
"ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "53", "epoch": "1726882253", "epoch_int": "1726882253", "date": "2024-09-20", "time": "21:30:53", "iso8601_micro": "2024-09-21T01:30:53.188458Z", "iso8601": "2024-09-21T01:30:53Z", "iso8601_basic": "20240920T213053188458", "iso8601_basic_short": "20240920T213053", "tz": 
"EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] 
removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # 
cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string 
# destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing 
multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # 
cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] 
removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy 
ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy 
ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob 
# destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # 
destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache … # destroy builtins # destroy _thread # clear sys.audit hooks 15330 1726882253.23912: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882253.23916: _low_level_execute_command(): starting 15330 1726882253.23918: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882252.5271947-15437-173364677889794/ > /dev/null 2>&1 && sleep 0' 15330 1726882253.23921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882253.23923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882253.23925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882253.23927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882253.23939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882253.23942: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.23951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882253.23967: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882253.23969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.24022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882253.24026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.24035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.24097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 
1726882253.26682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.26712: stderr chunk (state=3): >>><<< 15330 1726882253.26715: stdout chunk (state=3): >>><<< 15330 1726882253.26733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882253.26739: handler run complete 15330 1726882253.26767: variable 'ansible_facts' from source: unknown 15330 1726882253.26812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882253.26883: variable 'ansible_facts' from source: unknown 15330 1726882253.26919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882253.26954: attempt loop complete, returning result 15330 1726882253.26957: _execute() 
done 15330 1726882253.26960: dumping result to json 15330 1726882253.26969: done dumping result, returning 15330 1726882253.26977: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-e4fe-1358-00000000008f] 15330 1726882253.26979: sending task result for task 12673a56-9f93-e4fe-1358-00000000008f 15330 1726882253.27111: done sending task result for task 12673a56-9f93-e4fe-1358-00000000008f 15330 1726882253.27114: WORKER PROCESS EXITING ok: [managed_node3] 15330 1726882253.27213: no more pending results, returning what we have 15330 1726882253.27216: results queue empty 15330 1726882253.27217: checking for any_errors_fatal 15330 1726882253.27218: done checking for any_errors_fatal 15330 1726882253.27219: checking for max_fail_percentage 15330 1726882253.27221: done checking for max_fail_percentage 15330 1726882253.27222: checking to see if all hosts have failed and the running result is not ok 15330 1726882253.27222: done checking to see if all hosts have failed 15330 1726882253.27223: getting the remaining hosts for this loop 15330 1726882253.27225: done getting the remaining hosts for this loop 15330 1726882253.27229: getting the next task for host managed_node3 15330 1726882253.27236: done getting next task for host managed_node3 15330 1726882253.27239: ^ task is: TASK: Check if system is ostree 15330 1726882253.27241: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15330 1726882253.27244: getting variables 15330 1726882253.27246: in VariableManager get_vars() 15330 1726882253.27273: Calling all_inventory to load vars for managed_node3 15330 1726882253.27275: Calling groups_inventory to load vars for managed_node3 15330 1726882253.27278: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882253.27287: Calling all_plugins_play to load vars for managed_node3 15330 1726882253.27290: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882253.27292: Calling groups_plugins_play to load vars for managed_node3 15330 1726882253.27443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882253.27555: done with get_vars() 15330 1726882253.27562: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024  21:30:53 -0400 (0:00:00.886)       0:00:02.482 ******

15330 1726882253.27632: entering _queue_task() for managed_node3/stat 15330 1726882253.27835: worker is 1 (out of 1 available) 15330 1726882253.27848: exiting _queue_task() for managed_node3/stat 15330 1726882253.27860: done queuing things up, now waiting for results queue to drain 15330 1726882253.27861: waiting for pending results... 
15330 1726882253.28002: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 15330 1726882253.28063: in run() - task 12673a56-9f93-e4fe-1358-000000000091 15330 1726882253.28073: variable 'ansible_search_path' from source: unknown 15330 1726882253.28076: variable 'ansible_search_path' from source: unknown 15330 1726882253.28109: calling self._execute() 15330 1726882253.28163: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.28167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.28175: variable 'omit' from source: magic vars 15330 1726882253.28507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882253.28679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882253.28716: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882253.28743: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882253.28785: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882253.28856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882253.28874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882253.28892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882253.28914: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882253.29006: Evaluated conditional (not __network_is_ostree is defined): True 15330 1726882253.29010: variable 'omit' from source: magic vars 15330 1726882253.29037: variable 'omit' from source: magic vars 15330 1726882253.29067: variable 'omit' from source: magic vars 15330 1726882253.29084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882253.29109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882253.29123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882253.29135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882253.29143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882253.29169: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882253.29172: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.29174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.29241: Set connection var ansible_pipelining to False 15330 1726882253.29251: Set connection var ansible_timeout to 10 15330 1726882253.29254: Set connection var ansible_connection to ssh 15330 1726882253.29256: Set connection var ansible_shell_type to sh 15330 1726882253.29261: Set connection var ansible_shell_executable to /bin/sh 15330 1726882253.29265: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882253.29284: variable 'ansible_shell_executable' from source: unknown 15330 1726882253.29287: variable 'ansible_connection' from 
source: unknown 15330 1726882253.29290: variable 'ansible_module_compression' from source: unknown 15330 1726882253.29292: variable 'ansible_shell_type' from source: unknown 15330 1726882253.29295: variable 'ansible_shell_executable' from source: unknown 15330 1726882253.29297: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.29306: variable 'ansible_pipelining' from source: unknown 15330 1726882253.29308: variable 'ansible_timeout' from source: unknown 15330 1726882253.29310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.29404: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882253.29416: variable 'omit' from source: magic vars 15330 1726882253.29419: starting attempt loop 15330 1726882253.29421: running the handler 15330 1726882253.29432: _low_level_execute_command(): starting 15330 1726882253.29438: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882253.29949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882253.29955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882253.29958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882253.29960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.30015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882253.30018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.30021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.30084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882253.32373: stdout chunk (state=3): >>>/root <<< 15330 1726882253.32527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.32561: stderr chunk (state=3): >>><<< 15330 1726882253.32565: stdout chunk (state=3): >>><<< 15330 1726882253.32587: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882253.32604: _low_level_execute_command(): starting 15330 1726882253.32612: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752 `" && echo ansible-tmp-1726882253.3258646-15492-271986201395752="` echo /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752 `" ) && sleep 0' 15330 1726882253.33066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882253.33069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882253.33072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.33074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882253.33076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.33133: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882253.33140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.33142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.33196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882253.35949: stdout chunk (state=3): >>>ansible-tmp-1726882253.3258646-15492-271986201395752=/root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752 <<< 15330 1726882253.36100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.36131: stderr chunk (state=3): >>><<< 15330 1726882253.36134: stdout chunk (state=3): >>><<< 15330 1726882253.36149: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882253.3258646-15492-271986201395752=/root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882253.36192: variable 'ansible_module_compression' from source: unknown 15330 1726882253.36244: ANSIBALLZ: Using lock for stat 15330 1726882253.36247: ANSIBALLZ: Acquiring lock 15330 1726882253.36249: ANSIBALLZ: Lock acquired: 140238208143568 15330 1726882253.36252: ANSIBALLZ: Creating module 15330 1726882253.43528: ANSIBALLZ: Writing module into payload 15330 1726882253.43590: ANSIBALLZ: Writing module 15330 1726882253.43609: ANSIBALLZ: Renaming module 15330 1726882253.43614: ANSIBALLZ: Done creating module 15330 1726882253.43632: variable 'ansible_facts' from source: unknown 15330 1726882253.43676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py 15330 1726882253.43778: Sending initial data 15330 1726882253.43781: Sent initial data (153 bytes) 15330 1726882253.44241: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882253.44245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882253.44247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.44249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882253.44251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.44301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.44318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.44374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882253.46676: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882253.46724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882253.46776: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp98_rxifj /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py <<< 15330 1726882253.46782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py" <<< 15330 1726882253.46833: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp98_rxifj" to remote "/root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py" <<< 15330 1726882253.46836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py" <<< 15330 1726882253.47391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.47440: stderr chunk (state=3): >>><<< 15330 1726882253.47444: stdout chunk (state=3): >>><<< 15330 1726882253.47466: done transferring module to remote 15330 1726882253.47477: _low_level_execute_command(): starting 15330 1726882253.47482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/ /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py && sleep 0' 15330 1726882253.47940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882253.47946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882253.47948: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.47950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882253.47952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882253.47954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.48007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882253.48011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.48015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.48063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882253.50586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.50615: stderr chunk (state=3): >>><<< 15330 1726882253.50618: stdout chunk (state=3): >>><<< 15330 1726882253.50631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882253.50634: _low_level_execute_command(): starting 15330 1726882253.50640: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/AnsiballZ_stat.py && sleep 0' 15330 1726882253.51066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882253.51070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.51099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882253.51102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.51148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882253.51155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.51157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.51216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882253.55599: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca4184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca3e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca41aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # 
Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca1c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca1c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
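The flood of `import ...` and `# code object from ...` lines above comes from the `PYTHONVERBOSE=1` environment variable that Ansible sets on the remote command line when run at this verbosity: it is equivalent to invoking the interpreter with `-v`, making CPython report every module import (and every cached `.pyc` it matches) on stderr. A minimal local reproduction of that trace, not tied to Ansible:

```python
import subprocess
import sys

# Run a trivial script under `python -v` (same effect as PYTHONVERBOSE=1 in
# the logged remote command): CPython prints one line per import to stderr.
result = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)

# Each successfully imported module appears as an "import '<name>' # ..." line.
print("import 'json'" in result.stderr)  # -> True
```

This is why the module's real stdout has to be picked out from among hundreds of interpreter trace lines in the chunks that follow.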
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15330 1726882253.55638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15330 1726882253.55651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15330 1726882253.55688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15330 1726882253.55709: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca207e60> <<< 15330 1726882253.55746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15330 1726882253.55766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15330 1726882253.55805: stdout chunk (state=3): >>>import '_operator' # <<< 15330 1726882253.55810: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca207f20> <<< 15330 1726882253.55850: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15330 1726882253.55885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15330 1726882253.56056: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca23f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 15330 1726882253.56059: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca23ff20> <<< 15330 1726882253.56087: stdout chunk (state=3): >>>import '_collections' # <<< 15330 1726882253.56152: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca21fb30> <<< 15330 1726882253.56176: stdout chunk (state=3): >>>import '_functools' # <<< 15330 1726882253.56213: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca21d250> <<< 15330 1726882253.56350: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca205010> <<< 15330 1726882253.56386: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15330 1726882253.56415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15330 1726882253.56435: stdout chunk (state=3): >>>import '_sre' # <<< 15330 1726882253.56472: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15330 1726882253.56510: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15330 1726882253.56536: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15330 1726882253.56552: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15330 1726882253.56600: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca25f800> <<< 15330 1726882253.56853: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca25e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca21e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca25ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca294860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca204290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca294d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f90ca294bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca294fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca202db0> <<< 15330 1726882253.56869: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.56910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15330 1726882253.56942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15330 1726882253.56964: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2956a0> <<< 15330 1726882253.56971: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca295370> <<< 15330 1726882253.56986: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 15330 1726882253.57026: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 15330 1726882253.57033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15330 1726882253.57052: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2965a0> <<< 15330 1726882253.57079: stdout chunk (state=3): >>>import 'importlib.util' # <<< 15330 1726882253.57088: stdout chunk (state=3): >>>import 'runpy' # <<< 
15330 1726882253.57126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15330 1726882253.57170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15330 1726882253.57205: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 15330 1726882253.57212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 15330 1726882253.57232: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2ac7a0> <<< 15330 1726882253.57242: stdout chunk (state=3): >>>import 'errno' # <<< 15330 1726882253.57279: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57296: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57303: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca2ade80> <<< 15330 1726882253.57325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15330 1726882253.57348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15330 1726882253.57375: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15330 1726882253.57392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15330 1726882253.57404: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f90ca2aed20> <<< 15330 1726882253.57446: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57457: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57477: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca2af320> <<< 15330 1726882253.57479: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2ae270> <<< 15330 1726882253.57514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15330 1726882253.57522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15330 1726882253.57751: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca2afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca296510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15330 1726882253.57763: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 
15330 1726882253.57778: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca04bbf0> <<< 15330 1726882253.57813: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 15330 1726882253.57819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15330 1726882253.57855: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57858: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57874: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca074740> <<< 15330 1726882253.57879: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0744a0> <<< 15330 1726882253.57916: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.57928: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca074680> <<< 15330 1726882253.57977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15330 1726882253.57985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15330 1726882253.58076: stdout chunk (state=3): >>># extension module '_hashlib' loaded 
from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.58264: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.58269: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca074fe0> <<< 15330 1726882253.58420: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.58442: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.58452: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca075910> <<< 15330 1726882253.58465: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0748c0> <<< 15330 1726882253.58485: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca049d90> <<< 15330 1726882253.58519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15330 1726882253.58550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15330 1726882253.58646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca076d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca075a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f90ca296750> <<< 15330 1726882253.58683: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15330 1726882253.58774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.58802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15330 1726882253.58850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15330 1726882253.58894: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca09f080> <<< 15330 1726882253.58970: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15330 1726882253.58995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.59047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15330 1726882253.59052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15330 1726882253.59119: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0c3440> <<< 15330 1726882253.59244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15330 1726882253.59304: stdout chunk (state=3): >>>import 'ntpath' # <<< 15330 1726882253.59341: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/__init__.py <<< 15330 1726882253.59344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.59356: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca124230> <<< 15330 1726882253.59386: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15330 1726882253.59428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15330 1726882253.59465: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15330 1726882253.59518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15330 1726882253.59646: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca126990> <<< 15330 1726882253.59746: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca124350> <<< 15330 1726882253.59797: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0f1250> <<< 15330 1726882253.60049: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f29310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0c2240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca077c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 
'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f90c9f295b0> <<< 15330 1726882253.60230: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_w9do05di/ansible_stat_payload.zip' <<< 15330 1726882253.60241: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.60453: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.60487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15330 1726882253.60513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15330 1726882253.60564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15330 1726882253.60671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15330 1726882253.60714: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 15330 1726882253.60724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 15330 1726882253.60727: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f7ef90> <<< 15330 1726882253.60747: stdout chunk (state=3): >>>import '_typing' # <<< 15330 1726882253.61010: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f5de80> <<< 15330 1726882253.61032: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f5d0a0> <<< 15330 1726882253.61040: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.61082: stdout chunk (state=3): >>>import 'ansible' # <<< 15330 
1726882253.61101: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.61129: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.61151: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.61175: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 15330 1726882253.61200: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.63348: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.65106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15330 1726882253.65109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 15330 1726882253.65124: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f7d280> <<< 15330 1726882253.65155: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15330 1726882253.65158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.65199: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15330 1726882253.65209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15330 1726882253.65238: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 15330 1726882253.65350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' 
executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9faa9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9faa750> <<< 15330 1726882253.65359: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9faa060> <<< 15330 1726882253.65386: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15330 1726882253.65406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15330 1726882253.65456: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9faa4b0> <<< 15330 1726882253.65472: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f7fc20> <<< 15330 1726882253.65480: stdout chunk (state=3): >>>import 'atexit' # <<< 15330 1726882253.65521: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.65530: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.65539: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9fab710> <<< 15330 1726882253.65567: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.65578: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9fab950> 
<<< 15330 1726882253.65853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9fabe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9915cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99178c0> <<< 15330 1726882253.65887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15330 1726882253.65906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15330 1726882253.65960: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9918260> <<< 15330 1726882253.65987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15330 1726882253.66028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15330 1726882253.66056: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9919400> <<< 15330 1726882253.66089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15330 
1726882253.66137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15330 1726882253.66172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 15330 1726882253.66177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15330 1726882253.66260: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c991be60> <<< 15330 1726882253.66311: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.66316: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca207d10> <<< 15330 1726882253.66342: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c991a120> <<< 15330 1726882253.66374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15330 1726882253.66415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15330 1726882253.66446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 15330 1726882253.66449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15330 1726882253.66485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15330 1726882253.66525: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15330 1726882253.66557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 15330 1726882253.66562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15330 1726882253.66583: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9923dd0> <<< 15330 1726882253.66602: stdout chunk (state=3): >>>import '_tokenize' # <<< 15330 1726882253.66700: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99228a0> <<< 15330 1726882253.66705: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9922600> <<< 15330 1726882253.66741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15330 1726882253.66752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15330 1726882253.66872: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9922b70> <<< 15330 1726882253.67149: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c991a630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9922b10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c996dbe0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996d9a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15330 1726882253.67241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15330 1726882253.67303: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.67307: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99701d0> <<< 15330 1726882253.67319: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996e2d0> <<< 15330 1726882253.67350: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/logging/__init__.py <<< 15330 1726882253.67402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.67427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15330 1726882253.67442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15330 1726882253.67463: stdout chunk (state=3): >>>import '_string' # <<< 15330 1726882253.67524: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9973980> <<< 15330 1726882253.67710: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9970380> <<< 15330 1726882253.67781: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.67787: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9974770> <<< 15330 1726882253.67828: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.67844: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9974950> <<< 15330 1726882253.67905: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 
15330 1726882253.67914: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9974a70> <<< 15330 1726882253.67932: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996c2c0> <<< 15330 1726882253.67967: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 15330 1726882253.67971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15330 1726882253.68007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15330 1726882253.68038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15330 1726882253.68074: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.68110: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.68116: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99fc260> <<< 15330 1726882253.68351: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.68373: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99fd070> 
<<< 15330 1726882253.68396: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99769f0> <<< 15330 1726882253.68425: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.68443: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9977da0> <<< 15330 1726882253.68644: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9976630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.68750: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.68770: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.68786: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 15330 1726882253.68806: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.68833: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.68840: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 15330 1726882253.68867: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.69050: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.69234: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.70088: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.70975: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15330 1726882253.70986: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 15330 
1726882253.71014: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 15330 1726882253.71021: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 15330 1726882253.71059: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15330 1726882253.71084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.71158: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c98052b0> <<< 15330 1726882253.71264: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 15330 1726882253.71278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15330 1726882253.71299: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9806150> <<< 15330 1726882253.71319: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99fd220> <<< 15330 1726882253.71379: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15330 1726882253.71402: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.71428: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.71546: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15330 1726882253.71687: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 
1726882253.71916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15330 1726882253.71922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15330 1726882253.71943: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9806870> <<< 15330 1726882253.71961: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.72711: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.73440: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.73549: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.73655: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15330 1726882253.73675: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.73727: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.73949: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.73999: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15330 1726882253.74019: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.74040: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.74052: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 15330 1726882253.74068: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.74130: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.74181: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15330 1726882253.74205: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.74552: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.74916: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15330 1726882253.75007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15330 1726882253.75030: stdout chunk (state=3): >>>import '_ast' # <<< 15330 1726882253.75132: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9807470> <<< 15330 1726882253.75158: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.75266: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.75370: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15330 1726882253.75379: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 15330 1726882253.75397: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 15330 1726882253.75410: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 15330 1726882253.75445: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.75505: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.75560: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15330 1726882253.75573: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.75849: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15330 1726882253.75855: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15330 1726882253.75920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.76028: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15330 1726882253.76044: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9811f40> <<< 15330 1726882253.76098: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c980d6d0> <<< 15330 1726882253.76141: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15330 1726882253.76160: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.76258: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.76342: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.76383: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.76449: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 15330 1726882253.76455: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15330 1726882253.76485: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15330 1726882253.76524: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15330 1726882253.76551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15330 1726882253.76636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' 
<<< 15330 1726882253.76662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15330 1726882253.76696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15330 1726882253.76783: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99027b0> <<< 15330 1726882253.76848: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9fe2480> <<< 15330 1726882253.76958: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9811c70> <<< 15330 1726882253.76971: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9806030> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15330 1726882253.76985: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.77028: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.77072: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 15330 1726882253.77075: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 15330 1726882253.77244: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15330 1726882253.77521: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.77728: stdout chunk (state=3): >>># zipimport: zlib available <<< 15330 1726882253.77874: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 15330 1726882253.77891: 
stdout chunk (state=3): >>># destroy __main__ <<< 15330 1726882253.78317: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15330 1726882253.78363: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys <<< 15330 1726882253.78388: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread <<< 15330 1726882253.78621: stdout chunk (state=3): >>># cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] 
removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon <<< 15330 1726882253.78765: stdout chunk (state=3): >>># cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15330 1726882253.78906: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15330 1726882253.78945: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15330 1726882253.79002: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 <<< 15330 1726882253.79006: stdout chunk (state=3): >>># destroy lzma <<< 15330 1726882253.79032: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 15330 1726882253.79091: stdout chunk (state=3): >>># destroy ntpath <<< 15330 1726882253.79121: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ <<< 15330 1726882253.79133: stdout chunk (state=3): >>># destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 15330 1726882253.79274: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # 
destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 15330 1726882253.79328: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 15330 1726882253.79357: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 15330 1726882253.79399: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 15330 1726882253.79451: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 15330 1726882253.79458: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 15330 1726882253.79526: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools<<< 15330 1726882253.79568: stdout chunk (state=3): >>> # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy 
_collections_abc <<< 15330 1726882253.79589: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 15330 1726882253.79605: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 15330 1726882253.79658: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15330 1726882253.79818: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15330 1726882253.79981: stdout chunk (state=3): >>># destroy _collections <<< 15330 1726882253.80001: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 15330 1726882253.80054: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15330 1726882253.80126: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 15330 1726882253.80139: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 15330 1726882253.80167: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 15330 1726882253.80206: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 15330 1726882253.80321: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools <<< 15330 1726882253.80324: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools <<< 15330 1726882253.80326: stdout chunk (state=3): >>># destroy builtins # destroy _thread <<< 15330 1726882253.80339: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15330 1726882253.80799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.80902: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882253.80905: stdout chunk (state=3): >>><<< 15330 1726882253.80907: stderr chunk (state=3): >>><<< 15330 1726882253.80920: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca4184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca3e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca41aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca1c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca1c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca207e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca207f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca23f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca23ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca21fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca21d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca205010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca25f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca25e450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca21e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca25ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca294860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca204290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca294d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca294bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca294fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca202db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca295370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2965a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca2ade80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2aed20> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca2af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2ae270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca2afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca2af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca296510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca04bbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca074740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0744a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca074680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca074fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca075910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0748c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca049d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca076d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca075a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca296750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca09f080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0c3440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca124230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca126990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca124350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0f1250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f29310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca0c2240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90ca077c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f90c9f295b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_w9do05di/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f90c9f7ef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f5de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f5d0a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f7d280> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9faa9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9faa750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9faa060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9faa4b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9f7fc20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9fab710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9fab950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9fabe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9915cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99178c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f90c9918260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9919400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c991be60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90ca207d10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c991a120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9923dd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99228a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9922600> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9922b70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c991a630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9922b10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c996dbe0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996d9a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99701d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9973980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9970380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9974770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9974950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9974a70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c996c2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99fc260> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c99fd070> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99769f0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9977da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9976630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c98052b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9806150> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99fd220> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9806870> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9807470> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f90c9811f40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c980d6d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c99027b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9fe2480> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9811c70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f90c9806030> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15330 1726882253.81554: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882253.81558: _low_level_execute_command(): starting 15330 1726882253.81560: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882253.3258646-15492-271986201395752/ > /dev/null 2>&1 && sleep 0' 15330 1726882253.81981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882253.81985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882253.81988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.81990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 15330 1726882253.81992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882253.82003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882253.82058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882253.82065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882253.82120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882253.84726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882253.84730: stdout chunk (state=3): >>><<< 15330 1726882253.84733: stderr chunk (state=3): >>><<< 15330 1726882253.84751: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882253.84763: handler run complete 15330 1726882253.84901: attempt loop complete, returning result 15330 1726882253.84904: _execute() done 15330 1726882253.84907: dumping result to json 15330 1726882253.84909: done dumping result, returning 15330 1726882253.84912: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [12673a56-9f93-e4fe-1358-000000000091] 15330 1726882253.84914: sending task result for task 12673a56-9f93-e4fe-1358-000000000091 15330 1726882253.84979: done sending task result for task 12673a56-9f93-e4fe-1358-000000000091 15330 1726882253.84982: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15330 1726882253.85052: no more pending results, returning what we have 15330 1726882253.85056: results queue empty 15330 1726882253.85057: checking for any_errors_fatal 15330 1726882253.85064: done checking for any_errors_fatal 15330 1726882253.85065: checking for max_fail_percentage 15330 1726882253.85067: done checking for max_fail_percentage 15330 1726882253.85067: checking to see if all hosts have failed and the running result is not ok 15330 1726882253.85068: done checking to see if all hosts have failed 15330 1726882253.85069: getting the remaining hosts for this loop 15330 1726882253.85070: done getting the remaining hosts for this loop 15330 1726882253.85074: getting the next task for host managed_node3 15330 1726882253.85080: done getting next task for host managed_node3 15330 1726882253.85083: ^ task is: TASK: Set flag to indicate system is ostree 15330 1726882253.85086: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882253.85089: getting variables 15330 1726882253.85097: in VariableManager get_vars() 15330 1726882253.85128: Calling all_inventory to load vars for managed_node3 15330 1726882253.85130: Calling groups_inventory to load vars for managed_node3 15330 1726882253.85134: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882253.85146: Calling all_plugins_play to load vars for managed_node3 15330 1726882253.85149: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882253.85153: Calling groups_plugins_play to load vars for managed_node3 15330 1726882253.85653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882253.85885: done with get_vars() 15330 1726882253.85899: done getting variables 15330 1726882253.86002: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:30:53 -0400 (0:00:00.583) 0:00:03.066 ****** 15330 1726882253.86031: entering _queue_task() for managed_node3/set_fact 15330 1726882253.86033: 
Creating lock for set_fact 15330 1726882253.86436: worker is 1 (out of 1 available) 15330 1726882253.86446: exiting _queue_task() for managed_node3/set_fact 15330 1726882253.86457: done queuing things up, now waiting for results queue to drain 15330 1726882253.86458: waiting for pending results... 15330 1726882253.86812: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 15330 1726882253.86817: in run() - task 12673a56-9f93-e4fe-1358-000000000092 15330 1726882253.86822: variable 'ansible_search_path' from source: unknown 15330 1726882253.86824: variable 'ansible_search_path' from source: unknown 15330 1726882253.86827: calling self._execute() 15330 1726882253.86866: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.86879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.86902: variable 'omit' from source: magic vars 15330 1726882253.87404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882253.87734: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882253.87791: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882253.87834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882253.87871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882253.87967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882253.88012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 
15330 1726882253.88045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882253.88107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882253.88217: Evaluated conditional (not __network_is_ostree is defined): True 15330 1726882253.88228: variable 'omit' from source: magic vars 15330 1726882253.88270: variable 'omit' from source: magic vars 15330 1726882253.88402: variable '__ostree_booted_stat' from source: set_fact 15330 1726882253.88542: variable 'omit' from source: magic vars 15330 1726882253.88545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882253.88548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882253.88551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882253.88571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882253.88586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882253.88623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882253.88632: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.88648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.88770: Set connection var ansible_pipelining to False 15330 1726882253.88789: Set connection var ansible_timeout to 10 15330 1726882253.88801: Set connection var ansible_connection to ssh 15330 1726882253.88809: Set 
connection var ansible_shell_type to sh 15330 1726882253.88821: Set connection var ansible_shell_executable to /bin/sh 15330 1726882253.88832: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882253.88869: variable 'ansible_shell_executable' from source: unknown 15330 1726882253.88902: variable 'ansible_connection' from source: unknown 15330 1726882253.88905: variable 'ansible_module_compression' from source: unknown 15330 1726882253.88907: variable 'ansible_shell_type' from source: unknown 15330 1726882253.88909: variable 'ansible_shell_executable' from source: unknown 15330 1726882253.88911: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.88913: variable 'ansible_pipelining' from source: unknown 15330 1726882253.88915: variable 'ansible_timeout' from source: unknown 15330 1726882253.88918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.89036: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882253.89085: variable 'omit' from source: magic vars 15330 1726882253.89088: starting attempt loop 15330 1726882253.89091: running the handler 15330 1726882253.89097: handler run complete 15330 1726882253.89100: attempt loop complete, returning result 15330 1726882253.89105: _execute() done 15330 1726882253.89198: dumping result to json 15330 1726882253.89202: done dumping result, returning 15330 1726882253.89205: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [12673a56-9f93-e4fe-1358-000000000092] 15330 1726882253.89207: sending task result for task 12673a56-9f93-e4fe-1358-000000000092 15330 1726882253.89268: done sending task result for task 
12673a56-9f93-e4fe-1358-000000000092 15330 1726882253.89272: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15330 1726882253.89337: no more pending results, returning what we have 15330 1726882253.89340: results queue empty 15330 1726882253.89342: checking for any_errors_fatal 15330 1726882253.89348: done checking for any_errors_fatal 15330 1726882253.89348: checking for max_fail_percentage 15330 1726882253.89350: done checking for max_fail_percentage 15330 1726882253.89351: checking to see if all hosts have failed and the running result is not ok 15330 1726882253.89352: done checking to see if all hosts have failed 15330 1726882253.89352: getting the remaining hosts for this loop 15330 1726882253.89353: done getting the remaining hosts for this loop 15330 1726882253.89357: getting the next task for host managed_node3 15330 1726882253.89365: done getting next task for host managed_node3 15330 1726882253.89368: ^ task is: TASK: Fix CentOS6 Base repo 15330 1726882253.89370: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882253.89375: getting variables 15330 1726882253.89377: in VariableManager get_vars() 15330 1726882253.89609: Calling all_inventory to load vars for managed_node3 15330 1726882253.89612: Calling groups_inventory to load vars for managed_node3 15330 1726882253.89615: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882253.89625: Calling all_plugins_play to load vars for managed_node3 15330 1726882253.89628: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882253.89637: Calling groups_plugins_play to load vars for managed_node3 15330 1726882253.89903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882253.90110: done with get_vars() 15330 1726882253.90119: done getting variables 15330 1726882253.90246: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:30:53 -0400 (0:00:00.042) 0:00:03.108 ****** 15330 1726882253.90274: entering _queue_task() for managed_node3/copy 15330 1726882253.90528: worker is 1 (out of 1 available) 15330 1726882253.90538: exiting _queue_task() for managed_node3/copy 15330 1726882253.90550: done queuing things up, now waiting for results queue to drain 15330 1726882253.90551: waiting for pending results... 
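The ostree detection traced above (TASK "Check if system is ostree" followed by TASK "Set flag to indicate system is ostree" in `el_repo_setup.yml`) corresponds to a task pair along these lines. This is a reconstruction from the log, not the verbatim file: the path `/run/ostree-booted`, the register name `__ostree_booted_stat`, the fact name `__network_is_ostree`, and the conditional `not __network_is_ostree is defined` all appear in the trace, while the exact YAML layout is assumed.

```yaml
# Sketch inferred from the trace; not the actual el_repo_setup.yml contents.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted            # path seen in the _execute_module(stat, ...) call
  register: __ostree_booted_stat        # name seen at "variable '__ostree_booted_stat' from source: set_fact"
  when: not __network_is_ostree is defined   # conditional shown evaluated True for the follow-up task

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # result in the log: false
  when: not __network_is_ostree is defined
```

The log's task result (`"__network_is_ostree": false`, `"changed": false`) is consistent with `stat.exists` being false because `/run/ostree-booted` is absent on a non-ostree host.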
15330 1726882253.90916: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 15330 1726882253.90921: in run() - task 12673a56-9f93-e4fe-1358-000000000094 15330 1726882253.90936: variable 'ansible_search_path' from source: unknown 15330 1726882253.90943: variable 'ansible_search_path' from source: unknown 15330 1726882253.90980: calling self._execute() 15330 1726882253.91061: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.91076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.91092: variable 'omit' from source: magic vars 15330 1726882253.91592: variable 'ansible_distribution' from source: facts 15330 1726882253.91662: Evaluated conditional (ansible_distribution == 'CentOS'): True 15330 1726882253.91754: variable 'ansible_distribution_major_version' from source: facts 15330 1726882253.91779: Evaluated conditional (ansible_distribution_major_version == '6'): False 15330 1726882253.91788: when evaluation is False, skipping this task 15330 1726882253.91800: _execute() done 15330 1726882253.91878: dumping result to json 15330 1726882253.91881: done dumping result, returning 15330 1726882253.91885: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [12673a56-9f93-e4fe-1358-000000000094] 15330 1726882253.91887: sending task result for task 12673a56-9f93-e4fe-1358-000000000094 15330 1726882253.91957: done sending task result for task 12673a56-9f93-e4fe-1358-000000000094 15330 1726882253.91961: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15330 1726882253.92047: no more pending results, returning what we have 15330 1726882253.92051: results queue empty 15330 1726882253.92052: checking for any_errors_fatal 15330 1726882253.92057: done checking for any_errors_fatal 15330 1726882253.92058: checking for 
max_fail_percentage 15330 1726882253.92059: done checking for max_fail_percentage 15330 1726882253.92060: checking to see if all hosts have failed and the running result is not ok 15330 1726882253.92061: done checking to see if all hosts have failed 15330 1726882253.92062: getting the remaining hosts for this loop 15330 1726882253.92063: done getting the remaining hosts for this loop 15330 1726882253.92066: getting the next task for host managed_node3 15330 1726882253.92073: done getting next task for host managed_node3 15330 1726882253.92075: ^ task is: TASK: Include the task 'enable_epel.yml' 15330 1726882253.92078: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882253.92082: getting variables 15330 1726882253.92084: in VariableManager get_vars() 15330 1726882253.92209: Calling all_inventory to load vars for managed_node3 15330 1726882253.92212: Calling groups_inventory to load vars for managed_node3 15330 1726882253.92222: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882253.92234: Calling all_plugins_play to load vars for managed_node3 15330 1726882253.92237: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882253.92241: Calling groups_plugins_play to load vars for managed_node3 15330 1726882253.92514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882253.92722: done with get_vars() 15330 1726882253.92731: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:30:53 -0400 (0:00:00.025) 0:00:03.134 ****** 15330 1726882253.92828: entering _queue_task() for managed_node3/include_tasks 15330 1726882253.93065: worker is 1 (out of 1 available) 15330 1726882253.93076: exiting _queue_task() for managed_node3/include_tasks 15330 1726882253.93205: done queuing things up, now waiting for results queue to drain 15330 1726882253.93207: waiting for pending results... 
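The "Fix CentOS6 Base repo" skip traced above (conditional `ansible_distribution == 'CentOS'` evaluated True, `ansible_distribution_major_version == '6'` evaluated False) implies a guarded task of roughly this shape. Only the conditionals and the `copy` action plugin are visible in the trace; the module arguments below are placeholders, not recovered from the log.

```yaml
# Sketch inferred from the skip at el_repo_setup.yml:26; copy arguments are assumed.
- name: Fix CentOS6 Base repo
  copy:                                  # the log loads ActionModule 'copy' for this task
    dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical destination, not shown in the trace
    content: ...                         # elided; not recoverable from the trace
  when:
    - ansible_distribution == 'CentOS'            # evaluated True
    - ansible_distribution_major_version == '6'   # evaluated False -> task skipped
```

When any clause in a `when` list is false, Ansible reports the task as skipped with the failing clause in `false_condition`, exactly as seen in the result above.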
15330 1726882253.93435: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 15330 1726882253.93439: in run() - task 12673a56-9f93-e4fe-1358-000000000095 15330 1726882253.93457: variable 'ansible_search_path' from source: unknown 15330 1726882253.93465: variable 'ansible_search_path' from source: unknown 15330 1726882253.93509: calling self._execute() 15330 1726882253.93592: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882253.93639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882253.93643: variable 'omit' from source: magic vars 15330 1726882253.94173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882253.96388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882253.96477: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882253.96586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882253.96590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882253.96592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882253.96672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882253.96720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882253.96750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882253.96804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882253.96830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882253.96956: variable '__network_is_ostree' from source: set_fact 15330 1726882253.96977: Evaluated conditional (not __network_is_ostree | d(false)): True 15330 1726882253.97023: _execute() done 15330 1726882253.97026: dumping result to json 15330 1726882253.97029: done dumping result, returning 15330 1726882253.97031: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-e4fe-1358-000000000095] 15330 1726882253.97033: sending task result for task 12673a56-9f93-e4fe-1358-000000000095 15330 1726882253.97333: done sending task result for task 12673a56-9f93-e4fe-1358-000000000095 15330 1726882253.97336: WORKER PROCESS EXITING 15330 1726882253.97363: no more pending results, returning what we have 15330 1726882253.97368: in VariableManager get_vars() 15330 1726882253.97401: Calling all_inventory to load vars for managed_node3 15330 1726882253.97403: Calling groups_inventory to load vars for managed_node3 15330 1726882253.97407: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882253.97416: Calling all_plugins_play to load vars for managed_node3 15330 1726882253.97419: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882253.97422: Calling groups_plugins_play to load vars for managed_node3 15330 1726882253.97737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 15330 1726882253.97948: done with get_vars() 15330 1726882253.97955: variable 'ansible_search_path' from source: unknown 15330 1726882253.97956: variable 'ansible_search_path' from source: unknown 15330 1726882253.98003: we have included files to process 15330 1726882253.98004: generating all_blocks data 15330 1726882253.98006: done generating all_blocks data 15330 1726882253.98011: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15330 1726882253.98013: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15330 1726882253.98016: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15330 1726882253.98753: done processing included file 15330 1726882253.98755: iterating over new_blocks loaded from include file 15330 1726882253.98871: in VariableManager get_vars() 15330 1726882253.98883: done with get_vars() 15330 1726882253.98885: filtering new block on tags 15330 1726882253.98910: done filtering new block on tags 15330 1726882253.98913: in VariableManager get_vars() 15330 1726882253.98987: done with get_vars() 15330 1726882253.98989: filtering new block on tags 15330 1726882253.99005: done filtering new block on tags 15330 1726882253.99007: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 15330 1726882253.99012: extending task lists for all hosts with included blocks 15330 1726882253.99280: done extending task lists 15330 1726882253.99282: done processing included files 15330 1726882253.99283: results queue empty 15330 1726882253.99283: checking for any_errors_fatal 15330 1726882253.99286: done checking for any_errors_fatal 15330 1726882253.99287: checking for max_fail_percentage 15330 1726882253.99288: done 
checking for max_fail_percentage 15330 1726882253.99289: checking to see if all hosts have failed and the running result is not ok 15330 1726882253.99289: done checking to see if all hosts have failed 15330 1726882253.99290: getting the remaining hosts for this loop 15330 1726882253.99291: done getting the remaining hosts for this loop 15330 1726882253.99297: getting the next task for host managed_node3 15330 1726882253.99302: done getting next task for host managed_node3 15330 1726882253.99304: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15330 1726882253.99307: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882253.99309: getting variables 15330 1726882253.99310: in VariableManager get_vars() 15330 1726882253.99317: Calling all_inventory to load vars for managed_node3 15330 1726882253.99319: Calling groups_inventory to load vars for managed_node3 15330 1726882253.99321: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882253.99326: Calling all_plugins_play to load vars for managed_node3 15330 1726882253.99333: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882253.99336: Calling groups_plugins_play to load vars for managed_node3 15330 1726882253.99610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.00167: done with get_vars() 15330 1726882254.00176: done getting variables 15330 1726882254.00245: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15330 1726882254.00544: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:30:54 -0400 (0:00:00.077) 0:00:03.211 ****** 15330 1726882254.00590: entering _queue_task() for managed_node3/command 15330 1726882254.00592: Creating lock for command 15330 1726882254.00932: worker is 1 (out of 1 available) 15330 1726882254.00945: exiting _queue_task() for managed_node3/command 15330 1726882254.00956: done queuing things up, now waiting for results queue to drain 15330 1726882254.00958: waiting for pending results... 
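The include traced above (task path `el_repo_setup.yml:51`, conditional `not __network_is_ostree | d(false)` evaluated True) likely corresponds to a single `include_tasks` entry. The relative path is an assumption based on both files sitting in the same `tests/network/tasks/` directory in the trace.

```yaml
# Sketch of the include at el_repo_setup.yml:51, inferred from the trace.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml          # resolved in the log to .../tests/network/tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)   # evaluated True, so the file's blocks are loaded
```

Because `include_tasks` is a dynamic include, the log shows its blocks being loaded, tag-filtered, and appended to the host's task list ("extending task lists for all hosts with included blocks") only after the condition passes.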
15330 1726882254.01145: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 15330 1726882254.01257: in run() - task 12673a56-9f93-e4fe-1358-0000000000af 15330 1726882254.01302: variable 'ansible_search_path' from source: unknown 15330 1726882254.01306: variable 'ansible_search_path' from source: unknown 15330 1726882254.01324: calling self._execute() 15330 1726882254.01401: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.01499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.01503: variable 'omit' from source: magic vars 15330 1726882254.01774: variable 'ansible_distribution' from source: facts 15330 1726882254.01794: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15330 1726882254.01923: variable 'ansible_distribution_major_version' from source: facts 15330 1726882254.01935: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15330 1726882254.01943: when evaluation is False, skipping this task 15330 1726882254.01951: _execute() done 15330 1726882254.01958: dumping result to json 15330 1726882254.01966: done dumping result, returning 15330 1726882254.01977: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [12673a56-9f93-e4fe-1358-0000000000af] 15330 1726882254.01986: sending task result for task 12673a56-9f93-e4fe-1358-0000000000af skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15330 1726882254.02248: no more pending results, returning what we have 15330 1726882254.02252: results queue empty 15330 1726882254.02253: checking for any_errors_fatal 15330 1726882254.02254: done checking for any_errors_fatal 15330 1726882254.02255: checking for max_fail_percentage 15330 1726882254.02257: done checking for max_fail_percentage 15330 1726882254.02257: checking to see if all hosts have failed 
and the running result is not ok 15330 1726882254.02258: done checking to see if all hosts have failed 15330 1726882254.02259: getting the remaining hosts for this loop 15330 1726882254.02260: done getting the remaining hosts for this loop 15330 1726882254.02264: getting the next task for host managed_node3 15330 1726882254.02270: done getting next task for host managed_node3 15330 1726882254.02273: ^ task is: TASK: Install yum-utils package 15330 1726882254.02276: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882254.02281: getting variables 15330 1726882254.02283: in VariableManager get_vars() 15330 1726882254.02472: Calling all_inventory to load vars for managed_node3 15330 1726882254.02474: Calling groups_inventory to load vars for managed_node3 15330 1726882254.02478: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.02484: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000af 15330 1726882254.02486: WORKER PROCESS EXITING 15330 1726882254.02498: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.02501: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.02504: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.02683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.02864: done with get_vars() 15330 1726882254.02872: done getting variables 15330 1726882254.02966: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:30:54 -0400 (0:00:00.023) 0:00:03.235 ****** 15330 1726882254.02994: entering _queue_task() for managed_node3/package 15330 1726882254.02996: Creating lock for package 15330 1726882254.03247: worker is 1 (out of 1 available) 15330 1726882254.03257: exiting _queue_task() for managed_node3/package 15330 1726882254.03267: done queuing things up, now waiting for results queue to drain 15330 1726882254.03269: waiting for pending results... 
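The repeated `skipping:` results in this log all follow one pattern: each task's `when` list is evaluated in order against gathered facts, and the first conditional that comes out False is reported back as `false_condition`. A minimal Python sketch of that decision, not Ansible's actual implementation (Ansible evaluates Jinja2 templates, not Python expressions), with fact values that are assumptions inferred from the "Create EPEL 10" task name and the skips above:

```python
# Hedged sketch of the skip decision seen in the log above.
# Fact values are assumptions inferred from this log, not ground truth.
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",
}

# The two conditionals reported by "Evaluated conditional (...)" above.
conditionals = [
    "ansible_distribution in ['RedHat', 'CentOS']",
    "ansible_distribution_major_version in ['7', '8']",
]

def evaluate_when(conditionals, facts):
    """Evaluate each conditional in order; return the first one that
    fails (mirroring the 'false_condition' field in the log), or None
    if every conditional holds."""
    for cond in conditionals:
        # Illustration only: real Ansible templates these with Jinja2,
        # never with Python eval().
        if not eval(cond, {}, facts):
            return cond
    return None

false_condition = evaluate_when(conditionals, facts)
if false_condition is not None:
    # Shape of the per-host result dict shown in the log's "skipping:" lines.
    result = {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }
```

The first conditional passes and the second fails, which is why every skip above names only the major-version test as `false_condition` even though two conditionals were evaluated.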
15330 1726882254.03502: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 15330 1726882254.03608: in run() - task 12673a56-9f93-e4fe-1358-0000000000b0 15330 1726882254.03631: variable 'ansible_search_path' from source: unknown 15330 1726882254.03639: variable 'ansible_search_path' from source: unknown 15330 1726882254.03680: calling self._execute() 15330 1726882254.03760: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.03771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.03783: variable 'omit' from source: magic vars 15330 1726882254.04133: variable 'ansible_distribution' from source: facts 15330 1726882254.04161: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15330 1726882254.04276: variable 'ansible_distribution_major_version' from source: facts 15330 1726882254.04301: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15330 1726882254.04304: when evaluation is False, skipping this task 15330 1726882254.04307: _execute() done 15330 1726882254.04309: dumping result to json 15330 1726882254.04502: done dumping result, returning 15330 1726882254.04506: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [12673a56-9f93-e4fe-1358-0000000000b0] 15330 1726882254.04509: sending task result for task 12673a56-9f93-e4fe-1358-0000000000b0 15330 1726882254.04571: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000b0 15330 1726882254.04574: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15330 1726882254.04613: no more pending results, returning what we have 15330 1726882254.04616: results queue empty 15330 1726882254.04617: checking for any_errors_fatal 15330 1726882254.04622: done checking for any_errors_fatal 15330 
1726882254.04623: checking for max_fail_percentage 15330 1726882254.04624: done checking for max_fail_percentage 15330 1726882254.04625: checking to see if all hosts have failed and the running result is not ok 15330 1726882254.04626: done checking to see if all hosts have failed 15330 1726882254.04626: getting the remaining hosts for this loop 15330 1726882254.04628: done getting the remaining hosts for this loop 15330 1726882254.04630: getting the next task for host managed_node3 15330 1726882254.04636: done getting next task for host managed_node3 15330 1726882254.04638: ^ task is: TASK: Enable EPEL 7 15330 1726882254.04642: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882254.04646: getting variables 15330 1726882254.04647: in VariableManager get_vars() 15330 1726882254.04676: Calling all_inventory to load vars for managed_node3 15330 1726882254.04678: Calling groups_inventory to load vars for managed_node3 15330 1726882254.04682: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.04695: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.04700: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.04704: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.04942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.05127: done with get_vars() 15330 1726882254.05136: done getting variables 15330 1726882254.05190: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:30:54 -0400 (0:00:00.022) 0:00:03.258 ****** 15330 1726882254.05223: entering _queue_task() for managed_node3/command 15330 1726882254.05456: worker is 1 (out of 1 available) 15330 1726882254.05467: exiting _queue_task() for managed_node3/command 15330 1726882254.05478: done queuing things up, now waiting for results queue to drain 15330 1726882254.05479: waiting for pending results... 
15330 1726882254.05718: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 15330 1726882254.05826: in run() - task 12673a56-9f93-e4fe-1358-0000000000b1 15330 1726882254.05844: variable 'ansible_search_path' from source: unknown 15330 1726882254.05852: variable 'ansible_search_path' from source: unknown 15330 1726882254.05899: calling self._execute() 15330 1726882254.05964: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.06098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.06102: variable 'omit' from source: magic vars 15330 1726882254.06406: variable 'ansible_distribution' from source: facts 15330 1726882254.06424: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15330 1726882254.06558: variable 'ansible_distribution_major_version' from source: facts 15330 1726882254.06569: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15330 1726882254.06576: when evaluation is False, skipping this task 15330 1726882254.06583: _execute() done 15330 1726882254.06589: dumping result to json 15330 1726882254.06600: done dumping result, returning 15330 1726882254.06611: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [12673a56-9f93-e4fe-1358-0000000000b1] 15330 1726882254.06620: sending task result for task 12673a56-9f93-e4fe-1358-0000000000b1 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15330 1726882254.06802: no more pending results, returning what we have 15330 1726882254.06806: results queue empty 15330 1726882254.06807: checking for any_errors_fatal 15330 1726882254.06813: done checking for any_errors_fatal 15330 1726882254.06814: checking for max_fail_percentage 15330 1726882254.06816: done checking for max_fail_percentage 15330 1726882254.06817: checking to see if all hosts have failed and 
the running result is not ok 15330 1726882254.06818: done checking to see if all hosts have failed 15330 1726882254.06819: getting the remaining hosts for this loop 15330 1726882254.06821: done getting the remaining hosts for this loop 15330 1726882254.06825: getting the next task for host managed_node3 15330 1726882254.06832: done getting next task for host managed_node3 15330 1726882254.06834: ^ task is: TASK: Enable EPEL 8 15330 1726882254.06839: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882254.06843: getting variables 15330 1726882254.06845: in VariableManager get_vars() 15330 1726882254.06875: Calling all_inventory to load vars for managed_node3 15330 1726882254.06877: Calling groups_inventory to load vars for managed_node3 15330 1726882254.06882: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.06896: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.06899: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.06903: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.07288: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000b1 15330 1726882254.07292: WORKER PROCESS EXITING 15330 1726882254.07319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.07496: done with get_vars() 15330 1726882254.07505: done getting variables 15330 1726882254.07560: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:30:54 -0400 (0:00:00.023) 0:00:03.281 ****** 15330 1726882254.07588: entering _queue_task() for managed_node3/command 15330 1726882254.07820: worker is 1 (out of 1 available) 15330 1726882254.07832: exiting _queue_task() for managed_node3/command 15330 1726882254.07844: done queuing things up, now waiting for results queue to drain 15330 1726882254.07845: waiting for pending results... 
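The `^ state is: HOST STATE: ...` dumps above are nested cursors: an outer block/task position with recursive `tasks child state?` children, and only the innermost `task=` counter advances between consecutive tasks (compare the "Enable EPEL 7" dump, innermost `task=3`, with the "Enable EPEL 8" dump, innermost `task=4`). A hedged sketch of that structure, with field names taken from the log output rather than from Ansible's real strategy-plugin classes:

```python
# Illustration of the nested "HOST STATE" cursors printed in the log.
# Not ansible's real HostState class; fields mirror the log text only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    block: int = 0
    task: int = 0
    run_state: int = 1
    fail_state: int = 0
    tasks_child_state: Optional["HostState"] = None

    def advance_task(self) -> None:
        """Move the innermost cursor to the next task, matching how only
        the deepest 'task=' counter increments between log entries."""
        if self.tasks_child_state is not None:
            self.tasks_child_state.advance_task()
        else:
            self.task += 1

# State as dumped at "Enable EPEL 7": block=2/task=4 outermost, with two
# nested child cursors, the innermost at task=3.
state = HostState(
    block=2, task=4,
    tasks_child_state=HostState(
        block=0, task=3,
        tasks_child_state=HostState(block=0, task=3),
    ),
)
state.advance_task()
# Now matches the "Enable EPEL 8" dump: outer cursors unchanged,
# innermost cursor at task=4.
```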
15330 1726882254.08070: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 15330 1726882254.08183: in run() - task 12673a56-9f93-e4fe-1358-0000000000b2 15330 1726882254.08208: variable 'ansible_search_path' from source: unknown 15330 1726882254.08216: variable 'ansible_search_path' from source: unknown 15330 1726882254.08252: calling self._execute() 15330 1726882254.08331: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.08342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.08356: variable 'omit' from source: magic vars 15330 1726882254.08717: variable 'ansible_distribution' from source: facts 15330 1726882254.08739: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15330 1726882254.08868: variable 'ansible_distribution_major_version' from source: facts 15330 1726882254.08880: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15330 1726882254.08888: when evaluation is False, skipping this task 15330 1726882254.08899: _execute() done 15330 1726882254.08907: dumping result to json 15330 1726882254.08915: done dumping result, returning 15330 1726882254.08925: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [12673a56-9f93-e4fe-1358-0000000000b2] 15330 1726882254.08934: sending task result for task 12673a56-9f93-e4fe-1358-0000000000b2 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15330 1726882254.09113: no more pending results, returning what we have 15330 1726882254.09116: results queue empty 15330 1726882254.09117: checking for any_errors_fatal 15330 1726882254.09122: done checking for any_errors_fatal 15330 1726882254.09123: checking for max_fail_percentage 15330 1726882254.09125: done checking for max_fail_percentage 15330 1726882254.09126: checking to see if all hosts have failed and 
the running result is not ok 15330 1726882254.09127: done checking to see if all hosts have failed 15330 1726882254.09127: getting the remaining hosts for this loop 15330 1726882254.09128: done getting the remaining hosts for this loop 15330 1726882254.09132: getting the next task for host managed_node3 15330 1726882254.09142: done getting next task for host managed_node3 15330 1726882254.09145: ^ task is: TASK: Enable EPEL 6 15330 1726882254.09150: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882254.09153: getting variables 15330 1726882254.09155: in VariableManager get_vars() 15330 1726882254.09184: Calling all_inventory to load vars for managed_node3 15330 1726882254.09187: Calling groups_inventory to load vars for managed_node3 15330 1726882254.09191: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.09205: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.09209: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.09212: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.09556: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000b2 15330 1726882254.09559: WORKER PROCESS EXITING 15330 1726882254.09582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.09769: done with get_vars() 15330 1726882254.09780: done getting variables 15330 1726882254.09837: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:30:54 -0400 (0:00:00.022) 0:00:03.304 ****** 15330 1726882254.09865: entering _queue_task() for managed_node3/copy 15330 1726882254.10322: worker is 1 (out of 1 available) 15330 1726882254.10326: exiting _queue_task() for managed_node3/copy 15330 1726882254.10335: done queuing things up, now waiting for results queue to drain 15330 1726882254.10337: waiting for pending results... 
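The "Enable EPEL 6" task just queued (via the `copy` action) is gated differently from the tasks before it: where the EPEL 7/8 tasks use a membership test over two versions, this one uses an equality check against exactly `'6'`. A small sketch of the two conditional shapes as they appear in this log's `false_condition` fields; the fact value is an assumption, and on this host both tests come out False, which is why every EPEL task in the run is skipped:

```python
# Hedged sketch comparing the two conditional shapes in this log.
# The major-version value is an assumption inferred from the skips.
facts = {"ansible_distribution_major_version": "10"}

# Membership test used by the yum-utils / EPEL 7 / EPEL 8 tasks:
epel78_applies = facts["ansible_distribution_major_version"] in ["7", "8"]

# Equality test used by the "Enable EPEL 6" task:
epel6_applies = facts["ansible_distribution_major_version"] == "6"
```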
15330 1726882254.10313: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 15330 1726882254.10438: in run() - task 12673a56-9f93-e4fe-1358-0000000000b4 15330 1726882254.10458: variable 'ansible_search_path' from source: unknown 15330 1726882254.10465: variable 'ansible_search_path' from source: unknown 15330 1726882254.10504: calling self._execute() 15330 1726882254.10578: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.10589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.10606: variable 'omit' from source: magic vars 15330 1726882254.11007: variable 'ansible_distribution' from source: facts 15330 1726882254.11024: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15330 1726882254.11136: variable 'ansible_distribution_major_version' from source: facts 15330 1726882254.11147: Evaluated conditional (ansible_distribution_major_version == '6'): False 15330 1726882254.11155: when evaluation is False, skipping this task 15330 1726882254.11163: _execute() done 15330 1726882254.11170: dumping result to json 15330 1726882254.11179: done dumping result, returning 15330 1726882254.11190: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [12673a56-9f93-e4fe-1358-0000000000b4] 15330 1726882254.11204: sending task result for task 12673a56-9f93-e4fe-1358-0000000000b4 15330 1726882254.11298: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000b4 15330 1726882254.11305: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15330 1726882254.11356: no more pending results, returning what we have 15330 1726882254.11359: results queue empty 15330 1726882254.11359: checking for any_errors_fatal 15330 1726882254.11365: done checking for any_errors_fatal 15330 1726882254.11366: checking for max_fail_percentage 
15330 1726882254.11368: done checking for max_fail_percentage 15330 1726882254.11369: checking to see if all hosts have failed and the running result is not ok 15330 1726882254.11369: done checking to see if all hosts have failed 15330 1726882254.11370: getting the remaining hosts for this loop 15330 1726882254.11371: done getting the remaining hosts for this loop 15330 1726882254.11375: getting the next task for host managed_node3 15330 1726882254.11383: done getting next task for host managed_node3 15330 1726882254.11386: ^ task is: TASK: Set network provider to 'nm' 15330 1726882254.11388: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882254.11392: getting variables 15330 1726882254.11397: in VariableManager get_vars() 15330 1726882254.11423: Calling all_inventory to load vars for managed_node3 15330 1726882254.11425: Calling groups_inventory to load vars for managed_node3 15330 1726882254.11429: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.11441: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.11444: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.11447: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.11825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.11992: done with get_vars() 15330 1726882254.12004: done getting variables 15330 1726882254.12059: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Friday 20 September 2024 21:30:54 -0400 (0:00:00.022) 0:00:03.326 ****** 15330 1726882254.12084: entering _queue_task() for managed_node3/set_fact 15330 1726882254.12295: worker is 1 (out of 1 available) 15330 1726882254.12503: exiting _queue_task() for managed_node3/set_fact 15330 1726882254.12513: done queuing things up, now waiting for results queue to drain 15330 1726882254.12514: waiting for pending results... 15330 1726882254.12536: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 15330 1726882254.12613: in run() - task 12673a56-9f93-e4fe-1358-000000000007 15330 1726882254.12636: variable 'ansible_search_path' from source: unknown 15330 1726882254.12674: calling self._execute() 15330 1726882254.12846: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.12849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.12851: variable 'omit' from source: magic vars 15330 1726882254.12872: variable 'omit' from source: magic vars 15330 1726882254.12906: variable 'omit' from source: magic vars 15330 1726882254.12938: variable 'omit' from source: magic vars 15330 1726882254.12984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882254.13029: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882254.13054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882254.13079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882254.13096: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882254.13124: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882254.13131: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.13137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.13242: Set connection var ansible_pipelining to False 15330 1726882254.13262: Set connection var ansible_timeout to 10 15330 1726882254.13270: Set connection var ansible_connection to ssh 15330 1726882254.13281: Set connection var ansible_shell_type to sh 15330 1726882254.13290: Set connection var ansible_shell_executable to /bin/sh 15330 1726882254.13300: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882254.13388: variable 'ansible_shell_executable' from source: unknown 15330 1726882254.13391: variable 'ansible_connection' from source: unknown 15330 1726882254.13395: variable 'ansible_module_compression' from source: unknown 15330 1726882254.13398: variable 'ansible_shell_type' from source: unknown 15330 1726882254.13400: variable 'ansible_shell_executable' from source: unknown 15330 1726882254.13403: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882254.13405: variable 'ansible_pipelining' from source: unknown 15330 1726882254.13407: variable 'ansible_timeout' from source: unknown 15330 1726882254.13409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882254.13500: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882254.13515: variable 'omit' from source: magic vars 15330 1726882254.13523: starting 
attempt loop 15330 1726882254.13529: running the handler 15330 1726882254.13541: handler run complete 15330 1726882254.13551: attempt loop complete, returning result 15330 1726882254.13557: _execute() done 15330 1726882254.13561: dumping result to json 15330 1726882254.13567: done dumping result, returning 15330 1726882254.13575: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [12673a56-9f93-e4fe-1358-000000000007] 15330 1726882254.13582: sending task result for task 12673a56-9f93-e4fe-1358-000000000007 ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15330 1726882254.13751: no more pending results, returning what we have 15330 1726882254.13753: results queue empty 15330 1726882254.13754: checking for any_errors_fatal 15330 1726882254.13759: done checking for any_errors_fatal 15330 1726882254.13760: checking for max_fail_percentage 15330 1726882254.13761: done checking for max_fail_percentage 15330 1726882254.13762: checking to see if all hosts have failed and the running result is not ok 15330 1726882254.13763: done checking to see if all hosts have failed 15330 1726882254.13763: getting the remaining hosts for this loop 15330 1726882254.13764: done getting the remaining hosts for this loop 15330 1726882254.13768: getting the next task for host managed_node3 15330 1726882254.13773: done getting next task for host managed_node3 15330 1726882254.13775: ^ task is: TASK: meta (flush_handlers) 15330 1726882254.13777: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882254.13780: getting variables 15330 1726882254.13782: in VariableManager get_vars() 15330 1726882254.13810: Calling all_inventory to load vars for managed_node3 15330 1726882254.13812: Calling groups_inventory to load vars for managed_node3 15330 1726882254.13815: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.13827: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.13830: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.13833: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.14156: done sending task result for task 12673a56-9f93-e4fe-1358-000000000007 15330 1726882254.14160: WORKER PROCESS EXITING 15330 1726882254.14182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.14540: done with get_vars() 15330 1726882254.14548: done getting variables 15330 1726882254.14607: in VariableManager get_vars() 15330 1726882254.14615: Calling all_inventory to load vars for managed_node3 15330 1726882254.14617: Calling groups_inventory to load vars for managed_node3 15330 1726882254.14619: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882254.14623: Calling all_plugins_play to load vars for managed_node3 15330 1726882254.14625: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882254.14628: Calling groups_plugins_play to load vars for managed_node3 15330 1726882254.14750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882254.14928: done with get_vars() 15330 1726882254.14940: done queuing things up, now waiting for results queue to drain 15330 1726882254.14941: results queue empty 15330 1726882254.14942: checking for any_errors_fatal 15330 1726882254.14944: done checking for any_errors_fatal 15330 1726882254.14944: checking for max_fail_percentage 15330 
1726882254.14945: done checking for max_fail_percentage
15330 1726882254.14947: checking to see if all hosts have failed and the running result is not ok
15330 1726882254.14948: done checking to see if all hosts have failed
15330 1726882254.14949: getting the remaining hosts for this loop
15330 1726882254.14950: done getting the remaining hosts for this loop
15330 1726882254.14952: getting the next task for host managed_node3
15330 1726882254.14955: done getting next task for host managed_node3
15330 1726882254.14957: ^ task is: TASK: meta (flush_handlers)
15330 1726882254.14958: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882254.14966: getting variables
15330 1726882254.14967: in VariableManager get_vars()
15330 1726882254.14974: Calling all_inventory to load vars for managed_node3
15330 1726882254.14976: Calling groups_inventory to load vars for managed_node3
15330 1726882254.14978: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882254.14982: Calling all_plugins_play to load vars for managed_node3
15330 1726882254.14985: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882254.14988: Calling groups_plugins_play to load vars for managed_node3
15330 1726882254.15114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882254.15308: done with get_vars()
15330 1726882254.15315: done getting variables
15330 1726882254.15357: in VariableManager get_vars()
15330 1726882254.15365: Calling all_inventory to load vars for managed_node3
15330 1726882254.15367: Calling groups_inventory to load vars for managed_node3
15330 1726882254.15369: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882254.15373: Calling all_plugins_play to load vars for managed_node3
15330 1726882254.15375: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882254.15378: Calling groups_plugins_play to load vars for managed_node3
15330 1726882254.15506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882254.15682: done with get_vars()
15330 1726882254.15695: done queuing things up, now waiting for results queue to drain
15330 1726882254.15697: results queue empty
15330 1726882254.15697: checking for any_errors_fatal
15330 1726882254.15698: done checking for any_errors_fatal
15330 1726882254.15699: checking for max_fail_percentage
15330 1726882254.15700: done checking for max_fail_percentage
15330 1726882254.15701: checking to see if all hosts have failed and the running result is not ok
15330 1726882254.15702: done checking to see if all hosts have failed
15330 1726882254.15702: getting the remaining hosts for this loop
15330 1726882254.15703: done getting the remaining hosts for this loop
15330 1726882254.15705: getting the next task for host managed_node3
15330 1726882254.15707: done getting next task for host managed_node3
15330 1726882254.15708: ^ task is: None
15330 1726882254.15709: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882254.15710: done queuing things up, now waiting for results queue to drain
15330 1726882254.15711: results queue empty
15330 1726882254.15712: checking for any_errors_fatal
15330 1726882254.15713: done checking for any_errors_fatal
15330 1726882254.15713: checking for max_fail_percentage
15330 1726882254.15714: done checking for max_fail_percentage
15330 1726882254.15715: checking to see if all hosts have failed and the running result is not ok
15330 1726882254.15715: done checking to see if all hosts have failed
15330 1726882254.15717: getting the next task for host managed_node3
15330 1726882254.15719: done getting next task for host managed_node3
15330 1726882254.15720: ^ task is: None
15330 1726882254.15721: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882254.15767: in VariableManager get_vars()
15330 1726882254.15781: done with get_vars()
15330 1726882254.15787: in VariableManager get_vars()
15330 1726882254.15800: done with get_vars()
15330 1726882254.15804: variable 'omit' from source: magic vars
15330 1726882254.15834: in VariableManager get_vars()
15330 1726882254.15844: done with get_vars()
15330 1726882254.15864: variable 'omit' from source: magic vars

PLAY [Test configuring bridges] ************************************************
15330 1726882254.16037: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
15330 1726882254.16064: getting the remaining hosts for this loop
15330 1726882254.16065: done getting the remaining hosts for this loop
15330 1726882254.16067: getting the next task for host managed_node3
15330 1726882254.16070: done getting next task for host managed_node3
15330 1726882254.16072: ^ task is: TASK: Gathering Facts
15330 1726882254.16073: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882254.16075: getting variables
15330 1726882254.16076: in VariableManager get_vars()
15330 1726882254.16083: Calling all_inventory to load vars for managed_node3
15330 1726882254.16085: Calling groups_inventory to load vars for managed_node3
15330 1726882254.16088: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882254.16092: Calling all_plugins_play to load vars for managed_node3
15330 1726882254.16109: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882254.16112: Calling groups_plugins_play to load vars for managed_node3
15330 1726882254.16274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882254.16450: done with get_vars()
15330 1726882254.16457: done getting variables
15330 1726882254.16496: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
Friday 20 September 2024 21:30:54 -0400 (0:00:00.044) 0:00:03.371 ******
15330 1726882254.16519: entering _queue_task() for managed_node3/gather_facts
15330 1726882254.16789: worker is 1 (out of 1 available)
15330 1726882254.16999: exiting _queue_task() for managed_node3/gather_facts
15330 1726882254.17008: done queuing things up, now waiting for results queue to drain
15330 1726882254.17009: waiting for pending results...
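The TaskExecutor run that follows drives a sequence of `_low_level_execute_command()` calls: `echo ~ && sleep 0` to resolve the remote user's home directory, then a `( umask 77 && mkdir ... )` compound command that creates a private per-task temp directory and echoes back a `name=value` pair the controller parses to learn the created path. Below is a minimal local sketch of that shell pattern (an illustration run against the local shell rather than over SSH; the `run()` helper and the `-demo` directory name are assumptions for the example, not Ansible's actual internals):

```python
import subprocess
import time

def run(cmd):
    """Run a command through /bin/sh -c, echoing what
    _low_level_execute_command() does locally."""
    proc = subprocess.run(["/bin/sh", "-c", cmd],
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# Step 1: resolve the user's home directory ('echo ~ && sleep 0').
rc, home, _ = run("echo ~ && sleep 0")
home = home.strip()

# Step 2: create a private temp directory under ~/.ansible/tmp with
# umask 77, echoing back a name=value pair for the caller to parse
# (directory name here is illustrative, not Ansible's exact format).
tmpname = f"ansible-tmp-{time.time()}-demo"
cmd = (
    f'( umask 77 && mkdir -p "{home}/.ansible/tmp" '
    f'&& mkdir "{home}/.ansible/tmp/{tmpname}" '
    f'&& echo {tmpname}="{home}/.ansible/tmp/{tmpname}" ) && sleep 0'
)
rc, out, _ = run(cmd)

# Parse the echoed name=value line to recover the created path, as the
# controller does with the 'ansible-tmp-...=...' line in the log below.
tmpdir = out.strip().split("=", 1)[1]
```

After this point the trace shows the same pattern continuing: the AnsiballZ payload is transferred into the temp directory over sftp, `chmod u+x` is applied, and the module is executed with the remote Python interpreter.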
15330 1726882254.17052: running TaskExecutor() for managed_node3/TASK: Gathering Facts
15330 1726882254.17147: in run() - task 12673a56-9f93-e4fe-1358-0000000000da
15330 1726882254.17167: variable 'ansible_search_path' from source: unknown
15330 1726882254.17209: calling self._execute()
15330 1726882254.17342: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882254.17345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882254.17348: variable 'omit' from source: magic vars
15330 1726882254.17681: variable 'ansible_distribution_major_version' from source: facts
15330 1726882254.17701: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882254.17711: variable 'omit' from source: magic vars
15330 1726882254.17743: variable 'omit' from source: magic vars
15330 1726882254.17786: variable 'omit' from source: magic vars
15330 1726882254.17830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15330 1726882254.17870: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15330 1726882254.17902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15330 1726882254.18000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882254.18004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882254.18007: variable 'inventory_hostname' from source: host vars for 'managed_node3'
15330 1726882254.18009: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882254.18011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882254.18091: Set connection var ansible_pipelining to False
15330 1726882254.18115: Set connection var ansible_timeout to 10
15330 1726882254.18123: Set connection var ansible_connection to ssh
15330 1726882254.18130: Set connection var ansible_shell_type to sh
15330 1726882254.18140: Set connection var ansible_shell_executable to /bin/sh
15330 1726882254.18149: Set connection var ansible_module_compression to ZIP_DEFLATED
15330 1726882254.18174: variable 'ansible_shell_executable' from source: unknown
15330 1726882254.18182: variable 'ansible_connection' from source: unknown
15330 1726882254.18190: variable 'ansible_module_compression' from source: unknown
15330 1726882254.18199: variable 'ansible_shell_type' from source: unknown
15330 1726882254.18210: variable 'ansible_shell_executable' from source: unknown
15330 1726882254.18217: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882254.18316: variable 'ansible_pipelining' from source: unknown
15330 1726882254.18319: variable 'ansible_timeout' from source: unknown
15330 1726882254.18321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882254.18410: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15330 1726882254.18430: variable 'omit' from source: magic vars
15330 1726882254.18440: starting attempt loop
15330 1726882254.18447: running the handler
15330 1726882254.18467: variable 'ansible_facts' from source: unknown
15330 1726882254.18492: _low_level_execute_command(): starting
15330 1726882254.18508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15330 1726882254.19236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882254.19310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882254.19367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882254.19387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882254.19417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882254.19508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882254.21724: stdout chunk (state=3): >>>/root <<< 15330 1726882254.21932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882254.21936: stdout chunk (state=3): >>><<< 15330 1726882254.21939: stderr chunk (state=3): >>><<< 15330 1726882254.21960: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882254.22066: _low_level_execute_command(): starting 15330 1726882254.22071: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739 `" && echo ansible-tmp-1726882254.2196796-15527-233414280224739="` echo /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739 `" ) && sleep 0' 15330 1726882254.22663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882254.22680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882254.22700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882254.22821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882254.22837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882254.22840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882254.22886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882254.25472: stdout chunk (state=3): >>>ansible-tmp-1726882254.2196796-15527-233414280224739=/root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739 <<< 15330 1726882254.25684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882254.25688: stdout chunk (state=3): >>><<< 15330 1726882254.25690: stderr chunk (state=3): >>><<< 15330 1726882254.25713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882254.2196796-15527-233414280224739=/root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882254.25748: variable 'ansible_module_compression' from source: unknown 15330 1726882254.25899: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882254.25903: variable 'ansible_facts' from source: unknown 15330 1726882254.26113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py 15330 1726882254.26385: Sending initial data 15330 1726882254.26388: Sent initial data (154 bytes) 15330 1726882254.26954: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882254.26974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882254.27056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882254.29204: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882254.29282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882254.29337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpa7hjxikq /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py <<< 15330 1726882254.29359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py" <<< 15330 1726882254.29388: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15330 1726882254.29391: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpa7hjxikq" to remote "/root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py" <<< 15330 1726882254.31035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882254.31049: stderr chunk (state=3): >>><<< 15330 1726882254.31062: stdout chunk (state=3): >>><<< 15330 1726882254.31100: done transferring module to remote 15330 1726882254.31107: _low_level_execute_command(): starting 15330 1726882254.31186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/ /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py && sleep 0' 15330 1726882254.31760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882254.31774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882254.31789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882254.31810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882254.31830: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882254.31931: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882254.31962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882254.32040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882254.34498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882254.34528: stderr chunk (state=3): >>><<< 15330 1726882254.34531: stdout chunk (state=3): >>><<< 15330 1726882254.34625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882254.34633: _low_level_execute_command(): starting 15330 1726882254.34636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/AnsiballZ_setup.py && sleep 0' 15330 1726882254.35232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882254.35308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882254.35347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882254.35370: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 15330 1726882254.35388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882254.35469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.14512: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 1.11767578125, "5m": 0.482421875, "15m": 0.22021484375}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "54", "epoch": "1726882254", "epoch_int": "1726882254", "date": "2024-09-20", "time": "21:30:54", "iso8601_micro": "2024-09-21T01:30:54.706835Z", "iso8601": "2024-09-21T01:30:54Z", "iso8601_basic": "20240920T213054706835", "iso8601_basic_short": "20240920T213054", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2985, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 546, "free": 2985}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805051904, "block_size": 4096, "block_total": 65519099, "block_available": 63917249, "block_used": 1601850, "inode_total": 131070960, "inode_available": 131029132, "inode_used": 41828, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": 
"6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882255.16900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.16904: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 15330 1726882255.16907: stdout chunk (state=3): >>><<< 15330 1726882255.16909: stderr chunk (state=3): >>><<< 15330 1726882255.16912: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 1.11767578125, "5m": 0.482421875, "15m": 0.22021484375}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": 
"54", "epoch": "1726882254", "epoch_int": "1726882254", "date": "2024-09-20", "time": "21:30:54", "iso8601_micro": "2024-09-21T01:30:54.706835Z", "iso8601": "2024-09-21T01:30:54Z", "iso8601_basic": "20240920T213054706835", "iso8601_basic_short": "20240920T213054", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2985, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 546, "free": 2985}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805051904, "block_size": 4096, "block_total": 65519099, "block_available": 63917249, "block_used": 1601850, "inode_total": 131070960, 
"inode_available": 131029132, "inode_used": 41828, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off 
[fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": 
"on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882255.17645: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882255.17650: _low_level_execute_command(): starting 15330 1726882255.17653: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882254.2196796-15527-233414280224739/ > /dev/null 2>&1 && sleep 0' 15330 1726882255.19120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882255.19124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882255.19244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.19249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.19327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.21201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.21213: stdout chunk (state=3): >>><<< 15330 1726882255.21236: stderr chunk (state=3): >>><<< 15330 1726882255.21264: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882255.21277: handler run complete 15330 1726882255.21420: variable 'ansible_facts' from source: unknown 15330 1726882255.21578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.21943: variable 'ansible_facts' from source: unknown 15330 1726882255.22048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.22185: attempt loop complete, returning result 15330 1726882255.22200: _execute() done 15330 1726882255.22208: dumping result to json 15330 1726882255.22247: done dumping result, returning 15330 1726882255.22265: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-0000000000da] 15330 1726882255.22274: sending task result for task 12673a56-9f93-e4fe-1358-0000000000da 15330 1726882255.22846: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000da 15330 1726882255.22849: WORKER PROCESS EXITING ok: [managed_node3] 15330 1726882255.23292: no more pending results, returning what we have 15330 1726882255.23305: results queue empty 15330 1726882255.23307: checking for any_errors_fatal 15330 1726882255.23308: done checking for any_errors_fatal 15330 1726882255.23309: checking for max_fail_percentage 15330 1726882255.23310: done checking for max_fail_percentage 15330 1726882255.23311: checking to see if all hosts have failed and the running result is not ok 15330 1726882255.23312: done checking to see if all hosts have failed 15330 1726882255.23313: getting the remaining hosts for this loop 15330 1726882255.23314: done getting the remaining hosts for this loop 15330 1726882255.23317: getting the next task for host managed_node3 15330 1726882255.23322: done getting next task for host managed_node3 15330 
1726882255.23324: ^ task is: TASK: meta (flush_handlers) 15330 1726882255.23325: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882255.23329: getting variables 15330 1726882255.23330: in VariableManager get_vars() 15330 1726882255.23352: Calling all_inventory to load vars for managed_node3 15330 1726882255.23354: Calling groups_inventory to load vars for managed_node3 15330 1726882255.23357: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.23367: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.23370: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.23373: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.23554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.23765: done with get_vars() 15330 1726882255.23775: done getting variables 15330 1726882255.23852: in VariableManager get_vars() 15330 1726882255.23862: Calling all_inventory to load vars for managed_node3 15330 1726882255.23864: Calling groups_inventory to load vars for managed_node3 15330 1726882255.23867: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.23871: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.23874: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.23876: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.24032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.24222: done with get_vars() 15330 1726882255.24235: done queuing things up, now waiting for results queue to 
drain 15330 1726882255.24237: results queue empty 15330 1726882255.24237: checking for any_errors_fatal 15330 1726882255.24240: done checking for any_errors_fatal 15330 1726882255.24241: checking for max_fail_percentage 15330 1726882255.24243: done checking for max_fail_percentage 15330 1726882255.24243: checking to see if all hosts have failed and the running result is not ok 15330 1726882255.24249: done checking to see if all hosts have failed 15330 1726882255.24250: getting the remaining hosts for this loop 15330 1726882255.24251: done getting the remaining hosts for this loop 15330 1726882255.24253: getting the next task for host managed_node3 15330 1726882255.24257: done getting next task for host managed_node3 15330 1726882255.24259: ^ task is: TASK: Set interface={{ interface }} 15330 1726882255.24260: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882255.24262: getting variables 15330 1726882255.24270: in VariableManager get_vars() 15330 1726882255.24278: Calling all_inventory to load vars for managed_node3 15330 1726882255.24280: Calling groups_inventory to load vars for managed_node3 15330 1726882255.24282: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.24287: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.24289: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.24292: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.24433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.24622: done with get_vars() 15330 1726882255.24630: done getting variables 15330 1726882255.24668: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882255.24806: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Friday 20 September 2024 21:30:55 -0400 (0:00:01.083) 0:00:04.454 ****** 15330 1726882255.24853: entering _queue_task() for managed_node3/set_fact 15330 1726882255.25272: worker is 1 (out of 1 available) 15330 1726882255.25283: exiting _queue_task() for managed_node3/set_fact 15330 1726882255.25299: done queuing things up, now waiting for results queue to drain 15330 1726882255.25301: waiting for pending results... 
15330 1726882255.25537: running TaskExecutor() for managed_node3/TASK: Set interface=LSR-TST-br31 15330 1726882255.25785: in run() - task 12673a56-9f93-e4fe-1358-00000000000b 15330 1726882255.25858: variable 'ansible_search_path' from source: unknown 15330 1726882255.25914: calling self._execute() 15330 1726882255.26052: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.26102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882255.26187: variable 'omit' from source: magic vars 15330 1726882255.27158: variable 'ansible_distribution_major_version' from source: facts 15330 1726882255.27302: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882255.27305: variable 'omit' from source: magic vars 15330 1726882255.27307: variable 'omit' from source: magic vars 15330 1726882255.27530: variable 'interface' from source: play vars 15330 1726882255.27737: variable 'interface' from source: play vars 15330 1726882255.27830: variable 'omit' from source: magic vars 15330 1726882255.28020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882255.28138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882255.28264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882255.28267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882255.28478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882255.28481: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882255.28484: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.28486: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 15330 1726882255.28718: Set connection var ansible_pipelining to False 15330 1726882255.28840: Set connection var ansible_timeout to 10 15330 1726882255.28843: Set connection var ansible_connection to ssh 15330 1726882255.28846: Set connection var ansible_shell_type to sh 15330 1726882255.28926: Set connection var ansible_shell_executable to /bin/sh 15330 1726882255.28930: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882255.28932: variable 'ansible_shell_executable' from source: unknown 15330 1726882255.28938: variable 'ansible_connection' from source: unknown 15330 1726882255.28942: variable 'ansible_module_compression' from source: unknown 15330 1726882255.28944: variable 'ansible_shell_type' from source: unknown 15330 1726882255.28946: variable 'ansible_shell_executable' from source: unknown 15330 1726882255.28948: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.28949: variable 'ansible_pipelining' from source: unknown 15330 1726882255.28951: variable 'ansible_timeout' from source: unknown 15330 1726882255.29082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882255.29665: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882255.29768: variable 'omit' from source: magic vars 15330 1726882255.29772: starting attempt loop 15330 1726882255.29774: running the handler 15330 1726882255.29779: handler run complete 15330 1726882255.29781: attempt loop complete, returning result 15330 1726882255.29783: _execute() done 15330 1726882255.29785: dumping result to json 15330 1726882255.29787: done dumping result, returning 15330 1726882255.29789: done running 
TaskExecutor() for managed_node3/TASK: Set interface=LSR-TST-br31 [12673a56-9f93-e4fe-1358-00000000000b] 15330 1726882255.29790: sending task result for task 12673a56-9f93-e4fe-1358-00000000000b ok: [managed_node3] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15330 1726882255.29949: no more pending results, returning what we have 15330 1726882255.29951: results queue empty 15330 1726882255.29952: checking for any_errors_fatal 15330 1726882255.29954: done checking for any_errors_fatal 15330 1726882255.29955: checking for max_fail_percentage 15330 1726882255.29957: done checking for max_fail_percentage 15330 1726882255.29958: checking to see if all hosts have failed and the running result is not ok 15330 1726882255.29958: done checking to see if all hosts have failed 15330 1726882255.29959: getting the remaining hosts for this loop 15330 1726882255.29960: done getting the remaining hosts for this loop 15330 1726882255.29963: getting the next task for host managed_node3 15330 1726882255.29978: done getting next task for host managed_node3 15330 1726882255.29982: ^ task is: TASK: Include the task 'show_interfaces.yml' 15330 1726882255.29984: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882255.29988: getting variables 15330 1726882255.29990: in VariableManager get_vars() 15330 1726882255.30024: Calling all_inventory to load vars for managed_node3 15330 1726882255.30027: Calling groups_inventory to load vars for managed_node3 15330 1726882255.30031: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.30198: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.30202: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.30304: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.31071: done sending task result for task 12673a56-9f93-e4fe-1358-00000000000b 15330 1726882255.31074: WORKER PROCESS EXITING 15330 1726882255.31139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.31551: done with get_vars() 15330 1726882255.31561: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Friday 20 September 2024 21:30:55 -0400 (0:00:00.068) 0:00:04.523 ****** 15330 1726882255.31746: entering _queue_task() for managed_node3/include_tasks 15330 1726882255.32329: worker is 1 (out of 1 available) 15330 1726882255.32341: exiting _queue_task() for managed_node3/include_tasks 15330 1726882255.32352: done queuing things up, now waiting for results queue to drain 15330 1726882255.32353: waiting for pending results... 
15330 1726882255.32635: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 15330 1726882255.32721: in run() - task 12673a56-9f93-e4fe-1358-00000000000c 15330 1726882255.32826: variable 'ansible_search_path' from source: unknown 15330 1726882255.32868: calling self._execute() 15330 1726882255.32968: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.32981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882255.32998: variable 'omit' from source: magic vars 15330 1726882255.33414: variable 'ansible_distribution_major_version' from source: facts 15330 1726882255.33490: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882255.33577: _execute() done 15330 1726882255.33598: dumping result to json 15330 1726882255.33602: done dumping result, returning 15330 1726882255.33629: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-e4fe-1358-00000000000c] 15330 1726882255.33636: sending task result for task 12673a56-9f93-e4fe-1358-00000000000c 15330 1726882255.33789: done sending task result for task 12673a56-9f93-e4fe-1358-00000000000c 15330 1726882255.33945: no more pending results, returning what we have 15330 1726882255.33951: in VariableManager get_vars() 15330 1726882255.33994: Calling all_inventory to load vars for managed_node3 15330 1726882255.33997: Calling groups_inventory to load vars for managed_node3 15330 1726882255.34082: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.34148: WORKER PROCESS EXITING 15330 1726882255.34164: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.34212: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.34218: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.34726: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.35177: done with get_vars() 15330 1726882255.35192: variable 'ansible_search_path' from source: unknown 15330 1726882255.35213: we have included files to process 15330 1726882255.35214: generating all_blocks data 15330 1726882255.35216: done generating all_blocks data 15330 1726882255.35217: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15330 1726882255.35218: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15330 1726882255.35221: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15330 1726882255.35560: in VariableManager get_vars() 15330 1726882255.35577: done with get_vars() 15330 1726882255.35798: done processing included file 15330 1726882255.35800: iterating over new_blocks loaded from include file 15330 1726882255.35802: in VariableManager get_vars() 15330 1726882255.35814: done with get_vars() 15330 1726882255.35826: filtering new block on tags 15330 1726882255.35846: done filtering new block on tags 15330 1726882255.35848: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 15330 1726882255.35854: extending task lists for all hosts with included blocks 15330 1726882255.35953: done extending task lists 15330 1726882255.35955: done processing included files 15330 1726882255.35956: results queue empty 15330 1726882255.35956: checking for any_errors_fatal 15330 1726882255.35964: done checking for any_errors_fatal 15330 1726882255.35965: checking for max_fail_percentage 15330 1726882255.35966: done checking for 
max_fail_percentage 15330 1726882255.35967: checking to see if all hosts have failed and the running result is not ok 15330 1726882255.35968: done checking to see if all hosts have failed 15330 1726882255.35988: getting the remaining hosts for this loop 15330 1726882255.35990: done getting the remaining hosts for this loop 15330 1726882255.36016: getting the next task for host managed_node3 15330 1726882255.36021: done getting next task for host managed_node3 15330 1726882255.36024: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15330 1726882255.36027: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882255.36030: getting variables 15330 1726882255.36031: in VariableManager get_vars() 15330 1726882255.36056: Calling all_inventory to load vars for managed_node3 15330 1726882255.36058: Calling groups_inventory to load vars for managed_node3 15330 1726882255.36061: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.36066: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.36068: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.36071: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.36313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.36519: done with get_vars() 15330 1726882255.36532: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:30:55 -0400 (0:00:00.048) 0:00:04.572 ****** 15330 1726882255.36604: entering _queue_task() for managed_node3/include_tasks 15330 1726882255.36939: worker is 1 (out of 1 available) 15330 1726882255.36951: exiting _queue_task() for managed_node3/include_tasks 15330 1726882255.36961: done queuing things up, now waiting for results queue to drain 15330 1726882255.36969: waiting for pending results... 
15330 1726882255.37133: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 15330 1726882255.37303: in run() - task 12673a56-9f93-e4fe-1358-0000000000ee 15330 1726882255.37308: variable 'ansible_search_path' from source: unknown 15330 1726882255.37316: variable 'ansible_search_path' from source: unknown 15330 1726882255.37324: calling self._execute() 15330 1726882255.37563: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.37566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882255.37569: variable 'omit' from source: magic vars 15330 1726882255.37971: variable 'ansible_distribution_major_version' from source: facts 15330 1726882255.37997: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882255.38023: _execute() done 15330 1726882255.38032: dumping result to json 15330 1726882255.38040: done dumping result, returning 15330 1726882255.38050: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-e4fe-1358-0000000000ee] 15330 1726882255.38058: sending task result for task 12673a56-9f93-e4fe-1358-0000000000ee 15330 1726882255.38181: no more pending results, returning what we have 15330 1726882255.38187: in VariableManager get_vars() 15330 1726882255.38226: Calling all_inventory to load vars for managed_node3 15330 1726882255.38228: Calling groups_inventory to load vars for managed_node3 15330 1726882255.38232: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.38246: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.38249: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.38252: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.38715: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000ee 15330 1726882255.38718: WORKER PROCESS EXITING 15330 
1726882255.38740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.38958: done with get_vars() 15330 1726882255.38966: variable 'ansible_search_path' from source: unknown 15330 1726882255.38967: variable 'ansible_search_path' from source: unknown 15330 1726882255.39022: we have included files to process 15330 1726882255.39024: generating all_blocks data 15330 1726882255.39025: done generating all_blocks data 15330 1726882255.39026: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15330 1726882255.39028: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15330 1726882255.39030: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15330 1726882255.39471: done processing included file 15330 1726882255.39472: iterating over new_blocks loaded from include file 15330 1726882255.39474: in VariableManager get_vars() 15330 1726882255.39484: done with get_vars() 15330 1726882255.39486: filtering new block on tags 15330 1726882255.39504: done filtering new block on tags 15330 1726882255.39507: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 15330 1726882255.39513: extending task lists for all hosts with included blocks 15330 1726882255.39607: done extending task lists 15330 1726882255.39609: done processing included files 15330 1726882255.39609: results queue empty 15330 1726882255.39610: checking for any_errors_fatal 15330 1726882255.39613: done checking for any_errors_fatal 15330 1726882255.39613: checking for max_fail_percentage 15330 1726882255.39614: done 
checking for max_fail_percentage 15330 1726882255.39615: checking to see if all hosts have failed and the running result is not ok 15330 1726882255.39616: done checking to see if all hosts have failed 15330 1726882255.39616: getting the remaining hosts for this loop 15330 1726882255.39618: done getting the remaining hosts for this loop 15330 1726882255.39620: getting the next task for host managed_node3 15330 1726882255.39624: done getting next task for host managed_node3 15330 1726882255.39626: ^ task is: TASK: Gather current interface info 15330 1726882255.39629: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882255.39631: getting variables 15330 1726882255.39631: in VariableManager get_vars() 15330 1726882255.39639: Calling all_inventory to load vars for managed_node3 15330 1726882255.39641: Calling groups_inventory to load vars for managed_node3 15330 1726882255.39643: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882255.39648: Calling all_plugins_play to load vars for managed_node3 15330 1726882255.39650: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882255.39653: Calling groups_plugins_play to load vars for managed_node3 15330 1726882255.39790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882255.40033: done with get_vars() 15330 1726882255.40044: done getting variables 15330 1726882255.40096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:30:55 -0400 (0:00:00.035) 0:00:04.607 ****** 15330 1726882255.40183: entering _queue_task() for managed_node3/command 15330 1726882255.40582: worker is 1 (out of 1 available) 15330 1726882255.40810: exiting _queue_task() for managed_node3/command 15330 1726882255.40824: done queuing things up, now waiting for results queue to drain 15330 1726882255.40826: waiting for pending results... 
15330 1726882255.40916: running TaskExecutor() for managed_node3/TASK: Gather current interface info 15330 1726882255.41112: in run() - task 12673a56-9f93-e4fe-1358-0000000000fd 15330 1726882255.41144: variable 'ansible_search_path' from source: unknown 15330 1726882255.41168: variable 'ansible_search_path' from source: unknown 15330 1726882255.41206: calling self._execute() 15330 1726882255.41351: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.41365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882255.41402: variable 'omit' from source: magic vars 15330 1726882255.41866: variable 'ansible_distribution_major_version' from source: facts 15330 1726882255.41904: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882255.42034: variable 'omit' from source: magic vars 15330 1726882255.42038: variable 'omit' from source: magic vars 15330 1726882255.42040: variable 'omit' from source: magic vars 15330 1726882255.42064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882255.42112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882255.42152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882255.42174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882255.42190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882255.42225: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882255.42234: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.42242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 
1726882255.42372: Set connection var ansible_pipelining to False 15330 1726882255.42405: Set connection var ansible_timeout to 10 15330 1726882255.42414: Set connection var ansible_connection to ssh 15330 1726882255.42421: Set connection var ansible_shell_type to sh 15330 1726882255.42436: Set connection var ansible_shell_executable to /bin/sh 15330 1726882255.42456: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882255.42506: variable 'ansible_shell_executable' from source: unknown 15330 1726882255.42516: variable 'ansible_connection' from source: unknown 15330 1726882255.42638: variable 'ansible_module_compression' from source: unknown 15330 1726882255.42645: variable 'ansible_shell_type' from source: unknown 15330 1726882255.42647: variable 'ansible_shell_executable' from source: unknown 15330 1726882255.42649: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882255.42651: variable 'ansible_pipelining' from source: unknown 15330 1726882255.42653: variable 'ansible_timeout' from source: unknown 15330 1726882255.42655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882255.43021: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882255.43026: variable 'omit' from source: magic vars 15330 1726882255.43091: starting attempt loop 15330 1726882255.43098: running the handler 15330 1726882255.43102: _low_level_execute_command(): starting 15330 1726882255.43104: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882255.44239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882255.44253: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15330 1726882255.44297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882255.44314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882255.44411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.44427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.44511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.46199: stdout chunk (state=3): >>>/root <<< 15330 1726882255.46398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.46416: stdout chunk (state=3): >>><<< 15330 1726882255.46462: stderr chunk (state=3): >>><<< 15330 1726882255.46496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882255.46678: _low_level_execute_command(): starting 15330 1726882255.46681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301 `" && echo ansible-tmp-1726882255.4650388-15584-39648630581301="` echo /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301 `" ) && sleep 0' 15330 1726882255.48001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.48055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.48160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.50013: stdout chunk (state=3): >>>ansible-tmp-1726882255.4650388-15584-39648630581301=/root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301 <<< 15330 1726882255.50607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.50610: stdout chunk (state=3): >>><<< 15330 1726882255.50613: stderr chunk (state=3): >>><<< 15330 1726882255.50616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882255.4650388-15584-39648630581301=/root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882255.50618: variable 'ansible_module_compression' from source: unknown 15330 1726882255.50620: ANSIBALLZ: Using generic lock for ansible.legacy.command 15330 1726882255.50622: ANSIBALLZ: Acquiring lock 15330 1726882255.50624: ANSIBALLZ: Lock acquired: 140238209361168 15330 1726882255.50626: ANSIBALLZ: Creating module 15330 1726882255.69789: ANSIBALLZ: Writing module into payload 15330 1726882255.69796: ANSIBALLZ: Writing module 15330 1726882255.69918: ANSIBALLZ: Renaming module 15330 1726882255.69930: ANSIBALLZ: Done creating module 15330 1726882255.69949: variable 'ansible_facts' from source: unknown 15330 1726882255.70138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py 15330 1726882255.70469: Sending initial data 15330 1726882255.70479: Sent initial data (155 bytes) 15330 1726882255.71770: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882255.71912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882255.72118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.72145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.72227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.73860: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15330 1726882255.73914: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882255.73957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpih8eyrbv /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py <<< 15330 1726882255.73961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py" <<< 15330 1726882255.74055: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpih8eyrbv" to remote "/root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py" <<< 15330 1726882255.75278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.75400: stderr chunk (state=3): >>><<< 15330 1726882255.75403: stdout chunk (state=3): >>><<< 15330 1726882255.75442: done transferring module to remote 15330 1726882255.75445: _low_level_execute_command(): starting 15330 1726882255.75448: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/ /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py && sleep 0' 15330 1726882255.76528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.76603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.76672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.78416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.78479: stderr chunk (state=3): >>><<< 15330 1726882255.78498: stdout chunk (state=3): >>><<< 15330 1726882255.78609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882255.78612: _low_level_execute_command(): starting 15330 1726882255.78615: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/AnsiballZ_command.py && sleep 0' 15330 1726882255.79318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882255.79378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882255.79487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.79529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.79641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882255.95502: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", 
"-1"], "start": "2024-09-20 21:30:55.945519", "end": "2024-09-20 21:30:55.948633", "delta": "0:00:00.003114", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15330 1726882255.96521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882255.96580: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 15330 1726882255.96592: stdout chunk (state=3): >>><<< 15330 1726882255.96614: stderr chunk (state=3): >>><<< 15330 1726882255.96638: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:55.945519", "end": "2024-09-20 21:30:55.948633", "delta": "0:00:00.003114", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882255.96682: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882255.96982: _low_level_execute_command(): starting 15330 1726882255.96986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882255.4650388-15584-39648630581301/ > /dev/null 2>&1 && sleep 0' 15330 1726882255.97988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882255.98202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882255.98312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882255.98443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.00381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.00389: stdout chunk (state=3): >>><<< 15330 1726882256.00409: stderr chunk (state=3): >>><<< 15330 1726882256.00431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.00490: handler run complete 15330 1726882256.00522: Evaluated conditional (False): False 15330 1726882256.00540: attempt loop complete, returning result 15330 1726882256.00573: _execute() done 15330 1726882256.00578: dumping result to json 15330 1726882256.00586: done dumping result, returning 15330 1726882256.00607: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-e4fe-1358-0000000000fd] 15330 1726882256.00630: sending task result for task 12673a56-9f93-e4fe-1358-0000000000fd 15330 1726882256.00923: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000fd 15330 1726882256.00926: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003114", "end": "2024-09-20 21:30:55.948633", "rc": 0, "start": "2024-09-20 21:30:55.945519" } STDOUT: bonding_masters eth0 lo 15330 1726882256.01068: no more pending results, returning what we have 15330 1726882256.01071: results queue empty 15330 1726882256.01072: checking for any_errors_fatal 15330 1726882256.01074: done checking for any_errors_fatal 15330 1726882256.01074: checking for max_fail_percentage 15330 1726882256.01076: done checking for max_fail_percentage 15330 1726882256.01077: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.01078: done checking to see if all hosts have failed 15330 1726882256.01078: getting the remaining hosts for this loop 15330 1726882256.01080: done getting the remaining hosts for this loop 15330 1726882256.01083: getting the next task for 
host managed_node3 15330 1726882256.01089: done getting next task for host managed_node3 15330 1726882256.01092: ^ task is: TASK: Set current_interfaces 15330 1726882256.01098: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.01101: getting variables 15330 1726882256.01104: in VariableManager get_vars() 15330 1726882256.01133: Calling all_inventory to load vars for managed_node3 15330 1726882256.01135: Calling groups_inventory to load vars for managed_node3 15330 1726882256.01139: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.01149: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.01152: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.01155: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.02157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.02633: done with get_vars() 15330 1726882256.02645: done getting variables 15330 1726882256.02906: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:30:56 -0400 (0:00:00.627) 0:00:05.235 ****** 15330 1726882256.02938: entering _queue_task() for managed_node3/set_fact 15330 1726882256.03827: worker is 1 (out of 1 available) 15330 1726882256.03837: exiting _queue_task() for managed_node3/set_fact 15330 1726882256.03848: done queuing things up, now waiting for results queue to drain 15330 1726882256.03850: waiting for pending results... 
15330 1726882256.04259: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 15330 1726882256.04527: in run() - task 12673a56-9f93-e4fe-1358-0000000000fe 15330 1726882256.04700: variable 'ansible_search_path' from source: unknown 15330 1726882256.04703: variable 'ansible_search_path' from source: unknown 15330 1726882256.04706: calling self._execute() 15330 1726882256.04709: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.04711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.04713: variable 'omit' from source: magic vars 15330 1726882256.05344: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.05413: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.05426: variable 'omit' from source: magic vars 15330 1726882256.05470: variable 'omit' from source: magic vars 15330 1726882256.05705: variable '_current_interfaces' from source: set_fact 15330 1726882256.05780: variable 'omit' from source: magic vars 15330 1726882256.05940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882256.05983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882256.06066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882256.06090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.06169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.06209: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882256.06268: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.06276: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.06503: Set connection var ansible_pipelining to False 15330 1726882256.06522: Set connection var ansible_timeout to 10 15330 1726882256.06529: Set connection var ansible_connection to ssh 15330 1726882256.06536: Set connection var ansible_shell_type to sh 15330 1726882256.06547: Set connection var ansible_shell_executable to /bin/sh 15330 1726882256.06556: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882256.06618: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.06801: variable 'ansible_connection' from source: unknown 15330 1726882256.06805: variable 'ansible_module_compression' from source: unknown 15330 1726882256.06807: variable 'ansible_shell_type' from source: unknown 15330 1726882256.06809: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.06810: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.06812: variable 'ansible_pipelining' from source: unknown 15330 1726882256.06814: variable 'ansible_timeout' from source: unknown 15330 1726882256.06815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.06999: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882256.07016: variable 'omit' from source: magic vars 15330 1726882256.07026: starting attempt loop 15330 1726882256.07033: running the handler 15330 1726882256.07150: handler run complete 15330 1726882256.07153: attempt loop complete, returning result 15330 1726882256.07155: _execute() done 15330 1726882256.07157: dumping result to json 15330 1726882256.07159: done dumping result, returning 15330 
1726882256.07162: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-e4fe-1358-0000000000fe] 15330 1726882256.07164: sending task result for task 12673a56-9f93-e4fe-1358-0000000000fe 15330 1726882256.07405: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000fe 15330 1726882256.07409: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15330 1726882256.07467: no more pending results, returning what we have 15330 1726882256.07471: results queue empty 15330 1726882256.07472: checking for any_errors_fatal 15330 1726882256.07481: done checking for any_errors_fatal 15330 1726882256.07481: checking for max_fail_percentage 15330 1726882256.07483: done checking for max_fail_percentage 15330 1726882256.07484: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.07484: done checking to see if all hosts have failed 15330 1726882256.07485: getting the remaining hosts for this loop 15330 1726882256.07486: done getting the remaining hosts for this loop 15330 1726882256.07490: getting the next task for host managed_node3 15330 1726882256.07499: done getting next task for host managed_node3 15330 1726882256.07502: ^ task is: TASK: Show current_interfaces 15330 1726882256.07505: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.07508: getting variables 15330 1726882256.07511: in VariableManager get_vars() 15330 1726882256.07538: Calling all_inventory to load vars for managed_node3 15330 1726882256.07540: Calling groups_inventory to load vars for managed_node3 15330 1726882256.07543: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.07554: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.07556: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.07559: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.08130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.08837: done with get_vars() 15330 1726882256.08848: done getting variables 15330 1726882256.09198: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:30:56 -0400 (0:00:00.063) 0:00:05.299 ****** 15330 1726882256.09328: entering _queue_task() for managed_node3/debug 15330 1726882256.09389: Creating lock for debug 15330 1726882256.10119: worker is 1 (out of 1 available) 15330 1726882256.10132: exiting _queue_task() for managed_node3/debug 15330 1726882256.10143: done queuing things up, now waiting for results queue to drain 15330 1726882256.10145: waiting for pending results... 
15330 1726882256.10438: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 15330 1726882256.10862: in run() - task 12673a56-9f93-e4fe-1358-0000000000ef 15330 1726882256.10866: variable 'ansible_search_path' from source: unknown 15330 1726882256.10869: variable 'ansible_search_path' from source: unknown 15330 1726882256.10872: calling self._execute() 15330 1726882256.11040: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.11043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.11055: variable 'omit' from source: magic vars 15330 1726882256.12201: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.12204: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.12207: variable 'omit' from source: magic vars 15330 1726882256.12303: variable 'omit' from source: magic vars 15330 1726882256.12603: variable 'current_interfaces' from source: set_fact 15330 1726882256.12607: variable 'omit' from source: magic vars 15330 1726882256.12726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882256.12763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882256.12785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882256.12910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.12964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.12977: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882256.12987: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.13033: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.13137: Set connection var ansible_pipelining to False 15330 1726882256.13241: Set connection var ansible_timeout to 10 15330 1726882256.13245: Set connection var ansible_connection to ssh 15330 1726882256.13247: Set connection var ansible_shell_type to sh 15330 1726882256.13249: Set connection var ansible_shell_executable to /bin/sh 15330 1726882256.13251: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882256.13268: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.13278: variable 'ansible_connection' from source: unknown 15330 1726882256.13285: variable 'ansible_module_compression' from source: unknown 15330 1726882256.13295: variable 'ansible_shell_type' from source: unknown 15330 1726882256.13303: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.13374: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.13378: variable 'ansible_pipelining' from source: unknown 15330 1726882256.13381: variable 'ansible_timeout' from source: unknown 15330 1726882256.13384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.13494: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882256.13510: variable 'omit' from source: magic vars 15330 1726882256.13519: starting attempt loop 15330 1726882256.13526: running the handler 15330 1726882256.13572: handler run complete 15330 1726882256.13600: attempt loop complete, returning result 15330 1726882256.13608: _execute() done 15330 1726882256.13615: dumping result to json 15330 1726882256.13624: done dumping result, returning 15330 1726882256.13702: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-e4fe-1358-0000000000ef] 15330 1726882256.13705: sending task result for task 12673a56-9f93-e4fe-1358-0000000000ef 15330 1726882256.13770: done sending task result for task 12673a56-9f93-e4fe-1358-0000000000ef 15330 1726882256.13775: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15330 1726882256.13826: no more pending results, returning what we have 15330 1726882256.13829: results queue empty 15330 1726882256.13830: checking for any_errors_fatal 15330 1726882256.13833: done checking for any_errors_fatal 15330 1726882256.13834: checking for max_fail_percentage 15330 1726882256.13835: done checking for max_fail_percentage 15330 1726882256.13836: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.13837: done checking to see if all hosts have failed 15330 1726882256.13838: getting the remaining hosts for this loop 15330 1726882256.13839: done getting the remaining hosts for this loop 15330 1726882256.13842: getting the next task for host managed_node3 15330 1726882256.13850: done getting next task for host managed_node3 15330 1726882256.13853: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15330 1726882256.13855: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.13859: getting variables 15330 1726882256.13861: in VariableManager get_vars() 15330 1726882256.13888: Calling all_inventory to load vars for managed_node3 15330 1726882256.13890: Calling groups_inventory to load vars for managed_node3 15330 1726882256.13897: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.13908: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.13910: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.13913: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.14189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.14389: done with get_vars() 15330 1726882256.14402: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Friday 20 September 2024 21:30:56 -0400 (0:00:00.051) 0:00:05.350 ****** 15330 1726882256.14498: entering _queue_task() for managed_node3/include_tasks 15330 1726882256.14763: worker is 1 (out of 1 available) 15330 1726882256.14776: exiting _queue_task() for managed_node3/include_tasks 15330 1726882256.14798: done queuing things up, now waiting for results queue to drain 15330 1726882256.14800: waiting for pending results... 
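The run now includes `assert_device_absent.yml`, whose contents are not shown in this log. A hypothetical sketch of the kind of absence check such a task file presumably performs against the gathered fact (the device name below is an invented placeholder, not taken from the log):

```python
# Hypothetical: assert_device_absent.yml is not shown in this log; this only
# illustrates checking a device name against the gathered interface list.
current_interfaces = ["bonding_masters", "eth0", "lo"]

def device_absent(device, interfaces):
    """True when the named device does not appear in the interface list."""
    return device not in interfaces

print(device_absent("testbridge0", current_interfaces))  # True: placeholder name not present
```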
15330 1726882256.15227: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' 15330 1726882256.15472: in run() - task 12673a56-9f93-e4fe-1358-00000000000d 15330 1726882256.15476: variable 'ansible_search_path' from source: unknown 15330 1726882256.15509: calling self._execute() 15330 1726882256.15810: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.15813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.15816: variable 'omit' from source: magic vars 15330 1726882256.16630: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.16633: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.16635: _execute() done 15330 1726882256.16637: dumping result to json 15330 1726882256.16639: done dumping result, returning 15330 1726882256.16642: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' [12673a56-9f93-e4fe-1358-00000000000d] 15330 1726882256.16645: sending task result for task 12673a56-9f93-e4fe-1358-00000000000d 15330 1726882256.17071: no more pending results, returning what we have 15330 1726882256.17077: in VariableManager get_vars() 15330 1726882256.17114: Calling all_inventory to load vars for managed_node3 15330 1726882256.17117: Calling groups_inventory to load vars for managed_node3 15330 1726882256.17121: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.17134: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.17137: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.17140: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.17904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.18248: done with get_vars() 15330 1726882256.18256: variable 'ansible_search_path' 
from source: unknown 15330 1726882256.18269: done sending task result for task 12673a56-9f93-e4fe-1358-00000000000d 15330 1726882256.18272: WORKER PROCESS EXITING 15330 1726882256.18278: we have included files to process 15330 1726882256.18279: generating all_blocks data 15330 1726882256.18281: done generating all_blocks data 15330 1726882256.18287: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15330 1726882256.18288: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15330 1726882256.18291: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15330 1726882256.18605: in VariableManager get_vars() 15330 1726882256.18621: done with get_vars() 15330 1726882256.18808: done processing included file 15330 1726882256.18810: iterating over new_blocks loaded from include file 15330 1726882256.18812: in VariableManager get_vars() 15330 1726882256.18824: done with get_vars() 15330 1726882256.18826: filtering new block on tags 15330 1726882256.18843: done filtering new block on tags 15330 1726882256.18845: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 15330 1726882256.18850: extending task lists for all hosts with included blocks 15330 1726882256.19023: done extending task lists 15330 1726882256.19025: done processing included files 15330 1726882256.19025: results queue empty 15330 1726882256.19026: checking for any_errors_fatal 15330 1726882256.19029: done checking for any_errors_fatal 15330 1726882256.19030: checking for max_fail_percentage 15330 1726882256.19031: done checking for max_fail_percentage 15330 1726882256.19031: checking to see 
if all hosts have failed and the running result is not ok 15330 1726882256.19032: done checking to see if all hosts have failed 15330 1726882256.19033: getting the remaining hosts for this loop 15330 1726882256.19034: done getting the remaining hosts for this loop 15330 1726882256.19036: getting the next task for host managed_node3 15330 1726882256.19040: done getting next task for host managed_node3 15330 1726882256.19042: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15330 1726882256.19045: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15330 1726882256.19047: getting variables
15330 1726882256.19048: in VariableManager get_vars()
15330 1726882256.19057: Calling all_inventory to load vars for managed_node3
15330 1726882256.19059: Calling groups_inventory to load vars for managed_node3
15330 1726882256.19061: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882256.19065: Calling all_plugins_play to load vars for managed_node3
15330 1726882256.19067: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882256.19070: Calling groups_plugins_play to load vars for managed_node3
15330 1726882256.19219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882256.19447: done with get_vars()
15330 1726882256.19455: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3
Friday 20 September 2024 21:30:56 -0400 (0:00:00.050) 0:00:05.401 ******
15330 1726882256.19581: entering _queue_task() for managed_node3/include_tasks
15330 1726882256.20041: worker is 1 (out of 1 available)
15330 1726882256.20049: exiting _queue_task() for managed_node3/include_tasks
15330 1726882256.20059: done queuing things up, now waiting for results queue to drain
15330 1726882256.20060: waiting for pending results...
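The recurring "entering _queue_task() ... worker is 1 (out of 1 available) ... waiting for results queue to drain" lines reflect a producer/worker queue: the strategy queues the task, a worker executes it, and the main loop drains the results. A toy single-worker model of that pattern (Ansible actually uses forked worker processes, not a thread; the task string below is illustrative):

```python
import queue
import threading

def worker(tasks: "queue.Queue", results: "queue.Queue") -> None:
    # Single worker, mirroring "worker is 1 (out of 1 available)".
    while True:
        task = tasks.get()
        if task is None:        # sentinel: shut the worker down
            tasks.task_done()
            break
        results.put(f"done: {task}")
        tasks.task_done()

tasks: "queue.Queue" = queue.Queue()
results: "queue.Queue" = queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()
tasks.put("include_tasks: get_interface_stat.yml")
tasks.join()                    # "waiting for results queue to drain"
tasks.put(None)
t.join()
outcome = results.get_nowait()
```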
15330 1726882256.20195: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 15330 1726882256.20312: in run() - task 12673a56-9f93-e4fe-1358-000000000119 15330 1726882256.20330: variable 'ansible_search_path' from source: unknown 15330 1726882256.20337: variable 'ansible_search_path' from source: unknown 15330 1726882256.20373: calling self._execute() 15330 1726882256.20463: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.20475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.20490: variable 'omit' from source: magic vars 15330 1726882256.20861: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.20877: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.20887: _execute() done 15330 1726882256.20896: dumping result to json 15330 1726882256.20905: done dumping result, returning 15330 1726882256.20915: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-e4fe-1358-000000000119] 15330 1726882256.20924: sending task result for task 12673a56-9f93-e4fe-1358-000000000119 15330 1726882256.21088: no more pending results, returning what we have 15330 1726882256.21095: in VariableManager get_vars() 15330 1726882256.21127: Calling all_inventory to load vars for managed_node3 15330 1726882256.21130: Calling groups_inventory to load vars for managed_node3 15330 1726882256.21133: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.21263: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.21267: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.21271: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.21567: done sending task result for task 12673a56-9f93-e4fe-1358-000000000119 15330 1726882256.21571: WORKER PROCESS EXITING 15330 
1726882256.21604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.21917: done with get_vars() 15330 1726882256.21924: variable 'ansible_search_path' from source: unknown 15330 1726882256.21933: variable 'ansible_search_path' from source: unknown 15330 1726882256.22068: we have included files to process 15330 1726882256.22070: generating all_blocks data 15330 1726882256.22071: done generating all_blocks data 15330 1726882256.22072: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882256.22073: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882256.22075: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882256.22335: done processing included file 15330 1726882256.22337: iterating over new_blocks loaded from include file 15330 1726882256.22338: in VariableManager get_vars() 15330 1726882256.22349: done with get_vars() 15330 1726882256.22350: filtering new block on tags 15330 1726882256.22362: done filtering new block on tags 15330 1726882256.22364: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 15330 1726882256.22369: extending task lists for all hosts with included blocks 15330 1726882256.22816: done extending task lists 15330 1726882256.22818: done processing included files 15330 1726882256.22819: results queue empty 15330 1726882256.22826: checking for any_errors_fatal 15330 1726882256.22832: done checking for any_errors_fatal 15330 1726882256.22833: checking for max_fail_percentage 15330 1726882256.22834: done checking for 
max_fail_percentage 15330 1726882256.22835: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.22835: done checking to see if all hosts have failed 15330 1726882256.22836: getting the remaining hosts for this loop 15330 1726882256.22837: done getting the remaining hosts for this loop 15330 1726882256.22839: getting the next task for host managed_node3 15330 1726882256.22843: done getting next task for host managed_node3 15330 1726882256.22845: ^ task is: TASK: Get stat for interface {{ interface }} 15330 1726882256.22848: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15330 1726882256.22850: getting variables
15330 1726882256.22850: in VariableManager get_vars()
15330 1726882256.22858: Calling all_inventory to load vars for managed_node3
15330 1726882256.22860: Calling groups_inventory to load vars for managed_node3
15330 1726882256.22862: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882256.22868: Calling all_plugins_play to load vars for managed_node3
15330 1726882256.22870: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882256.22873: Calling groups_plugins_play to load vars for managed_node3
15330 1726882256.23062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882256.23270: done with get_vars()
15330 1726882256.23278: done getting variables
15330 1726882256.23671: variable 'interface' from source: set_fact

TASK [Get stat for interface LSR-TST-br31] *************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:30:56 -0400 (0:00:00.042) 0:00:05.444 ******
15330 1726882256.23814: entering _queue_task() for managed_node3/stat
15330 1726882256.24366: worker is 1 (out of 1 available)
15330 1726882256.24378: exiting _queue_task() for managed_node3/stat
15330 1726882256.24389: done queuing things up, now waiting for results queue to drain
15330 1726882256.24390: waiting for pending results...
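Note how the task name in the play was the template `Get stat for interface {{ interface }}`, and the banner renders it with `interface` taken from `set_fact`. Ansible does this with Jinja2; a toy stand-in for that substitution step (not Jinja2, just a regex sketch):

```python
import re

def render(template: str, variables: dict) -> str:
    # Replace every {{ name }} with the variable's value, roughly as the
    # real Jinja2 templating does for the task banner above.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]), template)

banner = render("Get stat for interface {{ interface }}",
                {"interface": "LSR-TST-br31"})
```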
15330 1726882256.24811: running TaskExecutor() for managed_node3/TASK: Get stat for interface LSR-TST-br31 15330 1726882256.25109: in run() - task 12673a56-9f93-e4fe-1358-000000000133 15330 1726882256.25120: variable 'ansible_search_path' from source: unknown 15330 1726882256.25123: variable 'ansible_search_path' from source: unknown 15330 1726882256.25158: calling self._execute() 15330 1726882256.25337: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.25341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.25348: variable 'omit' from source: magic vars 15330 1726882256.26758: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.26765: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.26772: variable 'omit' from source: magic vars 15330 1726882256.27026: variable 'omit' from source: magic vars 15330 1726882256.27122: variable 'interface' from source: set_fact 15330 1726882256.27142: variable 'omit' from source: magic vars 15330 1726882256.27179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882256.27417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882256.27420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882256.27423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.27426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.27428: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882256.27526: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.27529: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.27872: Set connection var ansible_pipelining to False 15330 1726882256.27875: Set connection var ansible_timeout to 10 15330 1726882256.27878: Set connection var ansible_connection to ssh 15330 1726882256.27880: Set connection var ansible_shell_type to sh 15330 1726882256.27882: Set connection var ansible_shell_executable to /bin/sh 15330 1726882256.27884: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882256.27886: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.27889: variable 'ansible_connection' from source: unknown 15330 1726882256.27891: variable 'ansible_module_compression' from source: unknown 15330 1726882256.27895: variable 'ansible_shell_type' from source: unknown 15330 1726882256.27898: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.27900: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.27901: variable 'ansible_pipelining' from source: unknown 15330 1726882256.27903: variable 'ansible_timeout' from source: unknown 15330 1726882256.27905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.28271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882256.28280: variable 'omit' from source: magic vars 15330 1726882256.28290: starting attempt loop 15330 1726882256.28295: running the handler 15330 1726882256.28308: _low_level_execute_command(): starting 15330 1726882256.28316: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882256.29814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.29873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882256.30019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882256.30031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.30113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.31791: stdout chunk (state=3): >>>/root <<< 15330 1726882256.31924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.31930: stdout chunk (state=3): >>><<< 15330 1726882256.31938: stderr chunk (state=3): >>><<< 15330 1726882256.31960: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.31974: _low_level_execute_command(): starting 15330 1726882256.31980: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448 `" && echo ansible-tmp-1726882256.319603-15630-143908062079448="` echo /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448 `" ) && sleep 0' 15330 1726882256.33032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882256.33300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882256.33513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.33588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.35582: stdout chunk (state=3): >>>ansible-tmp-1726882256.319603-15630-143908062079448=/root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448 <<< 15330 1726882256.35609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.35634: stderr chunk (state=3): >>><<< 15330 1726882256.35640: stdout chunk (state=3): >>><<< 15330 1726882256.35667: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882256.319603-15630-143908062079448=/root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.35719: variable 'ansible_module_compression' from source: unknown 15330 1726882256.35774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15330 1726882256.35926: variable 'ansible_facts' from source: unknown 15330 1726882256.36111: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py 15330 1726882256.36418: Sending initial data 15330 1726882256.36422: Sent initial data (152 bytes) 15330 1726882256.37830: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.37899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' <<< 15330 1726882256.37971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882256.37987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.38123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.39661: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15330 1726882256.39665: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882256.39750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882256.39862: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpw1pbg_wq /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py <<< 15330 1726882256.39867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py" <<< 15330 1726882256.39870: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpw1pbg_wq" to remote "/root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py" <<< 15330 1726882256.40690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.40753: stderr chunk (state=3): >>><<< 15330 1726882256.40761: stdout chunk (state=3): >>><<< 15330 1726882256.40832: done transferring module to remote 15330 1726882256.40919: _low_level_execute_command(): starting 15330 1726882256.40924: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/ /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py && sleep 0' 15330 1726882256.41508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882256.41520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882256.41603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.41644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882256.41660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882256.41679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.41748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.43538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.43876: stdout chunk (state=3): >>><<< 15330 1726882256.43880: stderr chunk (state=3): >>><<< 15330 1726882256.43884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.43890: _low_level_execute_command(): starting 15330 1726882256.43901: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/AnsiballZ_stat.py && sleep 0' 15330 1726882256.44823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882256.44925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882256.44928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882256.44998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882256.45002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882256.45005: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882256.45007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.45010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882256.45012: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882256.45014: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882256.45016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882256.45018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882256.45020: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882256.45210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.45310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882256.45346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882256.45349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.45447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.60910: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15330 1726882256.62647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882256.62651: stdout chunk (state=3): >>><<< 15330 1726882256.62653: stderr chunk (state=3): >>><<< 15330 1726882256.62656: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
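The module result above (`{"stat": {"exists": false}}` for `/sys/class/net/LSR-TST-br31`) comes down to a single filesystem check on the managed host: the kernel exposes every network device as an entry under `/sys/class/net/`. A minimal sketch of the same test (a stand-in, not the stat module's actual code; the fake sysfs root below is hypothetical):

```python
import os

def interface_present(name: str, sysfs: str = "/sys/class/net") -> bool:
    # A network interface exists iff its sysfs entry does.
    return os.path.exists(os.path.join(sysfs, name))

# With the bridge deleted, the check matches the log's {"exists": false}.
# (A nonexistent sysfs root stands in for the remote host here.)
absent = not interface_present("LSR-TST-br31", sysfs="/definitely/not/sysfs")
```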
15330 1726882256.62660: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882256.62663: _low_level_execute_command(): starting 15330 1726882256.62666: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882256.319603-15630-143908062079448/ > /dev/null 2>&1 && sleep 0' 15330 1726882256.64070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882256.64313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.64400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.66267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.66283: stdout chunk (state=3): >>><<< 15330 1726882256.66299: stderr chunk (state=3): >>><<< 15330 1726882256.66329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.66347: handler run complete 15330 1726882256.66419: attempt loop complete, returning result 15330 1726882256.66429: _execute() done 15330 
1726882256.66436: dumping result to json 15330 1726882256.66446: done dumping result, returning 15330 1726882256.66515: done running TaskExecutor() for managed_node3/TASK: Get stat for interface LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000133] 15330 1726882256.66525: sending task result for task 12673a56-9f93-e4fe-1358-000000000133 15330 1726882256.67829: done sending task result for task 12673a56-9f93-e4fe-1358-000000000133 15330 1726882256.67832: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15330 1726882256.67960: no more pending results, returning what we have 15330 1726882256.67964: results queue empty 15330 1726882256.67965: checking for any_errors_fatal 15330 1726882256.67987: done checking for any_errors_fatal 15330 1726882256.67988: checking for max_fail_percentage 15330 1726882256.67990: done checking for max_fail_percentage 15330 1726882256.67991: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.67992: done checking to see if all hosts have failed 15330 1726882256.67995: getting the remaining hosts for this loop 15330 1726882256.67996: done getting the remaining hosts for this loop 15330 1726882256.67999: getting the next task for host managed_node3 15330 1726882256.68006: done getting next task for host managed_node3 15330 1726882256.68008: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15330 1726882256.68011: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15330 1726882256.68014: getting variables 15330 1726882256.68015: in VariableManager get_vars() 15330 1726882256.68044: Calling all_inventory to load vars for managed_node3 15330 1726882256.68047: Calling groups_inventory to load vars for managed_node3 15330 1726882256.68050: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.68060: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.68066: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.68070: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.68888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.69434: done with get_vars() 15330 1726882256.69446: done getting variables 15330 1726882256.69872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15330 1726882256.70502: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:30:56 -0400 (0:00:00.467) 0:00:05.911 ****** 15330 1726882256.70534: entering _queue_task() for managed_node3/assert 15330 1726882256.70536: Creating lock for assert 15330 1726882256.71288: worker is 1 (out of 1 available) 15330 1726882256.71305: exiting _queue_task() for managed_node3/assert 15330 1726882256.71316: done queuing things up, now waiting for results queue to drain 15330 1726882256.71317: waiting for pending results... 
15330 1726882256.72230: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15330 1726882256.72652: in run() - task 12673a56-9f93-e4fe-1358-00000000011a 15330 1726882256.72674: variable 'ansible_search_path' from source: unknown 15330 1726882256.72678: variable 'ansible_search_path' from source: unknown 15330 1726882256.72808: calling self._execute() 15330 1726882256.73021: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.73024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.73037: variable 'omit' from source: magic vars 15330 1726882256.73982: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.73996: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.74087: variable 'omit' from source: magic vars 15330 1726882256.74269: variable 'omit' from source: magic vars 15330 1726882256.74486: variable 'interface' from source: set_fact 15330 1726882256.74507: variable 'omit' from source: magic vars 15330 1726882256.74545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882256.74700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882256.74718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882256.74736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.74747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.74775: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882256.74778: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.74781: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.74988: Set connection var ansible_pipelining to False 15330 1726882256.75002: Set connection var ansible_timeout to 10 15330 1726882256.75006: Set connection var ansible_connection to ssh 15330 1726882256.75105: Set connection var ansible_shell_type to sh 15330 1726882256.75111: Set connection var ansible_shell_executable to /bin/sh 15330 1726882256.75123: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882256.75145: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.75148: variable 'ansible_connection' from source: unknown 15330 1726882256.75151: variable 'ansible_module_compression' from source: unknown 15330 1726882256.75153: variable 'ansible_shell_type' from source: unknown 15330 1726882256.75155: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.75157: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.75162: variable 'ansible_pipelining' from source: unknown 15330 1726882256.75164: variable 'ansible_timeout' from source: unknown 15330 1726882256.75169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.75441: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882256.75512: variable 'omit' from source: magic vars 15330 1726882256.75515: starting attempt loop 15330 1726882256.75518: running the handler 15330 1726882256.75900: variable 'interface_stat' from source: set_fact 15330 1726882256.75909: Evaluated conditional (not interface_stat.stat.exists): True 15330 1726882256.75915: handler run complete 15330 1726882256.75974: attempt loop complete, returning result 
15330 1726882256.75976: _execute() done 15330 1726882256.75979: dumping result to json 15330 1726882256.75982: done dumping result, returning 15330 1726882256.75999: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' [12673a56-9f93-e4fe-1358-00000000011a] 15330 1726882256.76003: sending task result for task 12673a56-9f93-e4fe-1358-00000000011a 15330 1726882256.76364: done sending task result for task 12673a56-9f93-e4fe-1358-00000000011a 15330 1726882256.76367: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882256.76424: no more pending results, returning what we have 15330 1726882256.76428: results queue empty 15330 1726882256.76429: checking for any_errors_fatal 15330 1726882256.76440: done checking for any_errors_fatal 15330 1726882256.76441: checking for max_fail_percentage 15330 1726882256.76442: done checking for max_fail_percentage 15330 1726882256.76443: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.76444: done checking to see if all hosts have failed 15330 1726882256.76445: getting the remaining hosts for this loop 15330 1726882256.76446: done getting the remaining hosts for this loop 15330 1726882256.76450: getting the next task for host managed_node3 15330 1726882256.76459: done getting next task for host managed_node3 15330 1726882256.76461: ^ task is: TASK: meta (flush_handlers) 15330 1726882256.76463: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.76468: getting variables 15330 1726882256.76470: in VariableManager get_vars() 15330 1726882256.76609: Calling all_inventory to load vars for managed_node3 15330 1726882256.76612: Calling groups_inventory to load vars for managed_node3 15330 1726882256.76616: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.76627: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.76630: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.76633: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.77322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.77861: done with get_vars() 15330 1726882256.77872: done getting variables 15330 1726882256.77942: in VariableManager get_vars() 15330 1726882256.77952: Calling all_inventory to load vars for managed_node3 15330 1726882256.77954: Calling groups_inventory to load vars for managed_node3 15330 1726882256.77957: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.77961: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.77963: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.77966: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.78298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.78487: done with get_vars() 15330 1726882256.78710: done queuing things up, now waiting for results queue to drain 15330 1726882256.78712: results queue empty 15330 1726882256.78713: checking for any_errors_fatal 15330 1726882256.78715: done checking for any_errors_fatal 15330 1726882256.78716: checking for max_fail_percentage 15330 1726882256.78717: done checking for max_fail_percentage 15330 1726882256.78718: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882256.78718: done checking to see if all hosts have failed 15330 1726882256.78723: getting the remaining hosts for this loop 15330 1726882256.78724: done getting the remaining hosts for this loop 15330 1726882256.78727: getting the next task for host managed_node3 15330 1726882256.78731: done getting next task for host managed_node3 15330 1726882256.78732: ^ task is: TASK: meta (flush_handlers) 15330 1726882256.78733: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882256.78736: getting variables 15330 1726882256.78737: in VariableManager get_vars() 15330 1726882256.78745: Calling all_inventory to load vars for managed_node3 15330 1726882256.78747: Calling groups_inventory to load vars for managed_node3 15330 1726882256.78749: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.78754: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.78756: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.78759: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.79117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.79559: done with get_vars() 15330 1726882256.79567: done getting variables 15330 1726882256.79744: in VariableManager get_vars() 15330 1726882256.79753: Calling all_inventory to load vars for managed_node3 15330 1726882256.79755: Calling groups_inventory to load vars for managed_node3 15330 1726882256.79758: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.79762: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.79764: Calling groups_plugins_inventory to load vars for 
managed_node3 15330 1726882256.79767: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.79982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.80291: done with get_vars() 15330 1726882256.80307: done queuing things up, now waiting for results queue to drain 15330 1726882256.80309: results queue empty 15330 1726882256.80310: checking for any_errors_fatal 15330 1726882256.80311: done checking for any_errors_fatal 15330 1726882256.80312: checking for max_fail_percentage 15330 1726882256.80313: done checking for max_fail_percentage 15330 1726882256.80313: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.80314: done checking to see if all hosts have failed 15330 1726882256.80315: getting the remaining hosts for this loop 15330 1726882256.80316: done getting the remaining hosts for this loop 15330 1726882256.80318: getting the next task for host managed_node3 15330 1726882256.80321: done getting next task for host managed_node3 15330 1726882256.80322: ^ task is: None 15330 1726882256.80324: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.80325: done queuing things up, now waiting for results queue to drain 15330 1726882256.80326: results queue empty 15330 1726882256.80326: checking for any_errors_fatal 15330 1726882256.80327: done checking for any_errors_fatal 15330 1726882256.80327: checking for max_fail_percentage 15330 1726882256.80328: done checking for max_fail_percentage 15330 1726882256.80329: checking to see if all hosts have failed and the running result is not ok 15330 1726882256.80329: done checking to see if all hosts have failed 15330 1726882256.80331: getting the next task for host managed_node3 15330 1726882256.80334: done getting next task for host managed_node3 15330 1726882256.80335: ^ task is: None 15330 1726882256.80336: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.80610: in VariableManager get_vars() 15330 1726882256.80635: done with get_vars() 15330 1726882256.80641: in VariableManager get_vars() 15330 1726882256.80654: done with get_vars() 15330 1726882256.80659: variable 'omit' from source: magic vars 15330 1726882256.80688: in VariableManager get_vars() 15330 1726882256.80808: done with get_vars() 15330 1726882256.80836: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 15330 1726882256.82514: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882256.82609: getting the remaining hosts for this loop 15330 1726882256.82611: done getting the remaining hosts for this loop 15330 1726882256.82614: getting the next task for host managed_node3 15330 1726882256.82616: done getting next task for host managed_node3 15330 1726882256.82619: ^ task is: TASK: Gathering Facts 15330 1726882256.82620: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882256.82622: getting variables 15330 1726882256.82623: in VariableManager get_vars() 15330 1726882256.82635: Calling all_inventory to load vars for managed_node3 15330 1726882256.82638: Calling groups_inventory to load vars for managed_node3 15330 1726882256.82640: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882256.82645: Calling all_plugins_play to load vars for managed_node3 15330 1726882256.82647: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882256.82650: Calling groups_plugins_play to load vars for managed_node3 15330 1726882256.82790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882256.83281: done with get_vars() 15330 1726882256.83290: done getting variables 15330 1726882256.83538: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Friday 20 September 2024 21:30:56 -0400 (0:00:00.130) 0:00:06.041 ****** 15330 1726882256.83563: entering _queue_task() for managed_node3/gather_facts 15330 1726882256.83979: worker is 1 (out of 1 available) 15330 1726882256.83991: exiting _queue_task() for managed_node3/gather_facts 15330 1726882256.84507: done queuing things up, now waiting for results queue to drain 15330 1726882256.84509: waiting for pending results... 
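The `PLAY [Add test bridge]` banner above, followed by the queued `Gathering Facts` task (task path `tests_bridge.yml:17`), is what an explicit fact-gathering play produces. A minimal sketch of such a play head, under the assumption that it targets the inventory hosts and gathers facts implicitly — the real `tests_bridge.yml` may differ:

```yaml
# Hypothetical play head; only the play name and the fact-gathering step
# are attested by the log above.
- name: Add test bridge
  hosts: all
  gather_facts: true   # produces the implicit "Gathering Facts" TASK in the log
  tasks:
    - name: Placeholder for the bridge configuration tasks that follow
      debug:
        msg: "facts gathered for {{ inventory_hostname }}"
```

With `gather_facts: true`, Ansible runs the `setup` module first, which is why the subsequent log entries show `AnsiballZ_setup.py` being transferred to the remote temporary directory.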
15330 1726882256.84554: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882256.84903: in run() - task 12673a56-9f93-e4fe-1358-00000000014c 15330 1726882256.84907: variable 'ansible_search_path' from source: unknown 15330 1726882256.84909: calling self._execute() 15330 1726882256.84974: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.85227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.85231: variable 'omit' from source: magic vars 15330 1726882256.85752: variable 'ansible_distribution_major_version' from source: facts 15330 1726882256.85890: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882256.85906: variable 'omit' from source: magic vars 15330 1726882256.85933: variable 'omit' from source: magic vars 15330 1726882256.85972: variable 'omit' from source: magic vars 15330 1726882256.86135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882256.86174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882256.86502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882256.86506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.86508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882256.86511: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882256.86513: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.86515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.86590: Set connection var ansible_pipelining to False 15330 1726882256.86617: Set 
connection var ansible_timeout to 10 15330 1726882256.86625: Set connection var ansible_connection to ssh 15330 1726882256.86632: Set connection var ansible_shell_type to sh 15330 1726882256.86726: Set connection var ansible_shell_executable to /bin/sh 15330 1726882256.86737: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882256.86763: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.86771: variable 'ansible_connection' from source: unknown 15330 1726882256.86778: variable 'ansible_module_compression' from source: unknown 15330 1726882256.86786: variable 'ansible_shell_type' from source: unknown 15330 1726882256.86792: variable 'ansible_shell_executable' from source: unknown 15330 1726882256.86804: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882256.86811: variable 'ansible_pipelining' from source: unknown 15330 1726882256.86818: variable 'ansible_timeout' from source: unknown 15330 1726882256.86831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882256.87186: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882256.87478: variable 'omit' from source: magic vars 15330 1726882256.87481: starting attempt loop 15330 1726882256.87483: running the handler 15330 1726882256.87484: variable 'ansible_facts' from source: unknown 15330 1726882256.87486: _low_level_execute_command(): starting 15330 1726882256.87488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882256.88938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882256.88941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882256.88944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.88947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882256.88949: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882256.88952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.89119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882256.89226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.89461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.91228: stdout chunk (state=3): >>>/root <<< 15330 1726882256.91288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.91305: stdout chunk (state=3): >>><<< 15330 1726882256.91318: stderr chunk (state=3): >>><<< 15330 1726882256.91622: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.91626: _low_level_execute_command(): starting 15330 1726882256.91629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065 `" && echo ansible-tmp-1726882256.9152482-15655-65169084540065="` echo /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065 `" ) && sleep 0' 15330 1726882256.93656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.93921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882256.93944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882256.94027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882256.96004: stdout chunk (state=3): >>>ansible-tmp-1726882256.9152482-15655-65169084540065=/root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065 <<< 15330 1726882256.96016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882256.96121: stderr chunk (state=3): >>><<< 15330 1726882256.96139: stdout chunk (state=3): >>><<< 15330 1726882256.96171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882256.9152482-15655-65169084540065=/root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882256.96539: variable 'ansible_module_compression' from source: unknown 15330 1726882256.96542: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882256.96689: variable 'ansible_facts' from source: unknown 15330 1726882256.97392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py 15330 1726882256.98201: Sending initial data 15330 1726882256.98205: Sent initial data (153 bytes) 15330 1726882256.99926: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882256.99943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882256.99968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882257.00079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882257.01833: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882257.02084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882257.02134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpqd6zf73t /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py <<< 15330 1726882257.02137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py" <<< 15330 1726882257.02176: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpqd6zf73t" to remote "/root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py" <<< 15330 1726882257.04714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882257.04787: stderr chunk (state=3): >>><<< 15330 1726882257.04790: stdout chunk (state=3): >>><<< 15330 1726882257.04819: done transferring module to remote 15330 1726882257.04836: _low_level_execute_command(): starting 15330 1726882257.04846: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/ /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py && sleep 0' 15330 1726882257.06168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882257.06213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882257.06355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882257.06413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882257.06471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882257.06499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882257.06700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882257.08419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882257.08529: stderr chunk (state=3): >>><<< 15330 1726882257.08532: stdout chunk (state=3): >>><<< 15330 1726882257.08700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882257.08704: _low_level_execute_command(): starting 15330 1726882257.08706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/AnsiballZ_setup.py && sleep 0' 15330 1726882257.09854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882257.09996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882257.10097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882257.10218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882257.10298: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15330 1726882257.73381: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansib<<< 15330 1726882257.73388: stdout chunk (state=3): >>>le_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2983, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 548, "free": 2983}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 564, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805232128, "block_size": 4096, "block_total": 65519099, "block_available": 63917293, "block_used": 1601806, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "57", "epoch": "1726882257", "epoch_int": "1726882257", "date": "2024-09-20", "time": "21:30:57", "iso8601_micro": "2024-09-21T01:30:57.694375Z", "iso8601": "2024-09-21T01:30:57Z", "iso8601_basic": "20240920T213057694375", "iso8601_basic_short": "20240920T213057", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_loadavg": {"1m": 1.11767578125, "5m": 0.482421875, "15m": 0.22021484375}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882257.75702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882257.75706: stdout chunk (state=3): >>><<< 15330 1726882257.75708: stderr chunk (state=3): >>><<< 15330 1726882257.75712: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": 
"Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2983, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 548, "free": 2983}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 564, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805232128, "block_size": 4096, "block_total": 65519099, "block_available": 63917293, "block_used": 1601806, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", 
"hour": "21", "minute": "30", "second": "57", "epoch": "1726882257", "epoch_int": "1726882257", "date": "2024-09-20", "time": "21:30:57", "iso8601_micro": "2024-09-21T01:30:57.694375Z", "iso8601": "2024-09-21T01:30:57Z", "iso8601_basic": "20240920T213057694375", "iso8601_basic_short": "20240920T213057", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_loadavg": {"1m": 1.11767578125, "5m": 0.482421875, "15m": 0.22021484375}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", 
"prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882257.76302: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882257.76390: _low_level_execute_command(): starting 15330 1726882257.76405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882256.9152482-15655-65169084540065/ > /dev/null 2>&1 && sleep 0' 15330 1726882257.77851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882257.77855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found <<< 15330 1726882257.77858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882257.77860: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882257.77862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882257.77958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882257.77973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882257.78090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882257.78145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882257.79942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882257.79977: stderr chunk (state=3): >>><<< 15330 1726882257.80009: stdout chunk (state=3): >>><<< 15330 1726882257.80026: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882257.80185: handler run complete 15330 1726882257.80330: variable 'ansible_facts' from source: unknown 15330 1726882257.80547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.81459: variable 'ansible_facts' from source: unknown 15330 1726882257.81462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.81840: attempt loop complete, returning result 15330 1726882257.81850: _execute() done 15330 1726882257.81857: dumping result to json 15330 1726882257.81891: done dumping result, returning 15330 1726882257.81961: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-00000000014c] 15330 1726882257.81970: sending task result for task 12673a56-9f93-e4fe-1358-00000000014c 15330 1726882257.82907: done sending task result for task 12673a56-9f93-e4fe-1358-00000000014c 15330 1726882257.82910: WORKER PROCESS EXITING ok: [managed_node3] 15330 1726882257.83262: no more pending results, returning what we have 15330 1726882257.83264: results queue empty 15330 1726882257.83265: checking for any_errors_fatal 15330 1726882257.83266: done checking for any_errors_fatal 15330 1726882257.83267: 
checking for max_fail_percentage 15330 1726882257.83268: done checking for max_fail_percentage 15330 1726882257.83269: checking to see if all hosts have failed and the running result is not ok 15330 1726882257.83270: done checking to see if all hosts have failed 15330 1726882257.83270: getting the remaining hosts for this loop 15330 1726882257.83271: done getting the remaining hosts for this loop 15330 1726882257.83275: getting the next task for host managed_node3 15330 1726882257.83279: done getting next task for host managed_node3 15330 1726882257.83280: ^ task is: TASK: meta (flush_handlers) 15330 1726882257.83282: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882257.83285: getting variables 15330 1726882257.83286: in VariableManager get_vars() 15330 1726882257.83467: Calling all_inventory to load vars for managed_node3 15330 1726882257.83470: Calling groups_inventory to load vars for managed_node3 15330 1726882257.83473: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882257.83482: Calling all_plugins_play to load vars for managed_node3 15330 1726882257.83485: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882257.83488: Calling groups_plugins_play to load vars for managed_node3 15330 1726882257.84112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.84470: done with get_vars() 15330 1726882257.84480: done getting variables 15330 1726882257.84702: in VariableManager get_vars() 15330 1726882257.84712: Calling all_inventory to load vars for managed_node3 15330 1726882257.84714: Calling groups_inventory to load vars for managed_node3 15330 1726882257.84716: Calling 
all_plugins_inventory to load vars for managed_node3 15330 1726882257.84719: Calling all_plugins_play to load vars for managed_node3 15330 1726882257.84721: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882257.84723: Calling groups_plugins_play to load vars for managed_node3 15330 1726882257.85039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.85480: done with get_vars() 15330 1726882257.85528: done queuing things up, now waiting for results queue to drain 15330 1726882257.85531: results queue empty 15330 1726882257.85532: checking for any_errors_fatal 15330 1726882257.85535: done checking for any_errors_fatal 15330 1726882257.85539: checking for max_fail_percentage 15330 1726882257.85541: done checking for max_fail_percentage 15330 1726882257.85541: checking to see if all hosts have failed and the running result is not ok 15330 1726882257.85542: done checking to see if all hosts have failed 15330 1726882257.85543: getting the remaining hosts for this loop 15330 1726882257.85544: done getting the remaining hosts for this loop 15330 1726882257.85546: getting the next task for host managed_node3 15330 1726882257.85550: done getting next task for host managed_node3 15330 1726882257.85553: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15330 1726882257.85555: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882257.85564: getting variables 15330 1726882257.85565: in VariableManager get_vars() 15330 1726882257.85578: Calling all_inventory to load vars for managed_node3 15330 1726882257.85580: Calling groups_inventory to load vars for managed_node3 15330 1726882257.85582: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882257.85586: Calling all_plugins_play to load vars for managed_node3 15330 1726882257.85588: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882257.85591: Calling groups_plugins_play to load vars for managed_node3 15330 1726882257.85999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.86365: done with get_vars() 15330 1726882257.86373: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:30:57 -0400 (0:00:01.029) 0:00:07.071 ****** 15330 1726882257.86508: entering _queue_task() for managed_node3/include_tasks 15330 1726882257.86817: worker is 1 (out of 1 available) 15330 1726882257.86832: exiting _queue_task() for managed_node3/include_tasks 15330 1726882257.86845: done queuing things up, now waiting for results queue to drain 15330 1726882257.86846: waiting for pending results... 
15330 1726882257.87609: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15330 1726882257.87614: in run() - task 12673a56-9f93-e4fe-1358-000000000014 15330 1726882257.87617: variable 'ansible_search_path' from source: unknown 15330 1726882257.87620: variable 'ansible_search_path' from source: unknown 15330 1726882257.87624: calling self._execute() 15330 1726882257.87690: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882257.87712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882257.87734: variable 'omit' from source: magic vars 15330 1726882257.88087: variable 'ansible_distribution_major_version' from source: facts 15330 1726882257.88105: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882257.88115: _execute() done 15330 1726882257.88122: dumping result to json 15330 1726882257.88134: done dumping result, returning 15330 1726882257.88145: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-e4fe-1358-000000000014] 15330 1726882257.88153: sending task result for task 12673a56-9f93-e4fe-1358-000000000014 15330 1726882257.88417: done sending task result for task 12673a56-9f93-e4fe-1358-000000000014 15330 1726882257.88421: WORKER PROCESS EXITING 15330 1726882257.88457: no more pending results, returning what we have 15330 1726882257.88462: in VariableManager get_vars() 15330 1726882257.88505: Calling all_inventory to load vars for managed_node3 15330 1726882257.88507: Calling groups_inventory to load vars for managed_node3 15330 1726882257.88510: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882257.88521: Calling all_plugins_play to load vars for managed_node3 15330 1726882257.88524: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882257.88527: Calling 
groups_plugins_play to load vars for managed_node3 15330 1726882257.88763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.88929: done with get_vars() 15330 1726882257.88938: variable 'ansible_search_path' from source: unknown 15330 1726882257.88939: variable 'ansible_search_path' from source: unknown 15330 1726882257.88968: we have included files to process 15330 1726882257.88970: generating all_blocks data 15330 1726882257.88971: done generating all_blocks data 15330 1726882257.88972: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882257.88973: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882257.88975: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882257.89746: done processing included file 15330 1726882257.89748: iterating over new_blocks loaded from include file 15330 1726882257.89749: in VariableManager get_vars() 15330 1726882257.89769: done with get_vars() 15330 1726882257.89771: filtering new block on tags 15330 1726882257.89786: done filtering new block on tags 15330 1726882257.89789: in VariableManager get_vars() 15330 1726882257.89810: done with get_vars() 15330 1726882257.89811: filtering new block on tags 15330 1726882257.89829: done filtering new block on tags 15330 1726882257.89832: in VariableManager get_vars() 15330 1726882257.89850: done with get_vars() 15330 1726882257.89851: filtering new block on tags 15330 1726882257.89866: done filtering new block on tags 15330 1726882257.89868: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 15330 1726882257.89873: extending task lists for 
all hosts with included blocks 15330 1726882257.90625: done extending task lists 15330 1726882257.90627: done processing included files 15330 1726882257.90627: results queue empty 15330 1726882257.90628: checking for any_errors_fatal 15330 1726882257.90634: done checking for any_errors_fatal 15330 1726882257.90635: checking for max_fail_percentage 15330 1726882257.90636: done checking for max_fail_percentage 15330 1726882257.90637: checking to see if all hosts have failed and the running result is not ok 15330 1726882257.90638: done checking to see if all hosts have failed 15330 1726882257.90639: getting the remaining hosts for this loop 15330 1726882257.90640: done getting the remaining hosts for this loop 15330 1726882257.90642: getting the next task for host managed_node3 15330 1726882257.90646: done getting next task for host managed_node3 15330 1726882257.90649: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15330 1726882257.90651: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882257.90660: getting variables 15330 1726882257.90661: in VariableManager get_vars() 15330 1726882257.90673: Calling all_inventory to load vars for managed_node3 15330 1726882257.90674: Calling groups_inventory to load vars for managed_node3 15330 1726882257.90676: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882257.90681: Calling all_plugins_play to load vars for managed_node3 15330 1726882257.90684: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882257.90686: Calling groups_plugins_play to load vars for managed_node3 15330 1726882257.91060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.91467: done with get_vars() 15330 1726882257.91476: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:30:57 -0400 (0:00:00.051) 0:00:07.123 ****** 15330 1726882257.91702: entering _queue_task() for managed_node3/setup 15330 1726882257.92185: worker is 1 (out of 1 available) 15330 1726882257.92202: exiting _queue_task() for managed_node3/setup 15330 1726882257.92214: done queuing things up, now waiting for results queue to drain 15330 1726882257.92215: waiting for pending results... 
15330 1726882257.92411: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15330 1726882257.92529: in run() - task 12673a56-9f93-e4fe-1358-00000000018d 15330 1726882257.92549: variable 'ansible_search_path' from source: unknown 15330 1726882257.92556: variable 'ansible_search_path' from source: unknown 15330 1726882257.92592: calling self._execute() 15330 1726882257.92683: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882257.92698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882257.92725: variable 'omit' from source: magic vars 15330 1726882257.93153: variable 'ansible_distribution_major_version' from source: facts 15330 1726882257.93157: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882257.93329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882257.95570: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882257.95642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882257.95692: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882257.95738: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882257.95777: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882257.95886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882257.95909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882257.95941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882257.96100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882257.96105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882257.96108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882257.96110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882257.96135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882257.96179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882257.96207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882257.96392: variable '__network_required_facts' from source: role 
'' defaults 15330 1726882257.96411: variable 'ansible_facts' from source: unknown 15330 1726882257.96518: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15330 1726882257.96527: when evaluation is False, skipping this task 15330 1726882257.96540: _execute() done 15330 1726882257.96645: dumping result to json 15330 1726882257.96649: done dumping result, returning 15330 1726882257.96653: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-e4fe-1358-00000000018d] 15330 1726882257.96655: sending task result for task 12673a56-9f93-e4fe-1358-00000000018d 15330 1726882257.96721: done sending task result for task 12673a56-9f93-e4fe-1358-00000000018d 15330 1726882257.96724: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882257.96770: no more pending results, returning what we have 15330 1726882257.96774: results queue empty 15330 1726882257.96775: checking for any_errors_fatal 15330 1726882257.96777: done checking for any_errors_fatal 15330 1726882257.96778: checking for max_fail_percentage 15330 1726882257.96779: done checking for max_fail_percentage 15330 1726882257.96781: checking to see if all hosts have failed and the running result is not ok 15330 1726882257.96781: done checking to see if all hosts have failed 15330 1726882257.96782: getting the remaining hosts for this loop 15330 1726882257.96784: done getting the remaining hosts for this loop 15330 1726882257.96787: getting the next task for host managed_node3 15330 1726882257.96800: done getting next task for host managed_node3 15330 1726882257.96804: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15330 1726882257.96807: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882257.96822: getting variables 15330 1726882257.96824: in VariableManager get_vars() 15330 1726882257.96863: Calling all_inventory to load vars for managed_node3 15330 1726882257.96865: Calling groups_inventory to load vars for managed_node3 15330 1726882257.96867: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882257.96878: Calling all_plugins_play to load vars for managed_node3 15330 1726882257.96880: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882257.96882: Calling groups_plugins_play to load vars for managed_node3 15330 1726882257.97434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882257.97650: done with get_vars() 15330 1726882257.97666: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:30:57 -0400 (0:00:00.060) 0:00:07.183 ****** 15330 1726882257.97771: entering _queue_task() for managed_node3/stat 15330 1726882257.98037: worker is 1 (out of 1 available) 15330 1726882257.98049: exiting _queue_task() for managed_node3/stat 15330 1726882257.98061: done queuing things up, now waiting for results queue to drain 15330 1726882257.98063: waiting for pending results... 
15330 1726882257.98436: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 15330 1726882257.98501: in run() - task 12673a56-9f93-e4fe-1358-00000000018f 15330 1726882257.98505: variable 'ansible_search_path' from source: unknown 15330 1726882257.98508: variable 'ansible_search_path' from source: unknown 15330 1726882257.98516: calling self._execute() 15330 1726882257.98607: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882257.98620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882257.98643: variable 'omit' from source: magic vars 15330 1726882257.99012: variable 'ansible_distribution_major_version' from source: facts 15330 1726882257.99070: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882257.99212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882257.99557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882257.99615: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882257.99654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882257.99721: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882257.99786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882257.99826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882257.99860: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882257.99938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882257.99989: variable '__network_is_ostree' from source: set_fact 15330 1726882258.00006: Evaluated conditional (not __network_is_ostree is defined): False 15330 1726882258.00045: when evaluation is False, skipping this task 15330 1726882258.00053: _execute() done 15330 1726882258.00056: dumping result to json 15330 1726882258.00058: done dumping result, returning 15330 1726882258.00061: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-e4fe-1358-00000000018f] 15330 1726882258.00064: sending task result for task 12673a56-9f93-e4fe-1358-00000000018f skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15330 1726882258.00208: no more pending results, returning what we have 15330 1726882258.00212: results queue empty 15330 1726882258.00213: checking for any_errors_fatal 15330 1726882258.00217: done checking for any_errors_fatal 15330 1726882258.00218: checking for max_fail_percentage 15330 1726882258.00219: done checking for max_fail_percentage 15330 1726882258.00220: checking to see if all hosts have failed and the running result is not ok 15330 1726882258.00221: done checking to see if all hosts have failed 15330 1726882258.00222: getting the remaining hosts for this loop 15330 1726882258.00224: done getting the remaining hosts for this loop 15330 1726882258.00227: getting the next task for host managed_node3 15330 1726882258.00234: done getting next task for host managed_node3 15330 
1726882258.00238: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15330 1726882258.00241: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882258.00253: getting variables 15330 1726882258.00255: in VariableManager get_vars() 15330 1726882258.00415: Calling all_inventory to load vars for managed_node3 15330 1726882258.00418: Calling groups_inventory to load vars for managed_node3 15330 1726882258.00422: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882258.00433: Calling all_plugins_play to load vars for managed_node3 15330 1726882258.00435: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882258.00438: Calling groups_plugins_play to load vars for managed_node3 15330 1726882258.00946: done sending task result for task 12673a56-9f93-e4fe-1358-00000000018f 15330 1726882258.00949: WORKER PROCESS EXITING 15330 1726882258.00970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882258.01179: done with get_vars() 15330 1726882258.01189: done getting variables 15330 1726882258.01252: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
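Annotation: the `false_condition` reported above, `not __network_is_ostree is defined`, is why both the `stat` check and the upcoming `set_fact` task skip: once any earlier run has set the fact, the Jinja `is defined` test succeeds and its negation fails. A sketch of that test in plain Python (treating the task's variable namespace as a dict, which is how `is defined` effectively behaves):

```python
def should_run(task_vars):
    # Jinja's `__network_is_ostree is defined` reduces to key presence in the
    # variable namespace; the task's `when:` negates it.
    return not ("__network_is_ostree" in task_vars)

print(should_run({}))                              # True  -> task would run
print(should_run({"__network_is_ostree": False}))  # False -> skipped, as logged
```

Note that the *value* of the fact is irrelevant here; even `__network_is_ostree: false` counts as defined, so the check runs at most once per play.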
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:30:58 -0400 (0:00:00.035) 0:00:07.218 ****** 15330 1726882258.01290: entering _queue_task() for managed_node3/set_fact 15330 1726882258.01552: worker is 1 (out of 1 available) 15330 1726882258.01565: exiting _queue_task() for managed_node3/set_fact 15330 1726882258.01577: done queuing things up, now waiting for results queue to drain 15330 1726882258.01578: waiting for pending results... 15330 1726882258.01835: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15330 1726882258.01956: in run() - task 12673a56-9f93-e4fe-1358-000000000190 15330 1726882258.01976: variable 'ansible_search_path' from source: unknown 15330 1726882258.01985: variable 'ansible_search_path' from source: unknown 15330 1726882258.02038: calling self._execute() 15330 1726882258.02129: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882258.02146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882258.02162: variable 'omit' from source: magic vars 15330 1726882258.02537: variable 'ansible_distribution_major_version' from source: facts 15330 1726882258.02553: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882258.02725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882258.03010: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882258.03057: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882258.03112: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 
1726882258.03141: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882258.03329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882258.03333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882258.03336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882258.03338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882258.03424: variable '__network_is_ostree' from source: set_fact 15330 1726882258.03446: Evaluated conditional (not __network_is_ostree is defined): False 15330 1726882258.03455: when evaluation is False, skipping this task 15330 1726882258.03463: _execute() done 15330 1726882258.03470: dumping result to json 15330 1726882258.03479: done dumping result, returning 15330 1726882258.03491: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-e4fe-1358-000000000190] 15330 1726882258.03507: sending task result for task 12673a56-9f93-e4fe-1358-000000000190 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15330 1726882258.03702: no more pending results, returning what we have 15330 1726882258.03705: results queue empty 15330 1726882258.03707: checking for any_errors_fatal 15330 1726882258.03713: done checking 
for any_errors_fatal 15330 1726882258.03714: checking for max_fail_percentage 15330 1726882258.03716: done checking for max_fail_percentage 15330 1726882258.03717: checking to see if all hosts have failed and the running result is not ok 15330 1726882258.03717: done checking to see if all hosts have failed 15330 1726882258.03718: getting the remaining hosts for this loop 15330 1726882258.03719: done getting the remaining hosts for this loop 15330 1726882258.03723: getting the next task for host managed_node3 15330 1726882258.03731: done getting next task for host managed_node3 15330 1726882258.03734: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15330 1726882258.03737: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882258.03749: getting variables 15330 1726882258.03751: in VariableManager get_vars() 15330 1726882258.03788: Calling all_inventory to load vars for managed_node3 15330 1726882258.03790: Calling groups_inventory to load vars for managed_node3 15330 1726882258.03796: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882258.03807: Calling all_plugins_play to load vars for managed_node3 15330 1726882258.03811: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882258.03814: Calling groups_plugins_play to load vars for managed_node3 15330 1726882258.04182: done sending task result for task 12673a56-9f93-e4fe-1358-000000000190 15330 1726882258.04185: WORKER PROCESS EXITING 15330 1726882258.04211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882258.04428: done with get_vars() 15330 1726882258.04444: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:30:58 -0400 (0:00:00.032) 0:00:07.251 ****** 15330 1726882258.04534: entering _queue_task() for managed_node3/service_facts 15330 1726882258.04536: Creating lock for service_facts 15330 1726882258.05107: worker is 1 (out of 1 available) 15330 1726882258.05118: exiting _queue_task() for managed_node3/service_facts 15330 1726882258.05130: done queuing things up, now waiting for results queue to drain 15330 1726882258.05132: waiting for pending results... 
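Annotation: unlike the two skipped tasks, `service_facts` actually executes, so the worker below goes through Ansible's usual remote pipeline: open the SSH connection, `mkdir -p` a uniquely named temp directory, sftp the AnsiballZ payload into it, `chmod u+x`, then run it with the remote Python. The directory name visible in the `mkdir` command (`ansible-tmp-1726882258.1305478-15721-265675861812918`) follows a time-pid-random pattern; a rough sketch of that naming (an assumption based on the logged name, not Ansible's actual shell-plugin code):

```python
import random
import time

def remote_tmp_name(pid):
    # Approximates the ansible-tmp-<epoch>-<pid>-<random> pattern seen in the
    # mkdir/sftp commands of this log (hypothetical helper for illustration).
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2 ** 48))

name = remote_tmp_name(15721)
print(name.startswith("ansible-tmp-"))  # True
print(len(name.split("-")) >= 4)        # True: prefix + time + pid + random
```

The randomized suffix keeps concurrent workers (and retries) from colliding in `/root/.ansible/tmp`, and the `umask 77` in the logged command ensures the directory is created mode 0700.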
15330 1726882258.05575: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 15330 1726882258.05868: in run() - task 12673a56-9f93-e4fe-1358-000000000192 15330 1726882258.05889: variable 'ansible_search_path' from source: unknown 15330 1726882258.05941: variable 'ansible_search_path' from source: unknown 15330 1726882258.05981: calling self._execute() 15330 1726882258.06132: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882258.06197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882258.06213: variable 'omit' from source: magic vars 15330 1726882258.07103: variable 'ansible_distribution_major_version' from source: facts 15330 1726882258.07107: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882258.07110: variable 'omit' from source: magic vars 15330 1726882258.07181: variable 'omit' from source: magic vars 15330 1726882258.07379: variable 'omit' from source: magic vars 15330 1726882258.07383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882258.07582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882258.07810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882258.07816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882258.07819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882258.07821: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882258.07823: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882258.07825: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 15330 1726882258.08091: Set connection var ansible_pipelining to False 15330 1726882258.08155: Set connection var ansible_timeout to 10 15330 1726882258.08164: Set connection var ansible_connection to ssh 15330 1726882258.08170: Set connection var ansible_shell_type to sh 15330 1726882258.08180: Set connection var ansible_shell_executable to /bin/sh 15330 1726882258.08375: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882258.08379: variable 'ansible_shell_executable' from source: unknown 15330 1726882258.08382: variable 'ansible_connection' from source: unknown 15330 1726882258.08385: variable 'ansible_module_compression' from source: unknown 15330 1726882258.08387: variable 'ansible_shell_type' from source: unknown 15330 1726882258.08389: variable 'ansible_shell_executable' from source: unknown 15330 1726882258.08391: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882258.08398: variable 'ansible_pipelining' from source: unknown 15330 1726882258.08400: variable 'ansible_timeout' from source: unknown 15330 1726882258.08402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882258.08943: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882258.09028: variable 'omit' from source: magic vars 15330 1726882258.09045: starting attempt loop 15330 1726882258.09048: running the handler 15330 1726882258.09065: _low_level_execute_command(): starting 15330 1726882258.09077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882258.10549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882258.10699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882258.10789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882258.10811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882258.11027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882258.11127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882258.12768: stdout chunk (state=3): >>>/root <<< 15330 1726882258.12926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882258.12930: stdout chunk (state=3): >>><<< 15330 1726882258.12933: stderr chunk (state=3): >>><<< 15330 1726882258.13155: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882258.13159: _low_level_execute_command(): starting 15330 1726882258.13162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918 `" && echo ansible-tmp-1726882258.1305478-15721-265675861812918="` echo /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918 `" ) && sleep 0' 15330 1726882258.14305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882258.14383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882258.14402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882258.14474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882258.14641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882258.14699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882258.16573: stdout chunk (state=3): >>>ansible-tmp-1726882258.1305478-15721-265675861812918=/root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918 <<< 15330 1726882258.16670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882258.17104: stderr chunk (state=3): >>><<< 15330 1726882258.17107: stdout chunk (state=3): >>><<< 15330 1726882258.17110: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882258.1305478-15721-265675861812918=/root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882258.17113: variable 'ansible_module_compression' from source: unknown 15330 1726882258.17115: ANSIBALLZ: Using lock for service_facts 15330 1726882258.17117: ANSIBALLZ: Acquiring lock 15330 1726882258.17119: ANSIBALLZ: Lock acquired: 140238205914144 15330 1726882258.17121: ANSIBALLZ: Creating module 15330 1726882258.33878: ANSIBALLZ: Writing module into payload 15330 1726882258.33958: ANSIBALLZ: Writing module 15330 1726882258.33972: ANSIBALLZ: Renaming module 15330 1726882258.33983: ANSIBALLZ: Done creating module 15330 1726882258.34009: variable 'ansible_facts' from source: unknown 15330 1726882258.34056: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py 15330 1726882258.34156: Sending initial data 15330 1726882258.34159: Sent initial data (162 bytes) 15330 1726882258.34553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882258.34590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882258.34598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882258.34601: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882258.34603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882258.34606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882258.34635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882258.34648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882258.34704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882258.36312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882258.36484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882258.36562: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpxnuw_n9o /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py <<< 15330 1726882258.36568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py" <<< 15330 1726882258.36609: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpxnuw_n9o" to remote "/root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py" <<< 15330 1726882258.37496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882258.37570: stderr chunk (state=3): >>><<< 15330 1726882258.37602: stdout chunk (state=3): >>><<< 15330 1726882258.37628: done transferring module to remote 15330 1726882258.37637: _low_level_execute_command(): starting 15330 1726882258.37662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/ /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py && sleep 0' 15330 1726882258.38219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882258.38247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882258.38250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882258.38301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882258.38373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882258.40173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882258.40177: stdout chunk (state=3): >>><<< 15330 1726882258.40179: stderr chunk (state=3): >>><<< 15330 1726882258.40246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882258.40250: _low_level_execute_command(): starting 15330 1726882258.40252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/AnsiballZ_service_facts.py && sleep 0' 15330 1726882258.40940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882258.40946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882258.41059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882258.41062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882258.41126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
15330 1726882259.89566: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": 
{"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": 
"sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15330 1726882259.91243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882259.91247: stdout chunk (state=3): >>><<< 15330 1726882259.91249: stderr chunk (state=3): >>><<< 15330 1726882259.91253: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": 
"fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
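The stdout captured above is the JSON payload returned by the `service_facts` module: a `services` mapping keyed by unit name, each entry carrying `name`, `state`, `status`, and `source`. As a minimal illustration (not part of this run — the payload below reproduces only a few of the entries seen in the log), such a result can be filtered for running services like this:

```python
import json

# Hypothetical excerpt of the service_facts payload shown in the log above;
# only three representative entries are reproduced for illustration.
payload = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service": {"name": "sshd.service", "state": "running",
                   "status": "enabled", "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                        "status": "disabled", "source": "systemd"},
  "chronyd.service": {"name": "chronyd.service", "state": "running",
                      "status": "enabled", "source": "systemd"}
}}}
""")

# Collect the names of units whose state is "running", in sorted order.
services = payload["ansible_facts"]["services"]
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['chronyd.service', 'sshd.service']
```

In a playbook the same filtering is typically done in Jinja2 against `ansible_facts.services` rather than in Python; the snippet only sketches the shape of the data structure the role inspects here.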
15330 1726882259.91965: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882259.91974: _low_level_execute_command(): starting 15330 1726882259.91977: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882258.1305478-15721-265675861812918/ > /dev/null 2>&1 && sleep 0' 15330 1726882259.92552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882259.92565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882259.92585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882259.92654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882259.94443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882259.94451: stdout chunk (state=3): >>><<< 15330 1726882259.94460: stderr chunk (state=3): >>><<< 15330 1726882259.94479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882259.94506: handler run complete 15330 1726882259.94747: variable 'ansible_facts' from source: unknown 15330 1726882259.94899: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882259.95139: variable 'ansible_facts' from source: unknown 15330 1726882259.95227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882259.95339: attempt loop complete, returning result 15330 1726882259.95342: _execute() done 15330 1726882259.95345: dumping result to json 15330 1726882259.95384: done dumping result, returning 15330 1726882259.95392: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-e4fe-1358-000000000192] 15330 1726882259.95397: sending task result for task 12673a56-9f93-e4fe-1358-000000000192 15330 1726882259.96101: done sending task result for task 12673a56-9f93-e4fe-1358-000000000192 15330 1726882259.96104: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882259.96141: no more pending results, returning what we have 15330 1726882259.96143: results queue empty 15330 1726882259.96143: checking for any_errors_fatal 15330 1726882259.96145: done checking for any_errors_fatal 15330 1726882259.96146: checking for max_fail_percentage 15330 1726882259.96146: done checking for max_fail_percentage 15330 1726882259.96147: checking to see if all hosts have failed and the running result is not ok 15330 1726882259.96148: done checking to see if all hosts have failed 15330 1726882259.96148: getting the remaining hosts for this loop 15330 1726882259.96149: done getting the remaining hosts for this loop 15330 1726882259.96151: getting the next task for host managed_node3 15330 1726882259.96154: done getting next task for host managed_node3 15330 1726882259.96157: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15330 
1726882259.96158: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882259.96163: getting variables 15330 1726882259.96164: in VariableManager get_vars() 15330 1726882259.96184: Calling all_inventory to load vars for managed_node3 15330 1726882259.96185: Calling groups_inventory to load vars for managed_node3 15330 1726882259.96187: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882259.96195: Calling all_plugins_play to load vars for managed_node3 15330 1726882259.96197: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882259.96199: Calling groups_plugins_play to load vars for managed_node3 15330 1726882259.96411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882259.96816: done with get_vars() 15330 1726882259.96829: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:30:59 -0400 (0:00:01.924) 0:00:09.175 ****** 15330 1726882259.96946: entering _queue_task() for managed_node3/package_facts 15330 1726882259.96948: Creating lock for package_facts 15330 1726882259.97224: worker is 1 (out of 1 available) 15330 1726882259.97240: exiting _queue_task() for managed_node3/package_facts 15330 1726882259.97253: done queuing things up, now 
waiting for results queue to drain 15330 1726882259.97255: waiting for pending results... 15330 1726882259.97472: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 15330 1726882259.97555: in run() - task 12673a56-9f93-e4fe-1358-000000000193 15330 1726882259.97566: variable 'ansible_search_path' from source: unknown 15330 1726882259.97571: variable 'ansible_search_path' from source: unknown 15330 1726882259.97600: calling self._execute() 15330 1726882259.97751: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882259.97757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882259.97761: variable 'omit' from source: magic vars 15330 1726882259.98129: variable 'ansible_distribution_major_version' from source: facts 15330 1726882259.98142: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882259.98151: variable 'omit' from source: magic vars 15330 1726882259.98222: variable 'omit' from source: magic vars 15330 1726882259.98242: variable 'omit' from source: magic vars 15330 1726882259.98274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882259.98401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882259.98406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882259.98411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882259.98418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882259.98448: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882259.98459: variable 'ansible_host' from source: host vars for 
'managed_node3' 15330 1726882259.98472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882259.98605: Set connection var ansible_pipelining to False 15330 1726882259.98633: Set connection var ansible_timeout to 10 15330 1726882259.98647: Set connection var ansible_connection to ssh 15330 1726882259.98650: Set connection var ansible_shell_type to sh 15330 1726882259.98655: Set connection var ansible_shell_executable to /bin/sh 15330 1726882259.98797: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882259.98801: variable 'ansible_shell_executable' from source: unknown 15330 1726882259.98804: variable 'ansible_connection' from source: unknown 15330 1726882259.98806: variable 'ansible_module_compression' from source: unknown 15330 1726882259.98808: variable 'ansible_shell_type' from source: unknown 15330 1726882259.98810: variable 'ansible_shell_executable' from source: unknown 15330 1726882259.98812: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882259.98815: variable 'ansible_pipelining' from source: unknown 15330 1726882259.98817: variable 'ansible_timeout' from source: unknown 15330 1726882259.98819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882259.99033: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882259.99056: variable 'omit' from source: magic vars 15330 1726882259.99066: starting attempt loop 15330 1726882259.99078: running the handler 15330 1726882259.99113: _low_level_execute_command(): starting 15330 1726882259.99116: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882259.99830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882259.99834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882259.99837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882259.99839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882259.99842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882259.99846: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882259.99940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882259.99992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882260.01564: stdout chunk (state=3): >>>/root <<< 15330 1726882260.01731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882260.01831: stderr chunk (state=3): >>><<< 15330 1726882260.01835: stdout chunk (state=3): >>><<< 15330 1726882260.01859: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882260.01875: _low_level_execute_command(): starting 15330 1726882260.01881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704 `" && echo ansible-tmp-1726882260.0186226-15808-156320910480704="` echo /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704 `" ) && sleep 0' 15330 1726882260.02350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882260.02353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882260.02356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15330 1726882260.02366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882260.02369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882260.02418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882260.02423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882260.02467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882260.04284: stdout chunk (state=3): >>>ansible-tmp-1726882260.0186226-15808-156320910480704=/root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704 <<< 15330 1726882260.04407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882260.04448: stderr chunk (state=3): >>><<< 15330 1726882260.04452: stdout chunk (state=3): >>><<< 15330 1726882260.04517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882260.0186226-15808-156320910480704=/root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882260.04664: variable 'ansible_module_compression' from source: unknown 15330 1726882260.04667: ANSIBALLZ: Using lock for package_facts 15330 1726882260.04670: ANSIBALLZ: Acquiring lock 15330 1726882260.04672: ANSIBALLZ: Lock acquired: 140238206202320 15330 1726882260.04676: ANSIBALLZ: Creating module 15330 1726882260.38539: ANSIBALLZ: Writing module into payload 15330 1726882260.38683: ANSIBALLZ: Writing module 15330 1726882260.38722: ANSIBALLZ: Renaming module 15330 1726882260.38725: ANSIBALLZ: Done creating module 15330 1726882260.38750: variable 'ansible_facts' from source: unknown 15330 1726882260.38918: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py 15330 1726882260.39021: Sending initial data 15330 1726882260.39024: Sent initial data (162 bytes) 15330 1726882260.39623: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882260.39659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882260.39673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882260.39703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882260.39767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882260.41412: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882260.41497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882260.41534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp9tqouj63 /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py <<< 15330 1726882260.41541: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py" <<< 15330 1726882260.41588: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp9tqouj63" to remote "/root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py" <<< 15330 1726882260.41597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py" <<< 15330 1726882260.43321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882260.43353: stderr chunk (state=3): >>><<< 15330 1726882260.43356: stdout chunk (state=3): >>><<< 15330 1726882260.43473: done transferring module to remote 15330 1726882260.43479: _low_level_execute_command(): starting 15330 1726882260.43482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/ /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py && sleep 0' 15330 1726882260.44400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882260.44408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882260.44481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882260.46207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882260.46256: stderr chunk (state=3): >>><<< 15330 1726882260.46262: stdout chunk (state=3): >>><<< 15330 1726882260.46278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882260.46283: _low_level_execute_command(): starting 15330 1726882260.46292: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/AnsiballZ_package_facts.py && sleep 0' 15330 1726882260.46753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882260.46757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882260.46770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882260.46833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882260.46840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882260.46842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 
1726882260.46892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882260.90739: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": 
[{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": 
"9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": 
"c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": 
"1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", 
"version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs":
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch":
"noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 15330 1726882260.90806: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": 
[{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", 
"version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": 
"2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": 
[{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15330 1726882260.92513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882260.92600: stderr chunk (state=3): >>><<< 15330 1726882260.92603: stdout chunk (state=3): >>><<< 15330 1726882260.92825: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", 
"release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", 
"version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": 
[{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": 
[{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", 
"version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": 
"pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": 
"langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", 
"version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882260.95329: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882260.95381: _low_level_execute_command(): starting 15330 1726882260.95385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882260.0186226-15808-156320910480704/ > /dev/null 2>&1 && sleep 0' 15330 1726882260.96019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882260.96023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882260.96025: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882260.96028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882260.96030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882260.96090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882260.96100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882260.96148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882260.97944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882260.97974: stderr chunk (state=3): >>><<< 15330 1726882260.97977: stdout chunk (state=3): >>><<< 15330 1726882260.97994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882260.97998: handler run complete 15330 1726882260.98496: variable 'ansible_facts' from source: unknown 15330 1726882260.98833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.00535: variable 'ansible_facts' from source: unknown 15330 1726882261.01199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.01768: attempt loop complete, returning result 15330 1726882261.01781: _execute() done 15330 1726882261.01784: dumping result to json 15330 1726882261.01977: done dumping result, returning 15330 1726882261.01990: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-e4fe-1358-000000000193] 15330 1726882261.01995: sending task result for task 12673a56-9f93-e4fe-1358-000000000193 15330 1726882261.03302: done sending task result for task 12673a56-9f93-e4fe-1358-000000000193 15330 1726882261.03306: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882261.03347: no more pending results, returning what we have 15330 1726882261.03348: results queue empty 15330 1726882261.03349: checking for any_errors_fatal 15330 1726882261.03352: done checking for any_errors_fatal 15330 1726882261.03352: checking for max_fail_percentage 15330 
1726882261.03353: done checking for max_fail_percentage 15330 1726882261.03354: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.03354: done checking to see if all hosts have failed 15330 1726882261.03355: getting the remaining hosts for this loop 15330 1726882261.03356: done getting the remaining hosts for this loop 15330 1726882261.03358: getting the next task for host managed_node3 15330 1726882261.03362: done getting next task for host managed_node3 15330 1726882261.03364: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15330 1726882261.03366: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882261.03371: getting variables 15330 1726882261.03372: in VariableManager get_vars() 15330 1726882261.03396: Calling all_inventory to load vars for managed_node3 15330 1726882261.03398: Calling groups_inventory to load vars for managed_node3 15330 1726882261.03399: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.03406: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.03407: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.03409: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.04179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.05627: done with get_vars() 15330 1726882261.05649: done getting variables 15330 1726882261.05707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:01 -0400 (0:00:01.087) 0:00:10.263 ****** 15330 1726882261.05734: entering _queue_task() for managed_node3/debug 15330 1726882261.06058: worker is 1 (out of 1 available) 15330 1726882261.06075: exiting _queue_task() for managed_node3/debug 15330 1726882261.06089: done queuing things up, now waiting for results queue to drain 15330 1726882261.06090: waiting for pending results... 15330 1726882261.06511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 15330 1726882261.06516: in run() - task 12673a56-9f93-e4fe-1358-000000000015 15330 1726882261.06519: variable 'ansible_search_path' from source: unknown 15330 1726882261.06521: variable 'ansible_search_path' from source: unknown 15330 1726882261.06534: calling self._execute() 15330 1726882261.06628: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.06645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.06660: variable 'omit' from source: magic vars 15330 1726882261.07072: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.07076: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.07078: variable 'omit' from source: magic vars 15330 1726882261.07103: variable 'omit' from source: magic vars 15330 1726882261.07194: variable 'network_provider' from source: set_fact 15330 1726882261.07216: variable 'omit' from source: magic vars 15330 1726882261.07256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 
1726882261.07300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882261.07399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882261.07403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882261.07406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882261.07408: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882261.07411: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.07413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.07509: Set connection var ansible_pipelining to False 15330 1726882261.07527: Set connection var ansible_timeout to 10 15330 1726882261.07534: Set connection var ansible_connection to ssh 15330 1726882261.07540: Set connection var ansible_shell_type to sh 15330 1726882261.07549: Set connection var ansible_shell_executable to /bin/sh 15330 1726882261.07558: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882261.07580: variable 'ansible_shell_executable' from source: unknown 15330 1726882261.07588: variable 'ansible_connection' from source: unknown 15330 1726882261.07596: variable 'ansible_module_compression' from source: unknown 15330 1726882261.07602: variable 'ansible_shell_type' from source: unknown 15330 1726882261.07608: variable 'ansible_shell_executable' from source: unknown 15330 1726882261.07618: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.07625: variable 'ansible_pipelining' from source: unknown 15330 1726882261.07632: variable 'ansible_timeout' from source: unknown 15330 1726882261.07724: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 15330 1726882261.07780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882261.07798: variable 'omit' from source: magic vars 15330 1726882261.07807: starting attempt loop 15330 1726882261.07813: running the handler 15330 1726882261.07863: handler run complete 15330 1726882261.07882: attempt loop complete, returning result 15330 1726882261.07889: _execute() done 15330 1726882261.07897: dumping result to json 15330 1726882261.07905: done dumping result, returning 15330 1726882261.07917: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-e4fe-1358-000000000015] 15330 1726882261.07926: sending task result for task 12673a56-9f93-e4fe-1358-000000000015 ok: [managed_node3] => {} MSG: Using network provider: nm 15330 1726882261.08105: no more pending results, returning what we have 15330 1726882261.08108: results queue empty 15330 1726882261.08110: checking for any_errors_fatal 15330 1726882261.08119: done checking for any_errors_fatal 15330 1726882261.08120: checking for max_fail_percentage 15330 1726882261.08122: done checking for max_fail_percentage 15330 1726882261.08126: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.08127: done checking to see if all hosts have failed 15330 1726882261.08128: getting the remaining hosts for this loop 15330 1726882261.08129: done getting the remaining hosts for this loop 15330 1726882261.08133: getting the next task for host managed_node3 15330 1726882261.08140: done getting next task for host managed_node3 15330 1726882261.08144: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 15330 1726882261.08147: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882261.08157: getting variables 15330 1726882261.08159: in VariableManager get_vars() 15330 1726882261.08198: Calling all_inventory to load vars for managed_node3 15330 1726882261.08201: Calling groups_inventory to load vars for managed_node3 15330 1726882261.08204: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.08215: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.08219: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.08222: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.08907: done sending task result for task 12673a56-9f93-e4fe-1358-000000000015 15330 1726882261.08910: WORKER PROCESS EXITING 15330 1726882261.09830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.12060: done with get_vars() 15330 1726882261.12091: done getting variables 15330 1726882261.12369: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:31:01 -0400 
(0:00:00.066) 0:00:10.330 ****** 15330 1726882261.12400: entering _queue_task() for managed_node3/fail 15330 1726882261.12402: Creating lock for fail 15330 1726882261.12836: worker is 1 (out of 1 available) 15330 1726882261.12849: exiting _queue_task() for managed_node3/fail 15330 1726882261.12863: done queuing things up, now waiting for results queue to drain 15330 1726882261.12865: waiting for pending results... 15330 1726882261.13102: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15330 1726882261.13202: in run() - task 12673a56-9f93-e4fe-1358-000000000016 15330 1726882261.13228: variable 'ansible_search_path' from source: unknown 15330 1726882261.13235: variable 'ansible_search_path' from source: unknown 15330 1726882261.13274: calling self._execute() 15330 1726882261.13376: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.13389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.13407: variable 'omit' from source: magic vars 15330 1726882261.13796: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.13815: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.13939: variable 'network_state' from source: role '' defaults 15330 1726882261.13955: Evaluated conditional (network_state != {}): False 15330 1726882261.13962: when evaluation is False, skipping this task 15330 1726882261.13970: _execute() done 15330 1726882261.13981: dumping result to json 15330 1726882261.13990: done dumping result, returning 15330 1726882261.14004: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-e4fe-1358-000000000016] 15330 
1726882261.14015: sending task result for task 12673a56-9f93-e4fe-1358-000000000016 15330 1726882261.14199: done sending task result for task 12673a56-9f93-e4fe-1358-000000000016 15330 1726882261.14202: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882261.14255: no more pending results, returning what we have 15330 1726882261.14260: results queue empty 15330 1726882261.14261: checking for any_errors_fatal 15330 1726882261.14267: done checking for any_errors_fatal 15330 1726882261.14268: checking for max_fail_percentage 15330 1726882261.14270: done checking for max_fail_percentage 15330 1726882261.14271: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.14272: done checking to see if all hosts have failed 15330 1726882261.14273: getting the remaining hosts for this loop 15330 1726882261.14274: done getting the remaining hosts for this loop 15330 1726882261.14278: getting the next task for host managed_node3 15330 1726882261.14287: done getting next task for host managed_node3 15330 1726882261.14291: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15330 1726882261.14296: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.14311: getting variables 15330 1726882261.14313: in VariableManager get_vars() 15330 1726882261.14596: Calling all_inventory to load vars for managed_node3 15330 1726882261.14600: Calling groups_inventory to load vars for managed_node3 15330 1726882261.14603: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.14613: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.14615: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.14618: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.16647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.19407: done with get_vars() 15330 1726882261.19435: done getting variables 15330 1726882261.19495: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:31:01 -0400 (0:00:00.071) 0:00:10.401 ****** 15330 1726882261.19525: entering _queue_task() for managed_node3/fail 15330 1726882261.19841: worker is 1 (out of 1 available) 15330 1726882261.19853: exiting _queue_task() for managed_node3/fail 15330 1726882261.19865: done queuing things up, now waiting for results queue to drain 15330 1726882261.19866: waiting for pending results... 
15330 1726882261.20312: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15330 1726882261.20316: in run() - task 12673a56-9f93-e4fe-1358-000000000017 15330 1726882261.20319: variable 'ansible_search_path' from source: unknown 15330 1726882261.20321: variable 'ansible_search_path' from source: unknown 15330 1726882261.20324: calling self._execute() 15330 1726882261.20380: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.20449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.20466: variable 'omit' from source: magic vars 15330 1726882261.21177: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.21524: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.21578: variable 'network_state' from source: role '' defaults 15330 1726882261.21596: Evaluated conditional (network_state != {}): False 15330 1726882261.21608: when evaluation is False, skipping this task 15330 1726882261.21631: _execute() done 15330 1726882261.21646: dumping result to json 15330 1726882261.21655: done dumping result, returning 15330 1726882261.21667: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-e4fe-1358-000000000017] 15330 1726882261.21677: sending task result for task 12673a56-9f93-e4fe-1358-000000000017 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882261.21872: no more pending results, returning what we have 15330 1726882261.21876: results queue empty 15330 1726882261.21877: checking for any_errors_fatal 15330 1726882261.21887: done checking for any_errors_fatal 
15330 1726882261.21888: checking for max_fail_percentage 15330 1726882261.21890: done checking for max_fail_percentage 15330 1726882261.21891: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.21894: done checking to see if all hosts have failed 15330 1726882261.21895: getting the remaining hosts for this loop 15330 1726882261.21896: done getting the remaining hosts for this loop 15330 1726882261.21900: getting the next task for host managed_node3 15330 1726882261.21907: done getting next task for host managed_node3 15330 1726882261.21911: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15330 1726882261.21914: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.21928: getting variables 15330 1726882261.21930: in VariableManager get_vars() 15330 1726882261.21969: Calling all_inventory to load vars for managed_node3 15330 1726882261.21972: Calling groups_inventory to load vars for managed_node3 15330 1726882261.21974: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.21987: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.21990: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.22299: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.23006: done sending task result for task 12673a56-9f93-e4fe-1358-000000000017 15330 1726882261.23010: WORKER PROCESS EXITING 15330 1726882261.24239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.27608: done with get_vars() 15330 1726882261.27641: done getting variables 15330 1726882261.27704: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:31:01 -0400 (0:00:00.082) 0:00:10.483 ****** 15330 1726882261.27735: entering _queue_task() for managed_node3/fail 15330 1726882261.28549: worker is 1 (out of 1 available) 15330 1726882261.28561: exiting _queue_task() for managed_node3/fail 15330 1726882261.28573: done queuing things up, now waiting for results queue to drain 15330 1726882261.28574: waiting for pending results... 
15330 1726882261.29145: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15330 1726882261.29504: in run() - task 12673a56-9f93-e4fe-1358-000000000018 15330 1726882261.29508: variable 'ansible_search_path' from source: unknown 15330 1726882261.29511: variable 'ansible_search_path' from source: unknown 15330 1726882261.29515: calling self._execute() 15330 1726882261.29576: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.29650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.29665: variable 'omit' from source: magic vars 15330 1726882261.30319: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.30337: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.30526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882261.34340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882261.34427: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882261.34473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882261.34520: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882261.34572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882261.34623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.34648: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.34665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.34697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.34709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.34798: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.34838: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15330 1726882261.35019: variable 'ansible_distribution' from source: facts 15330 1726882261.35023: variable '__network_rh_distros' from source: role '' defaults 15330 1726882261.35026: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15330 1726882261.35287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.35395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.35398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 
1726882261.35401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.35404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.35499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.35503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.35507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.35553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.35573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.35620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.35648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15330 1726882261.35675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.35719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.35737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.36059: variable 'network_connections' from source: play vars 15330 1726882261.36125: variable 'interface' from source: set_fact 15330 1726882261.36204: variable 'interface' from source: set_fact 15330 1726882261.36208: variable 'interface' from source: set_fact 15330 1726882261.36248: variable 'interface' from source: set_fact 15330 1726882261.36260: variable 'network_state' from source: role '' defaults 15330 1726882261.36358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882261.36570: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882261.36574: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882261.36607: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882261.36624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882261.36673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882261.36799: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882261.36802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.36805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882261.36820: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15330 1726882261.36832: when evaluation is False, skipping this task 15330 1726882261.36839: _execute() done 15330 1726882261.36846: dumping result to json 15330 1726882261.36853: done dumping result, returning 15330 1726882261.36864: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-e4fe-1358-000000000018] 15330 1726882261.36871: sending task result for task 12673a56-9f93-e4fe-1358-000000000018 15330 1726882261.37102: done sending task result for task 12673a56-9f93-e4fe-1358-000000000018 15330 1726882261.37106: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15330 
1726882261.37156: no more pending results, returning what we have 15330 1726882261.37161: results queue empty 15330 1726882261.37162: checking for any_errors_fatal 15330 1726882261.37167: done checking for any_errors_fatal 15330 1726882261.37168: checking for max_fail_percentage 15330 1726882261.37170: done checking for max_fail_percentage 15330 1726882261.37171: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.37172: done checking to see if all hosts have failed 15330 1726882261.37172: getting the remaining hosts for this loop 15330 1726882261.37174: done getting the remaining hosts for this loop 15330 1726882261.37180: getting the next task for host managed_node3 15330 1726882261.37189: done getting next task for host managed_node3 15330 1726882261.37196: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15330 1726882261.37198: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.37213: getting variables 15330 1726882261.37216: in VariableManager get_vars() 15330 1726882261.37255: Calling all_inventory to load vars for managed_node3 15330 1726882261.37258: Calling groups_inventory to load vars for managed_node3 15330 1726882261.37260: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.37272: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.37275: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.37278: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.38600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.40358: done with get_vars() 15330 1726882261.40399: done getting variables 15330 1726882261.40508: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:31:01 -0400 (0:00:00.128) 0:00:10.611 ****** 15330 1726882261.40538: entering _queue_task() for managed_node3/dnf 15330 1726882261.40900: worker is 1 (out of 1 available) 15330 1726882261.40915: exiting _queue_task() for managed_node3/dnf 15330 1726882261.40929: done queuing things up, now waiting for results queue to drain 15330 1726882261.40931: waiting for pending results... 
15330 1726882261.41171: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15330 1726882261.41226: in run() - task 12673a56-9f93-e4fe-1358-000000000019 15330 1726882261.41238: variable 'ansible_search_path' from source: unknown 15330 1726882261.41242: variable 'ansible_search_path' from source: unknown 15330 1726882261.41271: calling self._execute() 15330 1726882261.41341: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.41347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.41357: variable 'omit' from source: magic vars 15330 1726882261.41655: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.41698: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.41866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882261.43997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882261.44200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882261.44204: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882261.44206: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882261.44208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882261.44254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.44288: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.44321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.44364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.44383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.44511: variable 'ansible_distribution' from source: facts 15330 1726882261.44521: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.44540: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15330 1726882261.44657: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882261.44751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.44769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.44798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.44821: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.44831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.44862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.44877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.44897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.44922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.44933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.44963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.44977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 
1726882261.44996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.45021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.45031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.45138: variable 'network_connections' from source: play vars 15330 1726882261.45148: variable 'interface' from source: set_fact 15330 1726882261.45204: variable 'interface' from source: set_fact 15330 1726882261.45294: variable 'interface' from source: set_fact 15330 1726882261.45297: variable 'interface' from source: set_fact 15330 1726882261.45307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882261.45442: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882261.45469: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882261.45494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882261.45516: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882261.45549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882261.45565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882261.45586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.45607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882261.45653: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882261.45804: variable 'network_connections' from source: play vars 15330 1726882261.45807: variable 'interface' from source: set_fact 15330 1726882261.45853: variable 'interface' from source: set_fact 15330 1726882261.45858: variable 'interface' from source: set_fact 15330 1726882261.45903: variable 'interface' from source: set_fact 15330 1726882261.45927: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882261.45931: when evaluation is False, skipping this task 15330 1726882261.45933: _execute() done 15330 1726882261.45937: dumping result to json 15330 1726882261.45939: done dumping result, returning 15330 1726882261.45948: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000019] 15330 1726882261.45950: sending task result for task 12673a56-9f93-e4fe-1358-000000000019 15330 1726882261.46040: done sending task result for task 12673a56-9f93-e4fe-1358-000000000019 15330 1726882261.46043: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15330 1726882261.46126: no more pending results, returning what we have 15330 1726882261.46130: results queue empty 15330 1726882261.46131: checking for any_errors_fatal 15330 1726882261.46136: done checking for any_errors_fatal 15330 1726882261.46136: checking for max_fail_percentage 15330 1726882261.46138: done checking for max_fail_percentage 15330 1726882261.46139: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.46140: done checking to see if all hosts have failed 15330 1726882261.46140: getting the remaining hosts for this loop 15330 1726882261.46142: done getting the remaining hosts for this loop 15330 1726882261.46145: getting the next task for host managed_node3 15330 1726882261.46150: done getting next task for host managed_node3 15330 1726882261.46154: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15330 1726882261.46158: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.46170: getting variables 15330 1726882261.46172: in VariableManager get_vars() 15330 1726882261.46213: Calling all_inventory to load vars for managed_node3 15330 1726882261.46216: Calling groups_inventory to load vars for managed_node3 15330 1726882261.46217: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.46226: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.46228: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.46230: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.47462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.48458: done with get_vars() 15330 1726882261.48476: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15330 1726882261.48534: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:01 -0400 (0:00:00.080) 0:00:10.691 ****** 15330 1726882261.48556: entering _queue_task() for managed_node3/yum 15330 1726882261.48557: Creating lock for yum 15330 1726882261.48815: worker is 1 (out of 1 available) 15330 1726882261.48828: exiting _queue_task() for managed_node3/yum 15330 1726882261.48841: done queuing things up, now waiting for results queue to drain 15330 1726882261.48842: waiting for pending results... 
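[editor's note] At this point the trace shows `ansible.builtin.yum` being redirected to `ansible.builtin.dnf` and the task defined at `roles/network/tasks/main.yml:48` entering the queue. As a rough illustration only — the task name and the two conditions are taken from the `false_condition` fields reported in this trace, while the module arguments are guesses, not the role's actual source — a version-gated check of this shape might look like:

```yaml
# Hypothetical reconstruction for illustration, NOT the actual role source.
# Both conditions appear as false_condition values in this trace; the
# yum arguments are assumptions.
- name: Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    list: updates   # assumed argument; "list" queries without installing
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - ansible_distribution_major_version | int < 8
```

On this host the distribution major version is 8 or later, so the version condition evaluates to False and the task is skipped, which is exactly what the `skipping: [managed_node3]` result below records.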
15330 1726882261.49014: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15330 1726882261.49080: in run() - task 12673a56-9f93-e4fe-1358-00000000001a 15330 1726882261.49096: variable 'ansible_search_path' from source: unknown 15330 1726882261.49100: variable 'ansible_search_path' from source: unknown 15330 1726882261.49129: calling self._execute() 15330 1726882261.49200: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.49205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.49214: variable 'omit' from source: magic vars 15330 1726882261.49481: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.49495: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.49799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882261.51895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882261.52002: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882261.52007: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882261.52009: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882261.52047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882261.52167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.52218: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.52246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.52305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.52344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.52453: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.52483: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15330 1726882261.52486: when evaluation is False, skipping this task 15330 1726882261.52489: _execute() done 15330 1726882261.52491: dumping result to json 15330 1726882261.52495: done dumping result, returning 15330 1726882261.52505: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-00000000001a] 15330 1726882261.52509: sending task result for task 12673a56-9f93-e4fe-1358-00000000001a 15330 1726882261.52672: done sending task result for task 12673a56-9f93-e4fe-1358-00000000001a 15330 1726882261.52675: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15330 1726882261.52768: no more pending results, returning 
what we have 15330 1726882261.52771: results queue empty 15330 1726882261.52772: checking for any_errors_fatal 15330 1726882261.52776: done checking for any_errors_fatal 15330 1726882261.52777: checking for max_fail_percentage 15330 1726882261.52778: done checking for max_fail_percentage 15330 1726882261.52779: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.52780: done checking to see if all hosts have failed 15330 1726882261.52780: getting the remaining hosts for this loop 15330 1726882261.52782: done getting the remaining hosts for this loop 15330 1726882261.52785: getting the next task for host managed_node3 15330 1726882261.52790: done getting next task for host managed_node3 15330 1726882261.52796: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15330 1726882261.52798: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.52811: getting variables 15330 1726882261.52815: in VariableManager get_vars() 15330 1726882261.52849: Calling all_inventory to load vars for managed_node3 15330 1726882261.52852: Calling groups_inventory to load vars for managed_node3 15330 1726882261.52854: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.52862: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.52864: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.52867: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.53995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.55107: done with get_vars() 15330 1726882261.55125: done getting variables 15330 1726882261.55169: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:01 -0400 (0:00:00.066) 0:00:10.757 ****** 15330 1726882261.55195: entering _queue_task() for managed_node3/fail 15330 1726882261.55441: worker is 1 (out of 1 available) 15330 1726882261.55454: exiting _queue_task() for managed_node3/fail 15330 1726882261.55465: done queuing things up, now waiting for results queue to drain 15330 1726882261.55467: waiting for pending results... 
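[editor's note] The `fail` action module loaded here backs the consent task at `roles/network/tasks/main.yml:60`. A hedged sketch of its likely shape — the task name and the gating condition come from this trace, but the message body is invented for illustration:

```yaml
# Hypothetical sketch, not the role's real source; the msg text is a placeholder.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Managing wireless or team interfaces requires restarting NetworkManager  # placeholder
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since neither wireless nor team connections are defined in this run, the conditional is False and the trace below shows the task being skipped rather than failing the play.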
15330 1726882261.55635: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15330 1726882261.55705: in run() - task 12673a56-9f93-e4fe-1358-00000000001b 15330 1726882261.55712: variable 'ansible_search_path' from source: unknown 15330 1726882261.55715: variable 'ansible_search_path' from source: unknown 15330 1726882261.55744: calling self._execute() 15330 1726882261.55813: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.55818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.55824: variable 'omit' from source: magic vars 15330 1726882261.56090: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.56099: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.56180: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882261.56311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882261.57950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882261.57976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882261.58075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882261.58080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882261.58101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882261.58309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15330 1726882261.58313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.58315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.58333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.58389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.58412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.58443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.58597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.58600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.58603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.58605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.58692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.58697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.58700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.58708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.58999: variable 'network_connections' from source: play vars 15330 1726882261.59003: variable 'interface' from source: set_fact 15330 1726882261.59005: variable 'interface' from source: set_fact 15330 1726882261.59007: variable 'interface' from source: set_fact 15330 1726882261.59077: variable 'interface' from source: set_fact 15330 1726882261.59134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882261.59579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882261.59650: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882261.59658: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882261.59699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882261.59802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882261.59807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882261.59849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.59853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882261.59900: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882261.60132: variable 'network_connections' from source: play vars 15330 1726882261.60135: variable 'interface' from source: set_fact 15330 1726882261.60189: variable 'interface' from source: set_fact 15330 1726882261.60202: variable 'interface' from source: set_fact 15330 1726882261.60327: variable 'interface' from source: set_fact 15330 1726882261.60330: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882261.60333: when evaluation is False, skipping this task 15330 1726882261.60335: _execute() done 15330 1726882261.60337: dumping result to json 15330 1726882261.60339: done dumping result, returning 15330 1726882261.60341: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-00000000001b] 15330 1726882261.60352: sending task result for task 12673a56-9f93-e4fe-1358-00000000001b 15330 1726882261.60428: done sending task result for task 12673a56-9f93-e4fe-1358-00000000001b 15330 1726882261.60430: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15330 1726882261.60479: no more pending results, returning what we have 15330 1726882261.60483: results queue empty 15330 1726882261.60484: checking for any_errors_fatal 15330 1726882261.60490: done checking for any_errors_fatal 15330 1726882261.60491: checking for max_fail_percentage 15330 1726882261.60494: done checking for max_fail_percentage 15330 1726882261.60495: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.60496: done checking to see if all hosts have failed 15330 1726882261.60497: getting the remaining hosts for this loop 15330 1726882261.60498: done getting the remaining hosts for this loop 15330 1726882261.60501: getting the next task for host managed_node3 15330 1726882261.60507: done getting next task for host managed_node3 15330 1726882261.60511: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15330 1726882261.60512: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.60524: getting variables 15330 1726882261.60526: in VariableManager get_vars() 15330 1726882261.60568: Calling all_inventory to load vars for managed_node3 15330 1726882261.60571: Calling groups_inventory to load vars for managed_node3 15330 1726882261.60573: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.60583: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.60585: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.60590: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.61534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.62578: done with get_vars() 15330 1726882261.62598: done getting variables 15330 1726882261.62642: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:01 -0400 (0:00:00.074) 0:00:10.832 ****** 15330 1726882261.62663: entering _queue_task() for managed_node3/package 15330 1726882261.62918: worker is 1 (out of 1 available) 15330 1726882261.62933: exiting _queue_task() for managed_node3/package 15330 1726882261.62944: done queuing things up, now waiting for results queue to drain 15330 1726882261.62946: waiting for pending results... 
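[editor's note] The generic `package` action module loaded here drives the install step at `roles/network/tasks/main.yml:73`, which consumes the `network_packages` role default that the variable resolution below assembles (provider packages, wpa_supplicant, team/initscripts extras, and so on). A minimal sketch — only the task name and the `network_packages` variable appear in the trace; the `state` argument is an assumption, and the task's real `when` expression is truncated in this log, so it is omitted here:

```yaml
# Hypothetical sketch, not the actual role source.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present   # assumed; the role may pin versions or use a different state
```

The `package` action dispatches to the platform's native package manager (dnf on this Fedora-family host), which is why the surrounding trace resolves distribution facts before executing it.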
15330 1726882261.63106: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 15330 1726882261.63190: in run() - task 12673a56-9f93-e4fe-1358-00000000001c 15330 1726882261.63196: variable 'ansible_search_path' from source: unknown 15330 1726882261.63199: variable 'ansible_search_path' from source: unknown 15330 1726882261.63221: calling self._execute() 15330 1726882261.63302: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.63330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.63334: variable 'omit' from source: magic vars 15330 1726882261.63691: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.63712: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.63843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882261.64038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882261.64070: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882261.64098: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882261.64124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882261.64238: variable 'network_packages' from source: role '' defaults 15330 1726882261.64324: variable '__network_provider_setup' from source: role '' defaults 15330 1726882261.64330: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882261.64389: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882261.64399: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882261.64461: variable 
'__network_packages_default_nm' from source: role '' defaults 15330 1726882261.64646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882261.66333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882261.66385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882261.66417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882261.66441: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882261.66459: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882261.66523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.66541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.66559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.66658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.66662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 
1726882261.66769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.66772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.66774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.66786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.66804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.66950: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15330 1726882261.67299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.67302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.67307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.67309: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.67314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.67392: variable 'ansible_python' from source: facts 15330 1726882261.67422: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15330 1726882261.67481: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882261.67578: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882261.67714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.67798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.67802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.67846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.67862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.67898: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882261.67921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882261.67937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.67960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882261.67971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882261.68072: variable 'network_connections' from source: play vars 15330 1726882261.68076: variable 'interface' from source: set_fact 15330 1726882261.68183: variable 'interface' from source: set_fact 15330 1726882261.68189: variable 'interface' from source: set_fact 15330 1726882261.68283: variable 'interface' from source: set_fact 15330 1726882261.68498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882261.68502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882261.68504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882261.68506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882261.68509: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882261.68743: variable 'network_connections' from source: play vars 15330 1726882261.68752: variable 'interface' from source: set_fact 15330 1726882261.68853: variable 'interface' from source: set_fact 15330 1726882261.68867: variable 'interface' from source: set_fact 15330 1726882261.68962: variable 'interface' from source: set_fact 15330 1726882261.69247: variable '__network_packages_default_wireless' from source: role '' defaults 15330 1726882261.69339: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882261.69675: variable 'network_connections' from source: play vars 15330 1726882261.69682: variable 'interface' from source: set_fact 15330 1726882261.69898: variable 'interface' from source: set_fact 15330 1726882261.69901: variable 'interface' from source: set_fact 15330 1726882261.69903: variable 'interface' from source: set_fact 15330 1726882261.69906: variable '__network_packages_default_team' from source: role '' defaults 15330 1726882261.69954: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882261.70253: variable 'network_connections' from source: play vars 15330 1726882261.70263: variable 'interface' from source: set_fact 15330 1726882261.70328: variable 'interface' from source: set_fact 15330 1726882261.70340: variable 'interface' from source: set_fact 15330 1726882261.70405: variable 'interface' from source: set_fact 15330 1726882261.70468: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 
1726882261.70532: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882261.70543: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882261.70607: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882261.70838: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15330 1726882261.71170: variable 'network_connections' from source: play vars 15330 1726882261.71173: variable 'interface' from source: set_fact 15330 1726882261.71218: variable 'interface' from source: set_fact 15330 1726882261.71224: variable 'interface' from source: set_fact 15330 1726882261.71268: variable 'interface' from source: set_fact 15330 1726882261.71275: variable 'ansible_distribution' from source: facts 15330 1726882261.71278: variable '__network_rh_distros' from source: role '' defaults 15330 1726882261.71285: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.71310: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15330 1726882261.71417: variable 'ansible_distribution' from source: facts 15330 1726882261.71420: variable '__network_rh_distros' from source: role '' defaults 15330 1726882261.71425: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.71433: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15330 1726882261.71556: variable 'ansible_distribution' from source: facts 15330 1726882261.71559: variable '__network_rh_distros' from source: role '' defaults 15330 1726882261.71562: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.71631: variable 'network_provider' from source: set_fact 15330 1726882261.71634: variable 'ansible_facts' from source: unknown 15330 1726882261.72321: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 15330 1726882261.72324: when evaluation is False, skipping this task 15330 1726882261.72326: _execute() done 15330 1726882261.72329: dumping result to json 15330 1726882261.72331: done dumping result, returning 15330 1726882261.72333: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-e4fe-1358-00000000001c] 15330 1726882261.72335: sending task result for task 12673a56-9f93-e4fe-1358-00000000001c 15330 1726882261.72402: done sending task result for task 12673a56-9f93-e4fe-1358-00000000001c 15330 1726882261.72404: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15330 1726882261.72467: no more pending results, returning what we have 15330 1726882261.72471: results queue empty 15330 1726882261.72472: checking for any_errors_fatal 15330 1726882261.72477: done checking for any_errors_fatal 15330 1726882261.72478: checking for max_fail_percentage 15330 1726882261.72480: done checking for max_fail_percentage 15330 1726882261.72481: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.72481: done checking to see if all hosts have failed 15330 1726882261.72482: getting the remaining hosts for this loop 15330 1726882261.72483: done getting the remaining hosts for this loop 15330 1726882261.72489: getting the next task for host managed_node3 15330 1726882261.72498: done getting next task for host managed_node3 15330 1726882261.72502: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15330 1726882261.72504: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882261.72516: getting variables 15330 1726882261.72518: in VariableManager get_vars() 15330 1726882261.72553: Calling all_inventory to load vars for managed_node3 15330 1726882261.72556: Calling groups_inventory to load vars for managed_node3 15330 1726882261.72558: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.72571: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.72573: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.72575: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.73874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.75723: done with get_vars() 15330 1726882261.75748: done getting variables 15330 1726882261.75810: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:31:01 -0400 (0:00:00.131) 0:00:10.964 ****** 15330 1726882261.75841: entering _queue_task() for managed_node3/package 15330 1726882261.76557: worker is 1 (out of 1 available) 15330 1726882261.76572: exiting _queue_task() for managed_node3/package 15330 1726882261.76588: done queuing things up, now waiting for results queue to drain 15330 1726882261.76589: waiting for pending results... 
15330 1726882261.77380: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15330 1726882261.77800: in run() - task 12673a56-9f93-e4fe-1358-00000000001d 15330 1726882261.77804: variable 'ansible_search_path' from source: unknown 15330 1726882261.77807: variable 'ansible_search_path' from source: unknown 15330 1726882261.77810: calling self._execute() 15330 1726882261.77948: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.77960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.77973: variable 'omit' from source: magic vars 15330 1726882261.78718: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.78734: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.79098: variable 'network_state' from source: role '' defaults 15330 1726882261.79102: Evaluated conditional (network_state != {}): False 15330 1726882261.79104: when evaluation is False, skipping this task 15330 1726882261.79107: _execute() done 15330 1726882261.79109: dumping result to json 15330 1726882261.79111: done dumping result, returning 15330 1726882261.79113: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-e4fe-1358-00000000001d] 15330 1726882261.79116: sending task result for task 12673a56-9f93-e4fe-1358-00000000001d 15330 1726882261.79183: done sending task result for task 12673a56-9f93-e4fe-1358-00000000001d 15330 1726882261.79190: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882261.79242: no more pending results, returning what we have 15330 1726882261.79245: results queue empty 15330 1726882261.79246: checking 
for any_errors_fatal 15330 1726882261.79250: done checking for any_errors_fatal 15330 1726882261.79251: checking for max_fail_percentage 15330 1726882261.79252: done checking for max_fail_percentage 15330 1726882261.79253: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.79254: done checking to see if all hosts have failed 15330 1726882261.79255: getting the remaining hosts for this loop 15330 1726882261.79256: done getting the remaining hosts for this loop 15330 1726882261.79260: getting the next task for host managed_node3 15330 1726882261.79267: done getting next task for host managed_node3 15330 1726882261.79271: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15330 1726882261.79273: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.79289: getting variables 15330 1726882261.79291: in VariableManager get_vars() 15330 1726882261.79340: Calling all_inventory to load vars for managed_node3 15330 1726882261.79342: Calling groups_inventory to load vars for managed_node3 15330 1726882261.79345: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.79359: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.79362: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.79365: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.89043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882261.92159: done with get_vars() 15330 1726882261.92189: done getting variables 15330 1726882261.92237: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:31:01 -0400 (0:00:00.164) 0:00:11.128 ****** 15330 1726882261.92262: entering _queue_task() for managed_node3/package 15330 1726882261.92990: worker is 1 (out of 1 available) 15330 1726882261.93005: exiting _queue_task() for managed_node3/package 15330 1726882261.93017: done queuing things up, now waiting for results queue to drain 15330 1726882261.93019: waiting for pending results... 
15330 1726882261.93709: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15330 1726882261.93735: in run() - task 12673a56-9f93-e4fe-1358-00000000001e 15330 1726882261.93757: variable 'ansible_search_path' from source: unknown 15330 1726882261.93807: variable 'ansible_search_path' from source: unknown 15330 1726882261.93901: calling self._execute() 15330 1726882261.94120: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882261.94135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882261.94149: variable 'omit' from source: magic vars 15330 1726882261.95079: variable 'ansible_distribution_major_version' from source: facts 15330 1726882261.95102: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882261.95338: variable 'network_state' from source: role '' defaults 15330 1726882261.95378: Evaluated conditional (network_state != {}): False 15330 1726882261.95385: when evaluation is False, skipping this task 15330 1726882261.95409: _execute() done 15330 1726882261.95416: dumping result to json 15330 1726882261.95436: done dumping result, returning 15330 1726882261.95460: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-e4fe-1358-00000000001e] 15330 1726882261.95698: sending task result for task 12673a56-9f93-e4fe-1358-00000000001e 15330 1726882261.96203: done sending task result for task 12673a56-9f93-e4fe-1358-00000000001e 15330 1726882261.96208: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882261.96255: no more pending results, returning what we have 15330 1726882261.96260: results queue empty 15330 1726882261.96260: checking for 
any_errors_fatal 15330 1726882261.96268: done checking for any_errors_fatal 15330 1726882261.96268: checking for max_fail_percentage 15330 1726882261.96270: done checking for max_fail_percentage 15330 1726882261.96271: checking to see if all hosts have failed and the running result is not ok 15330 1726882261.96272: done checking to see if all hosts have failed 15330 1726882261.96273: getting the remaining hosts for this loop 15330 1726882261.96274: done getting the remaining hosts for this loop 15330 1726882261.96277: getting the next task for host managed_node3 15330 1726882261.96283: done getting next task for host managed_node3 15330 1726882261.96287: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15330 1726882261.96289: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882261.96305: getting variables 15330 1726882261.96307: in VariableManager get_vars() 15330 1726882261.96344: Calling all_inventory to load vars for managed_node3 15330 1726882261.96346: Calling groups_inventory to load vars for managed_node3 15330 1726882261.96348: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882261.96358: Calling all_plugins_play to load vars for managed_node3 15330 1726882261.96361: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882261.96363: Calling groups_plugins_play to load vars for managed_node3 15330 1726882261.99143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882262.02345: done with get_vars() 15330 1726882262.02375: done getting variables 15330 1726882262.02479: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:31:02 -0400 (0:00:00.104) 0:00:11.233 ****** 15330 1726882262.02717: entering _queue_task() for managed_node3/service 15330 1726882262.02719: Creating lock for service 15330 1726882262.03256: worker is 1 (out of 1 available) 15330 1726882262.03270: exiting _queue_task() for managed_node3/service 15330 1726882262.03281: done queuing things up, now waiting for results queue to drain 15330 1726882262.03282: waiting for pending results... 
15330 1726882262.03831: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15330 1726882262.04000: in run() - task 12673a56-9f93-e4fe-1358-00000000001f 15330 1726882262.04148: variable 'ansible_search_path' from source: unknown 15330 1726882262.04155: variable 'ansible_search_path' from source: unknown 15330 1726882262.04273: calling self._execute() 15330 1726882262.04483: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882262.04561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882262.04564: variable 'omit' from source: magic vars 15330 1726882262.05273: variable 'ansible_distribution_major_version' from source: facts 15330 1726882262.05358: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882262.05626: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882262.06202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882262.09081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882262.09166: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882262.09208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882262.09254: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882262.09287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882262.09374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15330 1726882262.09409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.09440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.09494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.09515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.09564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882262.09599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.09629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.09670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.09697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.09742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882262.09770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.09838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.09928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.09948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.10400: variable 'network_connections' from source: play vars 15330 1726882262.10403: variable 'interface' from source: set_fact 15330 1726882262.10405: variable 'interface' from source: set_fact 15330 1726882262.10434: variable 'interface' from source: set_fact 15330 1726882262.10496: variable 'interface' from source: set_fact 15330 1726882262.10572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882262.10745: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882262.10784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882262.10840: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882262.10874: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882262.10924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882262.10956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882262.10984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.11016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882262.11086: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882262.11327: variable 'network_connections' from source: play vars 15330 1726882262.11502: variable 'interface' from source: set_fact 15330 1726882262.11506: variable 'interface' from source: set_fact 15330 1726882262.11509: variable 'interface' from source: set_fact 15330 1726882262.11567: variable 'interface' from source: set_fact 15330 1726882262.11645: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882262.11651: when evaluation is False, skipping this task 15330 1726882262.11661: _execute() done 15330 1726882262.11703: dumping result to json 15330 1726882262.11716: done dumping result, returning 15330 1726882262.11726: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-00000000001f] 15330 1726882262.11800: sending task result for task 12673a56-9f93-e4fe-1358-00000000001f 15330 1726882262.11874: done sending task result for task 12673a56-9f93-e4fe-1358-00000000001f 15330 1726882262.11877: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15330 1726882262.11976: no more pending results, returning what we have 15330 1726882262.11980: results queue empty 15330 1726882262.11981: checking for any_errors_fatal 15330 1726882262.11989: done checking for any_errors_fatal 15330 1726882262.11990: checking for max_fail_percentage 15330 1726882262.11992: done checking for max_fail_percentage 15330 1726882262.11992: checking to see if all hosts have failed and the running result is not ok 15330 1726882262.11995: done checking to see if all hosts have failed 15330 1726882262.11995: getting the remaining hosts for this loop 15330 1726882262.11997: done getting the remaining hosts for this loop 15330 1726882262.12001: getting the next task for host managed_node3 15330 1726882262.12007: done getting next task for host managed_node3 15330 1726882262.12011: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15330 1726882262.12013: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882262.12027: getting variables 15330 1726882262.12029: in VariableManager get_vars() 15330 1726882262.12067: Calling all_inventory to load vars for managed_node3 15330 1726882262.12069: Calling groups_inventory to load vars for managed_node3 15330 1726882262.12071: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882262.12081: Calling all_plugins_play to load vars for managed_node3 15330 1726882262.12084: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882262.12086: Calling groups_plugins_play to load vars for managed_node3 15330 1726882262.13518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882262.16304: done with get_vars() 15330 1726882262.16333: done getting variables 15330 1726882262.16403: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:31:02 -0400 (0:00:00.137) 0:00:11.370 ****** 15330 1726882262.16436: entering _queue_task() for managed_node3/service 15330 1726882262.16911: worker is 1 (out of 1 available) 15330 1726882262.16923: exiting _queue_task() for managed_node3/service 15330 1726882262.16933: done queuing things up, now waiting for results queue to drain 15330 1726882262.16934: waiting for pending results... 
15330 1726882262.17171: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15330 1726882262.17270: in run() - task 12673a56-9f93-e4fe-1358-000000000020 15330 1726882262.17273: variable 'ansible_search_path' from source: unknown 15330 1726882262.17276: variable 'ansible_search_path' from source: unknown 15330 1726882262.17311: calling self._execute() 15330 1726882262.17452: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882262.17456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882262.17458: variable 'omit' from source: magic vars 15330 1726882262.17856: variable 'ansible_distribution_major_version' from source: facts 15330 1726882262.17872: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882262.18053: variable 'network_provider' from source: set_fact 15330 1726882262.18063: variable 'network_state' from source: role '' defaults 15330 1726882262.18075: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15330 1726882262.18100: variable 'omit' from source: magic vars 15330 1726882262.18134: variable 'omit' from source: magic vars 15330 1726882262.18200: variable 'network_service_name' from source: role '' defaults 15330 1726882262.18255: variable 'network_service_name' from source: role '' defaults 15330 1726882262.18375: variable '__network_provider_setup' from source: role '' defaults 15330 1726882262.18385: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882262.18461: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882262.18474: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882262.18549: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882262.18856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15330 1726882262.20997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882262.21092: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882262.21143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882262.21190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882262.21257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882262.21319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882262.21355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.21473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.21476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.21478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.21512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15330 1726882262.21539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.21566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.21624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.21645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.21898: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15330 1726882262.22036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882262.22132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.22135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.22148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.22168: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.22272: variable 'ansible_python' from source: facts 15330 1726882262.22304: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15330 1726882262.22401: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882262.22492: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882262.22631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882262.22679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.22707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.22786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.22795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.22910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882262.22922: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882262.22924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.22943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882262.22962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882262.23115: variable 'network_connections' from source: play vars 15330 1726882262.23133: variable 'interface' from source: set_fact 15330 1726882262.23216: variable 'interface' from source: set_fact 15330 1726882262.23257: variable 'interface' from source: set_fact 15330 1726882262.23320: variable 'interface' from source: set_fact 15330 1726882262.23449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882262.23677: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882262.23804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882262.23807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882262.23846: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882262.23923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882262.23958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882262.24016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882262.24062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882262.24141: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882262.24439: variable 'network_connections' from source: play vars 15330 1726882262.24450: variable 'interface' from source: set_fact 15330 1726882262.24535: variable 'interface' from source: set_fact 15330 1726882262.24549: variable 'interface' from source: set_fact 15330 1726882262.24683: variable 'interface' from source: set_fact 15330 1726882262.24698: variable '__network_packages_default_wireless' from source: role '' defaults 15330 1726882262.24778: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882262.25119: variable 'network_connections' from source: play vars 15330 1726882262.25198: variable 'interface' from source: set_fact 15330 1726882262.25212: variable 'interface' from source: set_fact 15330 1726882262.25229: variable 'interface' from source: set_fact 15330 1726882262.25308: variable 'interface' from source: set_fact 15330 1726882262.25348: variable '__network_packages_default_team' from source: role '' defaults 15330 1726882262.25431: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882262.25750: variable 
'network_connections' from source: play vars 15330 1726882262.25760: variable 'interface' from source: set_fact 15330 1726882262.25847: variable 'interface' from source: set_fact 15330 1726882262.25890: variable 'interface' from source: set_fact 15330 1726882262.25938: variable 'interface' from source: set_fact 15330 1726882262.26009: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882262.26070: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882262.26109: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882262.26153: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882262.26382: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15330 1726882262.27054: variable 'network_connections' from source: play vars 15330 1726882262.27198: variable 'interface' from source: set_fact 15330 1726882262.27201: variable 'interface' from source: set_fact 15330 1726882262.27203: variable 'interface' from source: set_fact 15330 1726882262.27205: variable 'interface' from source: set_fact 15330 1726882262.27207: variable 'ansible_distribution' from source: facts 15330 1726882262.27209: variable '__network_rh_distros' from source: role '' defaults 15330 1726882262.27210: variable 'ansible_distribution_major_version' from source: facts 15330 1726882262.27245: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15330 1726882262.27440: variable 'ansible_distribution' from source: facts 15330 1726882262.27450: variable '__network_rh_distros' from source: role '' defaults 15330 1726882262.27460: variable 'ansible_distribution_major_version' from source: facts 15330 1726882262.27474: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15330 1726882262.27640: variable 'ansible_distribution' from source: 
facts 15330 1726882262.27654: variable '__network_rh_distros' from source: role '' defaults 15330 1726882262.27662: variable 'ansible_distribution_major_version' from source: facts 15330 1726882262.27701: variable 'network_provider' from source: set_fact 15330 1726882262.27727: variable 'omit' from source: magic vars 15330 1726882262.27763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882262.27799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882262.27871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882262.27875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882262.27878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882262.27901: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882262.27908: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882262.27914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882262.28012: Set connection var ansible_pipelining to False 15330 1726882262.28029: Set connection var ansible_timeout to 10 15330 1726882262.28035: Set connection var ansible_connection to ssh 15330 1726882262.28039: Set connection var ansible_shell_type to sh 15330 1726882262.28046: Set connection var ansible_shell_executable to /bin/sh 15330 1726882262.28053: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882262.28089: variable 'ansible_shell_executable' from source: unknown 15330 1726882262.28092: variable 'ansible_connection' from source: unknown 15330 1726882262.28095: variable 'ansible_module_compression' from source: unknown 15330 1726882262.28200: 
variable 'ansible_shell_type' from source: unknown 15330 1726882262.28204: variable 'ansible_shell_executable' from source: unknown 15330 1726882262.28206: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882262.28214: variable 'ansible_pipelining' from source: unknown 15330 1726882262.28216: variable 'ansible_timeout' from source: unknown 15330 1726882262.28219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882262.28255: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882262.28272: variable 'omit' from source: magic vars 15330 1726882262.28285: starting attempt loop 15330 1726882262.28298: running the handler 15330 1726882262.28376: variable 'ansible_facts' from source: unknown 15330 1726882262.29114: _low_level_execute_command(): starting 15330 1726882262.29124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882262.29918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882262.29950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882262.29966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882262.29991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882262.30071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882262.31807: stdout chunk (state=3): >>>/root <<< 15330 1726882262.31931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882262.31939: stdout chunk (state=3): >>><<< 15330 1726882262.31947: stderr chunk (state=3): >>><<< 15330 1726882262.31963: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882262.31975: _low_level_execute_command(): starting 15330 1726882262.32003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368 `" && echo ansible-tmp-1726882262.3196743-15940-136813512390368="` echo /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368 `" ) && sleep 0' 15330 1726882262.32386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882262.32420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882262.32424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882262.32426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882262.32430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882262.32432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882262.32481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882262.32489: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 15330 1726882262.32491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882262.32533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882262.34451: stdout chunk (state=3): >>>ansible-tmp-1726882262.3196743-15940-136813512390368=/root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368 <<< 15330 1726882262.34545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882262.34598: stderr chunk (state=3): >>><<< 15330 1726882262.34602: stdout chunk (state=3): >>><<< 15330 1726882262.34619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882262.3196743-15940-136813512390368=/root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15330 1726882262.34757: variable 'ansible_module_compression' from source: unknown 15330 1726882262.34762: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15330 1726882262.34765: ANSIBALLZ: Acquiring lock 15330 1726882262.34767: ANSIBALLZ: Lock acquired: 140238209361168 15330 1726882262.34769: ANSIBALLZ: Creating module 15330 1726882262.71030: ANSIBALLZ: Writing module into payload 15330 1726882262.71190: ANSIBALLZ: Writing module 15330 1726882262.71226: ANSIBALLZ: Renaming module 15330 1726882262.71233: ANSIBALLZ: Done creating module 15330 1726882262.71262: variable 'ansible_facts' from source: unknown 15330 1726882262.71480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py 15330 1726882262.71550: Sending initial data 15330 1726882262.71553: Sent initial data (156 bytes) 15330 1726882262.72309: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882262.72320: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 15330 1726882262.72345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882262.72421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882262.74079: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15330 1726882262.74123: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15330 1726882262.74128: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882262.74175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882262.74237: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp6gg25h1m /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py <<< 15330 1726882262.74244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py" <<< 15330 1726882262.74274: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp6gg25h1m" to remote "/root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py" <<< 15330 1726882262.76471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882262.76475: stdout chunk (state=3): >>><<< 15330 1726882262.76477: stderr chunk (state=3): >>><<< 15330 1726882262.76479: done transferring module to remote 15330 1726882262.76481: _low_level_execute_command(): starting 15330 1726882262.76483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/ /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py && sleep 0' 15330 1726882262.78661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882262.78799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882262.78832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882262.79009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882262.79119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882262.80882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882262.80970: stderr chunk (state=3): >>><<< 15330 1726882262.80978: stdout chunk (state=3): >>><<< 15330 1726882262.81101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882262.81105: _low_level_execute_command(): starting 15330 1726882262.81110: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/AnsiballZ_systemd.py && sleep 0' 15330 1726882262.81725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882262.81735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882262.81746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882262.81765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882262.81806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882262.81861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882262.81876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882262.81888: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882262.81970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882263.10649: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10416128", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318468608", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1175910000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 15330 1726882263.10655: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": 
"cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", 
"FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", 
"ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15330 1726882263.12309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882263.12500: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 15330 1726882263.12504: stdout chunk (state=3): >>><<< 15330 1726882263.12507: stderr chunk (state=3): >>><<< 15330 1726882263.12510: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10416128", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318468608", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1175910000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", 
"StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", 
"InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882263.12687: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882263.13000: _low_level_execute_command(): starting 15330 1726882263.13007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882262.3196743-15940-136813512390368/ > /dev/null 2>&1 && sleep 0' 15330 1726882263.14103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882263.14106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882263.14223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882263.14229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882263.14242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882263.14247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882263.14253: stderr chunk (state=3): >>>debug2: match found <<< 15330 1726882263.14265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882263.14418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882263.14427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882263.14511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882263.14645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882263.16449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882263.16453: stdout chunk (state=3): >>><<< 15330 1726882263.16456: stderr chunk (state=3): >>><<< 15330 1726882263.16472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882263.16483: handler run complete 15330 1726882263.16701: attempt loop complete, returning result 15330 1726882263.16704: _execute() done 15330 1726882263.16705: dumping result to json 15330 1726882263.16707: done dumping result, returning 15330 1726882263.16709: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-e4fe-1358-000000000020] 15330 1726882263.16712: sending task result for task 12673a56-9f93-e4fe-1358-000000000020 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882263.17446: no more pending results, returning what we have 15330 1726882263.17449: results queue empty 15330 1726882263.17450: checking for any_errors_fatal 15330 1726882263.17457: done checking for any_errors_fatal 15330 1726882263.17458: checking for max_fail_percentage 15330 1726882263.17460: done checking for 
max_fail_percentage 15330 1726882263.17461: checking to see if all hosts have failed and the running result is not ok 15330 1726882263.17462: done checking to see if all hosts have failed 15330 1726882263.17462: getting the remaining hosts for this loop 15330 1726882263.17463: done getting the remaining hosts for this loop 15330 1726882263.17467: getting the next task for host managed_node3 15330 1726882263.17473: done getting next task for host managed_node3 15330 1726882263.17477: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15330 1726882263.17479: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882263.17490: getting variables 15330 1726882263.17492: in VariableManager get_vars() 15330 1726882263.17527: Calling all_inventory to load vars for managed_node3 15330 1726882263.17529: Calling groups_inventory to load vars for managed_node3 15330 1726882263.17532: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882263.17541: Calling all_plugins_play to load vars for managed_node3 15330 1726882263.17544: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882263.17546: Calling groups_plugins_play to load vars for managed_node3 15330 1726882263.18401: done sending task result for task 12673a56-9f93-e4fe-1358-000000000020 15330 1726882263.18404: WORKER PROCESS EXITING 15330 1726882263.20491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882263.22266: done with get_vars() 15330 1726882263.22294: done getting variables 15330 1726882263.22361: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:03 -0400 (0:00:01.059) 0:00:12.429 ****** 15330 1726882263.22397: entering _queue_task() for managed_node3/service 15330 1726882263.22749: worker is 1 (out of 1 available) 15330 1726882263.22762: exiting _queue_task() for managed_node3/service 15330 1726882263.22777: done queuing things up, now waiting for results queue to drain 15330 1726882263.22779: waiting for pending results... 15330 1726882263.23060: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15330 1726882263.23179: in run() - task 12673a56-9f93-e4fe-1358-000000000021 15330 1726882263.23208: variable 'ansible_search_path' from source: unknown 15330 1726882263.23225: variable 'ansible_search_path' from source: unknown 15330 1726882263.23265: calling self._execute() 15330 1726882263.23362: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882263.23374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882263.23391: variable 'omit' from source: magic vars 15330 1726882263.23799: variable 'ansible_distribution_major_version' from source: facts 15330 1726882263.23819: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882263.23939: variable 'network_provider' from source: set_fact 15330 1726882263.23951: Evaluated conditional (network_provider == "nm"): True 15330 1726882263.24053: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882263.24152: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 15330 1726882263.24332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882263.26468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882263.26536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882263.26599: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882263.26624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882263.26656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882263.26798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882263.26803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882263.26836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882263.26879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882263.26905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882263.26954: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882263.27010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882263.27017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882263.27060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882263.27079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882263.27227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882263.27231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882263.27233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882263.27235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 
1726882263.27246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882263.27400: variable 'network_connections' from source: play vars 15330 1726882263.27416: variable 'interface' from source: set_fact 15330 1726882263.27501: variable 'interface' from source: set_fact 15330 1726882263.27514: variable 'interface' from source: set_fact 15330 1726882263.27584: variable 'interface' from source: set_fact 15330 1726882263.27666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882263.27836: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882263.27882: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882263.27917: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882263.27950: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882263.28007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882263.28034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882263.28062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882263.28097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882263.28148: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882263.28426: variable 'network_connections' from source: play vars 15330 1726882263.28429: variable 'interface' from source: set_fact 15330 1726882263.28484: variable 'interface' from source: set_fact 15330 1726882263.28535: variable 'interface' from source: set_fact 15330 1726882263.28563: variable 'interface' from source: set_fact 15330 1726882263.28609: Evaluated conditional (__network_wpa_supplicant_required): False 15330 1726882263.28616: when evaluation is False, skipping this task 15330 1726882263.28623: _execute() done 15330 1726882263.28646: dumping result to json 15330 1726882263.28697: done dumping result, returning 15330 1726882263.28701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-e4fe-1358-000000000021] 15330 1726882263.28703: sending task result for task 12673a56-9f93-e4fe-1358-000000000021 15330 1726882263.29030: done sending task result for task 12673a56-9f93-e4fe-1358-000000000021 15330 1726882263.29034: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15330 1726882263.29076: no more pending results, returning what we have 15330 1726882263.29079: results queue empty 15330 1726882263.29080: checking for any_errors_fatal 15330 1726882263.29101: done checking for any_errors_fatal 15330 1726882263.29102: checking for max_fail_percentage 15330 1726882263.29104: done checking for max_fail_percentage 15330 1726882263.29105: checking to see if all hosts have failed and the running result is not ok 15330 1726882263.29106: done checking to see if all hosts have failed 15330 1726882263.29107: getting the remaining hosts for 
this loop 15330 1726882263.29108: done getting the remaining hosts for this loop 15330 1726882263.29112: getting the next task for host managed_node3 15330 1726882263.29117: done getting next task for host managed_node3 15330 1726882263.29121: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15330 1726882263.29123: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882263.29136: getting variables 15330 1726882263.29138: in VariableManager get_vars() 15330 1726882263.29180: Calling all_inventory to load vars for managed_node3 15330 1726882263.29183: Calling groups_inventory to load vars for managed_node3 15330 1726882263.29185: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882263.29196: Calling all_plugins_play to load vars for managed_node3 15330 1726882263.29198: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882263.29201: Calling groups_plugins_play to load vars for managed_node3 15330 1726882263.30656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882263.32945: done with get_vars() 15330 1726882263.32976: done getting variables 15330 1726882263.33038: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 
21:31:03 -0400 (0:00:00.106) 0:00:12.536 ****** 15330 1726882263.33073: entering _queue_task() for managed_node3/service 15330 1726882263.33620: worker is 1 (out of 1 available) 15330 1726882263.33630: exiting _queue_task() for managed_node3/service 15330 1726882263.33641: done queuing things up, now waiting for results queue to drain 15330 1726882263.33642: waiting for pending results... 15330 1726882263.33735: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 15330 1726882263.33921: in run() - task 12673a56-9f93-e4fe-1358-000000000022 15330 1726882263.33942: variable 'ansible_search_path' from source: unknown 15330 1726882263.33950: variable 'ansible_search_path' from source: unknown 15330 1726882263.33995: calling self._execute() 15330 1726882263.34087: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882263.34102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882263.34116: variable 'omit' from source: magic vars 15330 1726882263.34867: variable 'ansible_distribution_major_version' from source: facts 15330 1726882263.34912: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882263.35198: variable 'network_provider' from source: set_fact 15330 1726882263.35202: Evaluated conditional (network_provider == "initscripts"): False 15330 1726882263.35204: when evaluation is False, skipping this task 15330 1726882263.35207: _execute() done 15330 1726882263.35209: dumping result to json 15330 1726882263.35213: done dumping result, returning 15330 1726882263.35391: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-e4fe-1358-000000000022] 15330 1726882263.35397: sending task result for task 12673a56-9f93-e4fe-1358-000000000022 15330 1726882263.35464: done sending task result for task 12673a56-9f93-e4fe-1358-000000000022 15330 1726882263.35467: 
WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882263.35540: no more pending results, returning what we have 15330 1726882263.35544: results queue empty 15330 1726882263.35545: checking for any_errors_fatal 15330 1726882263.35555: done checking for any_errors_fatal 15330 1726882263.35556: checking for max_fail_percentage 15330 1726882263.35558: done checking for max_fail_percentage 15330 1726882263.35559: checking to see if all hosts have failed and the running result is not ok 15330 1726882263.35559: done checking to see if all hosts have failed 15330 1726882263.35560: getting the remaining hosts for this loop 15330 1726882263.35561: done getting the remaining hosts for this loop 15330 1726882263.35566: getting the next task for host managed_node3 15330 1726882263.35573: done getting next task for host managed_node3 15330 1726882263.35578: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15330 1726882263.35581: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882263.35597: getting variables 15330 1726882263.35695: in VariableManager get_vars() 15330 1726882263.35743: Calling all_inventory to load vars for managed_node3 15330 1726882263.35746: Calling groups_inventory to load vars for managed_node3 15330 1726882263.35749: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882263.35761: Calling all_plugins_play to load vars for managed_node3 15330 1726882263.35764: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882263.35767: Calling groups_plugins_play to load vars for managed_node3 15330 1726882263.37424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882263.39055: done with get_vars() 15330 1726882263.39086: done getting variables 15330 1726882263.39146: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:03 -0400 (0:00:00.061) 0:00:12.597 ****** 15330 1726882263.39177: entering _queue_task() for managed_node3/copy 15330 1726882263.39606: worker is 1 (out of 1 available) 15330 1726882263.39620: exiting _queue_task() for managed_node3/copy 15330 1726882263.39631: done queuing things up, now waiting for results queue to drain 15330 1726882263.39633: waiting for pending results... 
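Each of the skipped tasks above follows the same control flow: the task's `when` clause is evaluated, and if it comes out false the executor short-circuits and returns a result carrying `false_condition` and `skip_reason`, exactly as the `skipping: [managed_node3] => {...}` output shows. A minimal sketch of that pattern (not Ansible's actual `TaskExecutor`; the condition here is a plain Python callable standing in for Jinja2 `when` evaluation, and `run_task`/`initscripts_only` are illustrative names of ours):

```python
# Sketch of the conditional-skip pattern visible in the log above.
# NOT Ansible's real implementation: the condition is a plain callable
# standing in for Jinja2 `when` evaluation.

def run_task(condition, task_vars, handler):
    """Evaluate the task's `when` clause; if false, skip with the same
    result shape the log shows instead of running the handler."""
    expr, passed = condition(task_vars)
    if not passed:
        # Mirrors: skipping: [managed_node3] => {...}
        return {
            "changed": False,
            "false_condition": expr,
            "skip_reason": "Conditional result was False",
        }
    return handler(task_vars)

def initscripts_only(task_vars):
    expr = 'network_provider == "initscripts"'
    return expr, task_vars.get("network_provider") == "initscripts"

# The log's provider was set via set_fact; "nm" here is an assumption.
result = run_task(
    initscripts_only,
    {"network_provider": "nm"},
    handler=lambda v: {"changed": True},
)
```

With a NetworkManager provider the handler never runs, which is why both the "Enable network service" and "Ensure initscripts network file dependency" tasks report `Conditional result was False`.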
15330 1726882263.39872: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15330 1726882263.39970: in run() - task 12673a56-9f93-e4fe-1358-000000000023 15330 1726882263.39974: variable 'ansible_search_path' from source: unknown 15330 1726882263.39976: variable 'ansible_search_path' from source: unknown 15330 1726882263.39998: calling self._execute() 15330 1726882263.40100: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882263.40112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882263.40130: variable 'omit' from source: magic vars 15330 1726882263.40519: variable 'ansible_distribution_major_version' from source: facts 15330 1726882263.40598: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882263.40659: variable 'network_provider' from source: set_fact 15330 1726882263.40670: Evaluated conditional (network_provider == "initscripts"): False 15330 1726882263.40677: when evaluation is False, skipping this task 15330 1726882263.40685: _execute() done 15330 1726882263.40694: dumping result to json 15330 1726882263.40703: done dumping result, returning 15330 1726882263.40722: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-e4fe-1358-000000000023] 15330 1726882263.40799: sending task result for task 12673a56-9f93-e4fe-1358-000000000023 15330 1726882263.40877: done sending task result for task 12673a56-9f93-e4fe-1358-000000000023 15330 1726882263.40880: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15330 1726882263.40928: no more pending results, returning what we have 15330 1726882263.40931: results queue empty 15330 1726882263.40933: checking for 
any_errors_fatal 15330 1726882263.40939: done checking for any_errors_fatal 15330 1726882263.40940: checking for max_fail_percentage 15330 1726882263.40942: done checking for max_fail_percentage 15330 1726882263.40943: checking to see if all hosts have failed and the running result is not ok 15330 1726882263.40944: done checking to see if all hosts have failed 15330 1726882263.40944: getting the remaining hosts for this loop 15330 1726882263.40946: done getting the remaining hosts for this loop 15330 1726882263.40950: getting the next task for host managed_node3 15330 1726882263.40956: done getting next task for host managed_node3 15330 1726882263.40960: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15330 1726882263.40962: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882263.40976: getting variables 15330 1726882263.40978: in VariableManager get_vars() 15330 1726882263.41018: Calling all_inventory to load vars for managed_node3 15330 1726882263.41021: Calling groups_inventory to load vars for managed_node3 15330 1726882263.41023: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882263.41036: Calling all_plugins_play to load vars for managed_node3 15330 1726882263.41039: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882263.41043: Calling groups_plugins_play to load vars for managed_node3 15330 1726882263.42594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882263.44298: done with get_vars() 15330 1726882263.44319: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:03 -0400 (0:00:00.052) 0:00:12.650 ****** 15330 1726882263.44405: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15330 1726882263.44407: Creating lock for fedora.linux_system_roles.network_connections 15330 1726882263.44748: worker is 1 (out of 1 available) 15330 1726882263.44760: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15330 1726882263.44772: done queuing things up, now waiting for results queue to drain 15330 1726882263.44773: waiting for pending results... 
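The task banners above each carry two durations: the previous task's elapsed time in parentheses and the cumulative playbook runtime (0:00:12.536, then +0:00:00.061 giving 0:00:12.597, then +0:00:00.052 giving roughly 0:00:12.650, with the last digit subject to rounding of the underlying float). A small sketch of that bookkeeping, assuming the `H:MM:SS.mmm` rendering; the helper names `fmt` and `banner` are ours, not Ansible's timer callbacks:

```python
# Sketch of the per-task / cumulative timing shown in the task banners,
# e.g. "(0:00:00.061)  0:00:12.597". The helpers are illustrative, not
# taken from Ansible's profile_tasks/timer callback plugins.

def fmt(seconds):
    """Render a duration in seconds as H:MM:SS.mmm, banner-style."""
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{int(h)}:{int(m):02d}:{s:06.3f}"

def banner(task_elapsed, cumulative_before):
    """Return the banner's timing suffix and the new cumulative total."""
    cumulative = cumulative_before + task_elapsed
    return f"({fmt(task_elapsed)})  {fmt(cumulative)}", cumulative

# Values taken from the log: 12.536 s cumulative, then a 0.061 s task.
line, total = banner(0.061, 12.536)
```

Summing the parenthesized per-task values this way reproduces the running total printed on each subsequent banner.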
15330 1726882263.45216: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15330 1726882263.45222: in run() - task 12673a56-9f93-e4fe-1358-000000000024 15330 1726882263.45225: variable 'ansible_search_path' from source: unknown 15330 1726882263.45228: variable 'ansible_search_path' from source: unknown 15330 1726882263.45230: calling self._execute() 15330 1726882263.45328: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882263.45343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882263.45436: variable 'omit' from source: magic vars 15330 1726882263.45740: variable 'ansible_distribution_major_version' from source: facts 15330 1726882263.45765: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882263.45782: variable 'omit' from source: magic vars 15330 1726882263.45827: variable 'omit' from source: magic vars 15330 1726882263.46000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882263.48139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882263.48218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882263.48263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882263.48310: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882263.48342: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882263.48433: variable 'network_provider' from source: set_fact 15330 1726882263.48574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882263.48702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882263.48705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882263.48707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882263.48709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882263.48781: variable 'omit' from source: magic vars 15330 1726882263.48905: variable 'omit' from source: magic vars 15330 1726882263.49017: variable 'network_connections' from source: play vars 15330 1726882263.49039: variable 'interface' from source: set_fact 15330 1726882263.49107: variable 'interface' from source: set_fact 15330 1726882263.49118: variable 'interface' from source: set_fact 15330 1726882263.49244: variable 'interface' from source: set_fact 15330 1726882263.49342: variable 'omit' from source: magic vars 15330 1726882263.49364: variable '__lsr_ansible_managed' from source: task vars 15330 1726882263.49461: variable '__lsr_ansible_managed' from source: task vars 15330 1726882263.49613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15330 1726882263.49840: Loaded config def from plugin (lookup/template) 15330 1726882263.49849: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15330 1726882263.49880: File lookup term: get_ansible_managed.j2 15330 1726882263.49896: variable 'ansible_search_path' from source: unknown 15330 1726882263.50006: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15330 1726882263.50010: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15330 1726882263.50013: variable 'ansible_search_path' from source: unknown 15330 1726882263.56875: variable 'ansible_managed' from source: unknown 15330 1726882263.57357: variable 'omit' from source: magic vars 15330 1726882263.57361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882263.57364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882263.57366: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882263.57406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882263.57421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882263.57452: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882263.57581: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882263.57680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882263.57901: Set connection var ansible_pipelining to False 15330 1726882263.57904: Set connection var ansible_timeout to 10 15330 1726882263.57907: Set connection var ansible_connection to ssh 15330 1726882263.57909: Set connection var ansible_shell_type to sh 15330 1726882263.57911: Set connection var ansible_shell_executable to /bin/sh 15330 1726882263.57913: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882263.57915: variable 'ansible_shell_executable' from source: unknown 15330 1726882263.57917: variable 'ansible_connection' from source: unknown 15330 1726882263.57919: variable 'ansible_module_compression' from source: unknown 15330 1726882263.57921: variable 'ansible_shell_type' from source: unknown 15330 1726882263.57923: variable 'ansible_shell_executable' from source: unknown 15330 1726882263.57925: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882263.57927: variable 'ansible_pipelining' from source: unknown 15330 1726882263.57928: variable 'ansible_timeout' from source: unknown 15330 1726882263.57930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882263.58106: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882263.58300: variable 'omit' from source: magic vars 15330 1726882263.58303: starting attempt loop 15330 1726882263.58306: running the handler 15330 1726882263.58308: _low_level_execute_command(): starting 15330 1726882263.58310: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882263.59511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882263.59613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882263.59654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882263.59760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882263.59763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882263.59925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882263.61609: stdout chunk (state=3): >>>/root <<< 15330 1726882263.61712: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882263.61919: stderr chunk (state=3): >>><<< 15330 1726882263.61922: stdout chunk (state=3): >>><<< 15330 1726882263.61925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882263.61928: _low_level_execute_command(): starting 15330 1726882263.61930: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517 `" && echo ansible-tmp-1726882263.6184819-15986-263700937339517="` echo /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517 `" ) && sleep 0' 15330 1726882263.63099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 15330 1726882263.63240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882263.63244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882263.63290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882263.63305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882263.63409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882263.63579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882263.65451: stdout chunk (state=3): >>>ansible-tmp-1726882263.6184819-15986-263700937339517=/root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517 <<< 15330 1726882263.65619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882263.65903: stderr chunk (state=3): >>><<< 15330 1726882263.65906: stdout chunk (state=3): >>><<< 15330 1726882263.65909: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882263.6184819-15986-263700937339517=/root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882263.65911: variable 'ansible_module_compression' from source: unknown 15330 1726882263.65913: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15330 1726882263.65915: ANSIBALLZ: Acquiring lock 15330 1726882263.65917: ANSIBALLZ: Lock acquired: 140238202174544 15330 1726882263.65919: ANSIBALLZ: Creating module 15330 1726882263.98156: ANSIBALLZ: Writing module into payload 15330 1726882263.98799: ANSIBALLZ: Writing module 15330 1726882263.98890: ANSIBALLZ: Renaming module 15330 1726882263.98998: ANSIBALLZ: Done creating module 15330 1726882263.99086: variable 'ansible_facts' from source: unknown 15330 1726882263.99248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py 15330 1726882263.99562: 
Sending initial data 15330 1726882263.99570: Sent initial data (168 bytes) 15330 1726882264.00980: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882264.01046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.01153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882264.01238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882264.01317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882264.02974: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882264.03005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882264.03127: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp16uayeja /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py <<< 15330 1726882264.03130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py" <<< 15330 1726882264.03210: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp16uayeja" to remote "/root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py" <<< 15330 1726882264.04933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882264.05001: stderr chunk (state=3): >>><<< 15330 1726882264.05005: stdout chunk (state=3): >>><<< 15330 1726882264.05026: done transferring module to remote 15330 1726882264.05105: _low_level_execute_command(): starting 15330 1726882264.05108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/ /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py && sleep 0' 15330 1726882264.06553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 15330 1726882264.06558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882264.06562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.06564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882264.06566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.07116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882264.07135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882264.08874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882264.08937: stderr chunk (state=3): >>><<< 15330 1726882264.08940: stdout chunk (state=3): >>><<< 15330 1726882264.09205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882264.09209: _low_level_execute_command(): starting 15330 1726882264.09211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/AnsiballZ_network_connections.py && sleep 0' 15330 1726882264.10441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882264.10661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.10706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882264.10709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882264.11042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882264.11094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882264.42130: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15330 1726882264.43828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882264.43848: stderr chunk (state=3): >>><<< 15330 1726882264.43864: stdout chunk (state=3): >>><<< 15330 1726882264.43967: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882264.44044: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882264.44048: _low_level_execute_command(): starting 15330 1726882264.44050: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882263.6184819-15986-263700937339517/ > /dev/null 2>&1 && sleep 0' 15330 1726882264.44918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882264.44953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.44970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882264.45024: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.45206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882264.45259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882264.45311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882264.47187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882264.47191: stdout chunk (state=3): >>><<< 15330 1726882264.47195: stderr chunk (state=3): >>><<< 15330 1726882264.47429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15330 1726882264.47432: handler run complete
15330 1726882264.47434: attempt loop complete, returning result
15330 1726882264.47436: _execute() done
15330 1726882264.47438: dumping result to json
15330 1726882264.47440: done dumping result, returning
15330 1726882264.47442: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-e4fe-1358-000000000024]
15330 1726882264.47444: sending task result for task 12673a56-9f93-e4fe-1358-000000000024
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "interface_name": "LSR-TST-br31",
                    "ip": {
                        "auto6": true,
                        "dhcp4": false
                    },
                    "name": "LSR-TST-br31",
                    "state": "up",
                    "type": "bridge"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e
[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e (not-active)

15330 1726882264.48001: no more pending results, returning what we have
15330 1726882264.48005: results
queue empty 15330 1726882264.48007: checking for any_errors_fatal 15330 1726882264.48013: done checking for any_errors_fatal 15330 1726882264.48014: checking for max_fail_percentage 15330 1726882264.48016: done checking for max_fail_percentage 15330 1726882264.48017: checking to see if all hosts have failed and the running result is not ok 15330 1726882264.48018: done checking to see if all hosts have failed 15330 1726882264.48018: getting the remaining hosts for this loop 15330 1726882264.48020: done getting the remaining hosts for this loop 15330 1726882264.48024: getting the next task for host managed_node3 15330 1726882264.48030: done getting next task for host managed_node3 15330 1726882264.48035: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15330 1726882264.48036: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15330 1726882264.48046: getting variables
15330 1726882264.48049: in VariableManager get_vars()
15330 1726882264.48085: Calling all_inventory to load vars for managed_node3
15330 1726882264.48087: Calling groups_inventory to load vars for managed_node3
15330 1726882264.48089: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882264.48321: Calling all_plugins_play to load vars for managed_node3
15330 1726882264.48325: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882264.48330: done sending task result for task 12673a56-9f93-e4fe-1358-000000000024
15330 1726882264.48332: WORKER PROCESS EXITING
15330 1726882264.48336: Calling groups_plugins_play to load vars for managed_node3
15330 1726882264.51611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882264.54658: done with get_vars()
15330 1726882264.54688: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 21:31:04 -0400 (0:00:01.103) 0:00:13.753 ******
15330 1726882264.54784: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state
15330 1726882264.54786: Creating lock for fedora.linux_system_roles.network_state
15330 1726882264.55187: worker is 1 (out of 1 available)
15330 1726882264.55265: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state
15330 1726882264.55277: done queuing things up, now waiting for results queue to drain
15330 1726882264.55278: waiting for pending results...
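For context on the results recorded in this log: the module_args logged for the "Configure networking connection profiles" task (provider `nm`, one bridge connection `LSR-TST-br31` with `dhcp4: false` / `auto6: true`), together with the "Configure networking state" task being skipped because `network_state != {}` evaluated False, would be produced by a role invocation roughly like the sketch below. The connection values are copied from the log itself; the surrounding play structure is an assumed reconstruction, not taken from the actual playbook.

```yaml
# Hedged reconstruction of the play behind this log excerpt.
# Connection values come from the logged module_args; the play
# layout and host pattern are assumptions.
- hosts: managed_node3
  vars:
    network_connections:
      - name: LSR-TST-br31
        interface_name: LSR-TST-br31
        type: bridge
        state: up
        ip:
          dhcp4: false
          auto6: true
    # network_state is left at its role default of {}, which is why the
    # log shows "Evaluated conditional (network_state != {}): False" and
    # the "Configure networking state" task is skipped.
  roles:
    - fedora.linux_system_roles.network
```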
15330 1726882264.55506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state
15330 1726882264.55622: in run() - task 12673a56-9f93-e4fe-1358-000000000025
15330 1726882264.55645: variable 'ansible_search_path' from source: unknown
15330 1726882264.55654: variable 'ansible_search_path' from source: unknown
15330 1726882264.55711: calling self._execute()
15330 1726882264.55866: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882264.55971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882264.56237: variable 'omit' from source: magic vars
15330 1726882264.56848: variable 'ansible_distribution_major_version' from source: facts
15330 1726882264.56866: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882264.57117: variable 'network_state' from source: role '' defaults
15330 1726882264.57138: Evaluated conditional (network_state != {}): False
15330 1726882264.57210: when evaluation is False, skipping this task
15330 1726882264.57238: _execute() done
15330 1726882264.57241: dumping result to json
15330 1726882264.57244: done dumping result, returning
15330 1726882264.57249: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-e4fe-1358-000000000025]
15330 1726882264.57259: sending task result for task 12673a56-9f93-e4fe-1358-000000000025
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15330 1726882264.57575: no more pending results, returning what we have
15330 1726882264.57579: results queue empty
15330 1726882264.57580: checking for any_errors_fatal
15330 1726882264.57597: done checking for any_errors_fatal
15330 1726882264.57598: checking for max_fail_percentage
15330 1726882264.57600: done checking for max_fail_percentage
15330 1726882264.57601:
checking to see if all hosts have failed and the running result is not ok 15330 1726882264.57602: done checking to see if all hosts have failed 15330 1726882264.57602: getting the remaining hosts for this loop 15330 1726882264.57604: done getting the remaining hosts for this loop 15330 1726882264.57608: getting the next task for host managed_node3 15330 1726882264.57617: done getting next task for host managed_node3 15330 1726882264.57621: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15330 1726882264.57623: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882264.57638: getting variables 15330 1726882264.57640: in VariableManager get_vars() 15330 1726882264.57679: Calling all_inventory to load vars for managed_node3 15330 1726882264.57682: Calling groups_inventory to load vars for managed_node3 15330 1726882264.57684: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882264.57926: Calling all_plugins_play to load vars for managed_node3 15330 1726882264.57930: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882264.57936: done sending task result for task 12673a56-9f93-e4fe-1358-000000000025 15330 1726882264.57939: WORKER PROCESS EXITING 15330 1726882264.57943: Calling groups_plugins_play to load vars for managed_node3 15330 1726882264.59757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882264.62966: done with get_vars() 15330 1726882264.63000: done getting variables 15330 1726882264.63137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:04 -0400 (0:00:00.083) 0:00:13.837 ****** 15330 1726882264.63167: entering _queue_task() for managed_node3/debug 15330 1726882264.64059: worker is 1 (out of 1 available) 15330 1726882264.64070: exiting _queue_task() for managed_node3/debug 15330 1726882264.64082: done queuing things up, now waiting for results queue to drain 15330 1726882264.64083: waiting for pending results... 15330 1726882264.64564: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15330 1726882264.64905: in run() - task 12673a56-9f93-e4fe-1358-000000000026 15330 1726882264.64909: variable 'ansible_search_path' from source: unknown 15330 1726882264.64913: variable 'ansible_search_path' from source: unknown 15330 1726882264.64947: calling self._execute() 15330 1726882264.65230: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.65235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.65239: variable 'omit' from source: magic vars 15330 1726882264.66076: variable 'ansible_distribution_major_version' from source: facts 15330 1726882264.66095: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882264.66107: variable 'omit' from source: magic vars 15330 1726882264.66182: variable 'omit' from source: magic vars 15330 1726882264.66281: variable 'omit' from source: magic vars 15330 1726882264.66339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882264.66674: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882264.66677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882264.66680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882264.66710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882264.66808: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882264.66817: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.66825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.67130: Set connection var ansible_pipelining to False 15330 1726882264.67133: Set connection var ansible_timeout to 10 15330 1726882264.67136: Set connection var ansible_connection to ssh 15330 1726882264.67138: Set connection var ansible_shell_type to sh 15330 1726882264.67140: Set connection var ansible_shell_executable to /bin/sh 15330 1726882264.67142: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882264.67398: variable 'ansible_shell_executable' from source: unknown 15330 1726882264.67401: variable 'ansible_connection' from source: unknown 15330 1726882264.67403: variable 'ansible_module_compression' from source: unknown 15330 1726882264.67405: variable 'ansible_shell_type' from source: unknown 15330 1726882264.67407: variable 'ansible_shell_executable' from source: unknown 15330 1726882264.67409: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.67410: variable 'ansible_pipelining' from source: unknown 15330 1726882264.67412: variable 'ansible_timeout' from source: unknown 15330 1726882264.67414: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3'
15330 1726882264.67563: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15330 1726882264.67580: variable 'omit' from source: magic vars
15330 1726882264.67590: starting attempt loop
15330 1726882264.67621: running the handler
15330 1726882264.67780: variable '__network_connections_result' from source: set_fact
15330 1726882264.67851: handler run complete
15330 1726882264.67873: attempt loop complete, returning result
15330 1726882264.67879: _execute() done
15330 1726882264.67885: dumping result to json
15330 1726882264.67895: done dumping result, returning
15330 1726882264.67913: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-e4fe-1358-000000000026]
15330 1726882264.67924: sending task result for task 12673a56-9f93-e4fe-1358-000000000026
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e",
        "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e (not-active)"
    ]
}
15330 1726882264.68132: no more pending results, returning what we have
15330 1726882264.68136: results queue empty
15330 1726882264.68137: checking for any_errors_fatal
15330 1726882264.68144: done checking for any_errors_fatal
15330 1726882264.68145: checking for max_fail_percentage
15330 1726882264.68147: done checking for max_fail_percentage
15330 1726882264.68148: checking to see if all hosts have failed and the running result is not ok
15330
1726882264.68149: getting the remaining hosts for this loop 15330 1726882264.68151: done getting the remaining hosts for this loop 15330 1726882264.68198: getting the next task for host managed_node3 15330 1726882264.68322: done getting next task for host managed_node3 15330 1726882264.68327: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15330 1726882264.68329: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882264.68340: getting variables 15330 1726882264.68342: in VariableManager get_vars() 15330 1726882264.68383: Calling all_inventory to load vars for managed_node3 15330 1726882264.68386: Calling groups_inventory to load vars for managed_node3 15330 1726882264.68388: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882264.68435: Calling all_plugins_play to load vars for managed_node3 15330 1726882264.68438: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882264.68443: done sending task result for task 12673a56-9f93-e4fe-1358-000000000026 15330 1726882264.68446: WORKER PROCESS EXITING 15330 1726882264.68449: Calling groups_plugins_play to load vars for managed_node3 15330 1726882264.70933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882264.73844: done with get_vars() 15330 1726882264.73876: done getting variables 15330 1726882264.73984: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:31:04 -0400 (0:00:00.108) 0:00:13.946 ****** 15330 1726882264.74018: entering _queue_task() for managed_node3/debug 15330 1726882264.74597: worker is 1 (out of 1 available) 15330 1726882264.74609: exiting _queue_task() for managed_node3/debug 15330 1726882264.74624: done queuing things up, now waiting for results queue to drain 15330 1726882264.74626: waiting for pending results... 15330 1726882264.74901: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15330 1726882264.75164: in run() - task 12673a56-9f93-e4fe-1358-000000000027 15330 1726882264.75188: variable 'ansible_search_path' from source: unknown 15330 1726882264.75200: variable 'ansible_search_path' from source: unknown 15330 1726882264.75351: calling self._execute() 15330 1726882264.75365: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.75379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.75399: variable 'omit' from source: magic vars 15330 1726882264.75904: variable 'ansible_distribution_major_version' from source: facts 15330 1726882264.75907: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882264.75910: variable 'omit' from source: magic vars 15330 1726882264.75912: variable 'omit' from source: magic vars 15330 1726882264.75942: variable 'omit' from source: magic vars 15330 1726882264.75985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882264.76039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882264.76068: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882264.76090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882264.76109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882264.76160: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882264.76168: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.76199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.76291: Set connection var ansible_pipelining to False 15330 1726882264.76313: Set connection var ansible_timeout to 10 15330 1726882264.76321: Set connection var ansible_connection to ssh 15330 1726882264.76350: Set connection var ansible_shell_type to sh 15330 1726882264.76356: Set connection var ansible_shell_executable to /bin/sh 15330 1726882264.76361: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882264.76386: variable 'ansible_shell_executable' from source: unknown 15330 1726882264.76446: variable 'ansible_connection' from source: unknown 15330 1726882264.76450: variable 'ansible_module_compression' from source: unknown 15330 1726882264.76452: variable 'ansible_shell_type' from source: unknown 15330 1726882264.76462: variable 'ansible_shell_executable' from source: unknown 15330 1726882264.76465: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.76466: variable 'ansible_pipelining' from source: unknown 15330 1726882264.76468: variable 'ansible_timeout' from source: unknown 15330 1726882264.76470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.76883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882264.76887: variable 'omit' from source: magic vars 15330 1726882264.76891: starting attempt loop 15330 1726882264.76895: running the handler 15330 1726882264.76992: variable '__network_connections_result' from source: set_fact 15330 1726882264.77061: variable '__network_connections_result' from source: set_fact 15330 1726882264.77209: handler run complete 15330 1726882264.77250: attempt loop complete, returning result 15330 1726882264.77257: _execute() done 15330 1726882264.77263: dumping result to json 15330 1726882264.77299: done dumping result, returning 15330 1726882264.77302: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-e4fe-1358-000000000027] 15330 1726882264.77305: sending task result for task 12673a56-9f93-e4fe-1358-000000000027
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "interface_name": "LSR-TST-br31",
                        "ip": {
                            "auto6": true,
                            "dhcp4": false
                        },
                        "name": "LSR-TST-br31",
                        "state": "up",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e (not-active)\n",
        "stderr_lines": [
            "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e",
            "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, da01a2c2-cda1-473e-9566-db3ae75e453e (not-active)"
        ]
    }
}
15330 1726882264.77609: no more pending results, returning what we have 15330 1726882264.77614: results queue empty 15330 1726882264.77616: checking for any_errors_fatal 15330 1726882264.77624: done checking for any_errors_fatal 15330 1726882264.77625: checking for max_fail_percentage 15330 1726882264.77626: done checking for max_fail_percentage 15330 1726882264.77628: checking to see if all hosts have failed and the running result is not ok 15330 1726882264.77628: done checking to see if all hosts have failed 15330 1726882264.77629: getting the remaining hosts for this loop 15330 1726882264.77631: done getting the remaining hosts for this loop 15330 1726882264.77634: getting the next task for host managed_node3 15330 1726882264.77642: done getting next task for host managed_node3 15330 1726882264.77646: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15330 1726882264.77648: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15330 1726882264.77807: done sending task result for task 12673a56-9f93-e4fe-1358-000000000027 15330 1726882264.77810: WORKER PROCESS EXITING 15330 1726882264.77817: getting variables 15330 1726882264.77818: in VariableManager get_vars() 15330 1726882264.77853: Calling all_inventory to load vars for managed_node3 15330 1726882264.77855: Calling groups_inventory to load vars for managed_node3 15330 1726882264.77858: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882264.77867: Calling all_plugins_play to load vars for managed_node3 15330 1726882264.77870: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882264.77873: Calling groups_plugins_play to load vars for managed_node3 15330 1726882264.81592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882264.83345: done with get_vars() 15330 1726882264.83373: done getting variables 15330 1726882264.83454: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:31:04 -0400 (0:00:00.094) 0:00:14.040 ****** 15330 1726882264.83496: entering _queue_task() for managed_node3/debug 15330 1726882264.84082: worker is 1 (out of 1 available) 15330 1726882264.84182: exiting _queue_task() for managed_node3/debug 15330 1726882264.84196: done queuing things up, now waiting for results queue to drain 15330 1726882264.84197: waiting for pending results... 
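The `module_args` captured in the `__network_connections_result` output above correspond to a role invocation along these lines (a hedged reconstruction from the logged parameters only; the actual playbook is not part of this log, and host/variable layout is assumed):

```yaml
# Sketch, not the real test playbook: rebuilt from the logged
# module_args (name, interface_name, type, state, ip.dhcp4, ip.auto6).
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: LSR-TST-br31
            interface_name: LSR-TST-br31
            type: bridge
            state: up
            ip:
              dhcp4: false
              auto6: true
```

With this shape, the role's NetworkManager provider (`provider: nm`, the logged default) adds the bridge profile and brings it up, which matches the `[003] add connection` / `[004] up connection` stderr lines in the result.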
15330 1726882264.84511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15330 1726882264.84583: in run() - task 12673a56-9f93-e4fe-1358-000000000028 15330 1726882264.84609: variable 'ansible_search_path' from source: unknown 15330 1726882264.84613: variable 'ansible_search_path' from source: unknown 15330 1726882264.84663: calling self._execute() 15330 1726882264.84744: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.84748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.84754: variable 'omit' from source: magic vars 15330 1726882264.85135: variable 'ansible_distribution_major_version' from source: facts 15330 1726882264.85150: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882264.85247: variable 'network_state' from source: role '' defaults 15330 1726882264.85255: Evaluated conditional (network_state != {}): False 15330 1726882264.85258: when evaluation is False, skipping this task 15330 1726882264.85262: _execute() done 15330 1726882264.85265: dumping result to json 15330 1726882264.85267: done dumping result, returning 15330 1726882264.85274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-e4fe-1358-000000000028] 15330 1726882264.85288: sending task result for task 12673a56-9f93-e4fe-1358-000000000028 15330 1726882264.85368: done sending task result for task 12673a56-9f93-e4fe-1358-000000000028 15330 1726882264.85371: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
15330 1726882264.85442: no more pending results, returning what we have 15330 1726882264.85446: results queue empty 15330 1726882264.85447: checking for any_errors_fatal 15330 1726882264.85458: done checking for any_errors_fatal 15330 1726882264.85459: checking for
max_fail_percentage 15330 1726882264.85460: done checking for max_fail_percentage 15330 1726882264.85461: checking to see if all hosts have failed and the running result is not ok 15330 1726882264.85462: done checking to see if all hosts have failed 15330 1726882264.85463: getting the remaining hosts for this loop 15330 1726882264.85464: done getting the remaining hosts for this loop 15330 1726882264.85468: getting the next task for host managed_node3 15330 1726882264.85472: done getting next task for host managed_node3 15330 1726882264.85476: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15330 1726882264.85478: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882264.85491: getting variables 15330 1726882264.85492: in VariableManager get_vars() 15330 1726882264.85633: Calling all_inventory to load vars for managed_node3 15330 1726882264.85636: Calling groups_inventory to load vars for managed_node3 15330 1726882264.85647: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882264.85656: Calling all_plugins_play to load vars for managed_node3 15330 1726882264.85659: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882264.85667: Calling groups_plugins_play to load vars for managed_node3 15330 1726882264.86931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882264.88168: done with get_vars() 15330 1726882264.88184: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:31:04 -0400 
(0:00:00.047) 0:00:14.088 ****** 15330 1726882264.88253: entering _queue_task() for managed_node3/ping 15330 1726882264.88255: Creating lock for ping 15330 1726882264.88494: worker is 1 (out of 1 available) 15330 1726882264.88510: exiting _queue_task() for managed_node3/ping 15330 1726882264.88525: done queuing things up, now waiting for results queue to drain 15330 1726882264.88526: waiting for pending results... 15330 1726882264.88903: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 15330 1726882264.88949: in run() - task 12673a56-9f93-e4fe-1358-000000000029 15330 1726882264.88954: variable 'ansible_search_path' from source: unknown 15330 1726882264.88957: variable 'ansible_search_path' from source: unknown 15330 1726882264.89015: calling self._execute() 15330 1726882264.89148: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.89151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.89154: variable 'omit' from source: magic vars 15330 1726882264.89548: variable 'ansible_distribution_major_version' from source: facts 15330 1726882264.89563: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882264.89574: variable 'omit' from source: magic vars 15330 1726882264.89622: variable 'omit' from source: magic vars 15330 1726882264.89657: variable 'omit' from source: magic vars 15330 1726882264.89692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882264.89746: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882264.89763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882264.89778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 
1726882264.89789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882264.89830: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882264.89883: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.89888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.90001: Set connection var ansible_pipelining to False 15330 1726882264.90004: Set connection var ansible_timeout to 10 15330 1726882264.90007: Set connection var ansible_connection to ssh 15330 1726882264.90011: Set connection var ansible_shell_type to sh 15330 1726882264.90014: Set connection var ansible_shell_executable to /bin/sh 15330 1726882264.90033: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882264.90047: variable 'ansible_shell_executable' from source: unknown 15330 1726882264.90050: variable 'ansible_connection' from source: unknown 15330 1726882264.90052: variable 'ansible_module_compression' from source: unknown 15330 1726882264.90055: variable 'ansible_shell_type' from source: unknown 15330 1726882264.90057: variable 'ansible_shell_executable' from source: unknown 15330 1726882264.90059: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882264.90061: variable 'ansible_pipelining' from source: unknown 15330 1726882264.90063: variable 'ansible_timeout' from source: unknown 15330 1726882264.90126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882264.90333: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882264.90368: variable 'omit' from source: magic vars 15330 1726882264.90371: 
starting attempt loop 15330 1726882264.90374: running the handler 15330 1726882264.90376: _low_level_execute_command(): starting 15330 1726882264.90378: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882264.91487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.91547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882264.91565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882264.91597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882264.93268: stdout chunk (state=3): >>>/root <<< 15330 1726882264.93368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882264.93399: stderr chunk (state=3): >>><<< 15330 1726882264.93402: stdout chunk (state=3): >>><<< 15330 1726882264.93425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882264.93437: _low_level_execute_command(): starting 15330 1726882264.93443: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485 `" && echo ansible-tmp-1726882264.9342566-16037-44322034889485="` echo /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485 `" ) && sleep 0' 15330 1726882264.93869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882264.93898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882264.93903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.93905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882264.93919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882264.93921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882264.93933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882264.93982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882264.93986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882264.94005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882264.94063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882264.95928: stdout chunk (state=3): >>>ansible-tmp-1726882264.9342566-16037-44322034889485=/root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485 <<< 15330 1726882264.96082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882264.96085: stderr chunk (state=3): >>><<< 15330 1726882264.96088: stdout chunk (state=3): >>><<< 15330 1726882264.96119: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882264.9342566-16037-44322034889485=/root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882264.96158: variable 'ansible_module_compression' from source: unknown 15330 1726882264.96211: ANSIBALLZ: Using lock for ping 15330 1726882264.96215: ANSIBALLZ: Acquiring lock 15330 1726882264.96217: ANSIBALLZ: Lock acquired: 140238202168304 15330 1726882264.96219: ANSIBALLZ: Creating module 15330 1726882265.07019: ANSIBALLZ: Writing module into payload 15330 1726882265.07077: ANSIBALLZ: Writing module 15330 1726882265.07144: ANSIBALLZ: Renaming module 15330 1726882265.07148: ANSIBALLZ: Done creating module 15330 1726882265.07153: variable 'ansible_facts' from source: unknown 15330 1726882265.07299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py 15330 1726882265.07523: Sending initial data 15330 1726882265.07527: Sent initial data (152 bytes) 15330 1726882265.08081: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 
1726882265.08132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882265.08136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882265.08139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.08200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.08203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.08299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.09834: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15330 1726882265.09881: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882265.09911: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882265.09961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpxe2vfszl /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py <<< 15330 1726882265.09973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py" <<< 15330 1726882265.10017: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpxe2vfszl" to remote "/root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py" <<< 15330 1726882265.10899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.10902: stdout chunk (state=3): >>><<< 15330 1726882265.10905: stderr chunk (state=3): >>><<< 15330 1726882265.10911: done transferring module to remote 15330 1726882265.10913: _low_level_execute_command(): starting 15330 1726882265.10916: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/ /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py && sleep 0' 15330 1726882265.11412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.11464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882265.11476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.11525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.13300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.13318: stderr chunk (state=3): >>><<< 15330 1726882265.13321: stdout chunk (state=3): >>><<< 15330 1726882265.13335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882265.13338: _low_level_execute_command(): starting 15330 1726882265.13350: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/AnsiballZ_ping.py && sleep 0' 15330 1726882265.13821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882265.13824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882265.13827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.13829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882265.13831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.13880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882265.13886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.13888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.13936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.28719: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15330 1726882265.29769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.29800: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 15330 1726882265.30044: stderr chunk (state=3): >>><<< 15330 1726882265.30048: stdout chunk (state=3): >>><<< 15330 1726882265.30067: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882265.30091: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882265.30398: _low_level_execute_command(): starting 15330 1726882265.30401: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882264.9342566-16037-44322034889485/ > /dev/null 2>&1 && sleep 0' 15330 1726882265.31701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.31705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.31739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.31811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.33772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.33776: stdout chunk (state=3): >>><<< 15330 1726882265.33781: stderr chunk (state=3): >>><<< 15330 1726882265.33813: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882265.33828: handler run complete 15330 1726882265.33843: attempt loop complete, returning result 15330 1726882265.33846: _execute() done 15330 1726882265.33849: dumping result to json 15330 1726882265.33851: done dumping result, returning 15330 1726882265.33861: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-e4fe-1358-000000000029] 15330 1726882265.33992: sending task result for task 12673a56-9f93-e4fe-1358-000000000029 15330 1726882265.34080: done sending task result for task 12673a56-9f93-e4fe-1358-000000000029 15330 1726882265.34096: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 15330 1726882265.34160: no more pending results, returning what we have 15330 1726882265.34164: results queue empty 15330 1726882265.34165: checking for any_errors_fatal 15330 1726882265.34171: done checking for any_errors_fatal 15330 1726882265.34172: checking for max_fail_percentage 15330 1726882265.34174: done checking for max_fail_percentage 15330 1726882265.34175: checking to see if all hosts have failed and the running result is not ok 15330 1726882265.34175: done checking to see if all hosts have failed 15330 1726882265.34176: getting the remaining hosts for this loop 15330 1726882265.34177: done getting the remaining hosts for this loop 15330 1726882265.34181: getting the next task for host managed_node3 15330 1726882265.34498: done getting next task for host managed_node3 15330 1726882265.34502: ^ task is: TASK: meta (role_complete) 15330 1726882265.34504: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882265.34515: getting variables 15330 1726882265.34517: in VariableManager get_vars() 15330 1726882265.34560: Calling all_inventory to load vars for managed_node3 15330 1726882265.34563: Calling groups_inventory to load vars for managed_node3 15330 1726882265.34566: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882265.34577: Calling all_plugins_play to load vars for managed_node3 15330 1726882265.34579: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882265.34582: Calling groups_plugins_play to load vars for managed_node3 15330 1726882265.37829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882265.40483: done with get_vars() 15330 1726882265.40581: done getting variables 15330 1726882265.40921: done queuing things up, now waiting for results queue to drain 15330 1726882265.40924: results queue empty 15330 1726882265.40925: checking for any_errors_fatal 15330 1726882265.40928: done checking for any_errors_fatal 15330 1726882265.40956: checking for max_fail_percentage 15330 1726882265.40957: done checking for max_fail_percentage 15330 1726882265.40958: checking to see if all hosts have failed and the running result is not ok 15330 1726882265.40959: done checking to see if all hosts have failed 15330 1726882265.40960: getting the remaining hosts for this loop 15330 1726882265.40961: done getting the remaining hosts for this loop 15330 1726882265.40964: getting the next task for host managed_node3 15330 1726882265.40968: done getting next task for host managed_node3 15330 1726882265.40969: ^ task is: TASK: meta (flush_handlers) 15330 1726882265.40971: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15330 1726882265.40974: getting variables 15330 1726882265.40974: in VariableManager get_vars() 15330 1726882265.40992: Calling all_inventory to load vars for managed_node3 15330 1726882265.41066: Calling groups_inventory to load vars for managed_node3 15330 1726882265.41069: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882265.41075: Calling all_plugins_play to load vars for managed_node3 15330 1726882265.41077: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882265.41080: Calling groups_plugins_play to load vars for managed_node3 15330 1726882265.43200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882265.45539: done with get_vars() 15330 1726882265.45569: done getting variables 15330 1726882265.45857: in VariableManager get_vars() 15330 1726882265.45941: Calling all_inventory to load vars for managed_node3 15330 1726882265.45944: Calling groups_inventory to load vars for managed_node3 15330 1726882265.45947: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882265.45952: Calling all_plugins_play to load vars for managed_node3 15330 1726882265.45954: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882265.45957: Calling groups_plugins_play to load vars for managed_node3 15330 1726882265.48853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882265.52607: done with get_vars() 15330 1726882265.52646: done queuing things up, now waiting for results queue to drain 15330 1726882265.52648: results queue empty 15330 1726882265.52649: checking for any_errors_fatal 15330 1726882265.52650: done checking for any_errors_fatal 15330 1726882265.52651: checking for max_fail_percentage 15330 1726882265.52652: done checking for max_fail_percentage 15330 1726882265.52653: checking to see if all hosts have failed and 
the running result is not ok 15330 1726882265.52654: done checking to see if all hosts have failed 15330 1726882265.52654: getting the remaining hosts for this loop 15330 1726882265.52655: done getting the remaining hosts for this loop 15330 1726882265.52658: getting the next task for host managed_node3 15330 1726882265.52662: done getting next task for host managed_node3 15330 1726882265.52664: ^ task is: TASK: meta (flush_handlers) 15330 1726882265.52665: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882265.52668: getting variables 15330 1726882265.52669: in VariableManager get_vars() 15330 1726882265.52681: Calling all_inventory to load vars for managed_node3 15330 1726882265.52684: Calling groups_inventory to load vars for managed_node3 15330 1726882265.52685: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882265.52698: Calling all_plugins_play to load vars for managed_node3 15330 1726882265.52700: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882265.52703: Calling groups_plugins_play to load vars for managed_node3 15330 1726882265.55431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882265.57910: done with get_vars() 15330 1726882265.57940: done getting variables 15330 1726882265.58114: in VariableManager get_vars() 15330 1726882265.58129: Calling all_inventory to load vars for managed_node3 15330 1726882265.58131: Calling groups_inventory to load vars for managed_node3 15330 1726882265.58134: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882265.58139: Calling all_plugins_play to load vars for managed_node3 15330 1726882265.58141: Calling 
groups_plugins_inventory to load vars for managed_node3 15330 1726882265.58144: Calling groups_plugins_play to load vars for managed_node3 15330 1726882265.60533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882265.63417: done with get_vars() 15330 1726882265.63454: done queuing things up, now waiting for results queue to drain 15330 1726882265.63457: results queue empty 15330 1726882265.63457: checking for any_errors_fatal 15330 1726882265.63459: done checking for any_errors_fatal 15330 1726882265.63459: checking for max_fail_percentage 15330 1726882265.63461: done checking for max_fail_percentage 15330 1726882265.63461: checking to see if all hosts have failed and the running result is not ok 15330 1726882265.63462: done checking to see if all hosts have failed 15330 1726882265.63463: getting the remaining hosts for this loop 15330 1726882265.63464: done getting the remaining hosts for this loop 15330 1726882265.63467: getting the next task for host managed_node3 15330 1726882265.63470: done getting next task for host managed_node3 15330 1726882265.63471: ^ task is: None 15330 1726882265.63472: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882265.63473: done queuing things up, now waiting for results queue to drain 15330 1726882265.63474: results queue empty 15330 1726882265.63475: checking for any_errors_fatal 15330 1726882265.63476: done checking for any_errors_fatal 15330 1726882265.63476: checking for max_fail_percentage 15330 1726882265.63477: done checking for max_fail_percentage 15330 1726882265.63478: checking to see if all hosts have failed and the running result is not ok 15330 1726882265.63478: done checking to see if all hosts have failed 15330 1726882265.63479: getting the next task for host managed_node3 15330 1726882265.63481: done getting next task for host managed_node3 15330 1726882265.63482: ^ task is: None 15330 1726882265.63483: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882265.63673: in VariableManager get_vars() 15330 1726882265.63696: done with get_vars() 15330 1726882265.63704: in VariableManager get_vars() 15330 1726882265.63715: done with get_vars() 15330 1726882265.63720: variable 'omit' from source: magic vars 15330 1726882265.63995: variable 'task' from source: play vars 15330 1726882265.64028: in VariableManager get_vars() 15330 1726882265.64039: done with get_vars() 15330 1726882265.64057: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15330 1726882265.64445: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882265.64518: getting the remaining hosts for this loop 15330 1726882265.64520: done getting the remaining hosts for this loop 15330 1726882265.64523: getting the next task for host managed_node3 15330 1726882265.64525: done getting next task for host managed_node3 15330 1726882265.64528: ^ task is: TASK: Gathering Facts 15330 1726882265.64529: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882265.64531: getting variables 15330 1726882265.64532: in VariableManager get_vars() 15330 1726882265.64540: Calling all_inventory to load vars for managed_node3 15330 1726882265.64543: Calling groups_inventory to load vars for managed_node3 15330 1726882265.64545: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882265.64550: Calling all_plugins_play to load vars for managed_node3 15330 1726882265.64552: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882265.64555: Calling groups_plugins_play to load vars for managed_node3 15330 1726882265.67126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882265.68670: done with get_vars() 15330 1726882265.68691: done getting variables 15330 1726882265.68737: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:31:05 -0400 (0:00:00.805) 0:00:14.893 ****** 15330 1726882265.68775: entering _queue_task() for managed_node3/gather_facts 15330 1726882265.69335: worker is 1 (out of 1 available) 15330 1726882265.69710: exiting _queue_task() for managed_node3/gather_facts 15330 1726882265.69721: done queuing things up, now waiting for results queue to drain 15330 1726882265.69723: waiting for pending results... 
15330 1726882265.70228: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882265.70500: in run() - task 12673a56-9f93-e4fe-1358-000000000219 15330 1726882265.70503: variable 'ansible_search_path' from source: unknown 15330 1726882265.70506: calling self._execute() 15330 1726882265.70850: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882265.71537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882265.71540: variable 'omit' from source: magic vars 15330 1726882265.72342: variable 'ansible_distribution_major_version' from source: facts 15330 1726882265.72416: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882265.72429: variable 'omit' from source: magic vars 15330 1726882265.72462: variable 'omit' from source: magic vars 15330 1726882265.72701: variable 'omit' from source: magic vars 15330 1726882265.72704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882265.72722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882265.72747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882265.72884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882265.72909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882265.72947: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882265.72956: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882265.72964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882265.73066: Set connection var ansible_pipelining to False 15330 1726882265.73090: Set 
connection var ansible_timeout to 10 15330 1726882265.73100: Set connection var ansible_connection to ssh 15330 1726882265.73107: Set connection var ansible_shell_type to sh 15330 1726882265.73118: Set connection var ansible_shell_executable to /bin/sh 15330 1726882265.73126: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882265.73153: variable 'ansible_shell_executable' from source: unknown 15330 1726882265.73161: variable 'ansible_connection' from source: unknown 15330 1726882265.73167: variable 'ansible_module_compression' from source: unknown 15330 1726882265.73174: variable 'ansible_shell_type' from source: unknown 15330 1726882265.73186: variable 'ansible_shell_executable' from source: unknown 15330 1726882265.73198: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882265.73206: variable 'ansible_pipelining' from source: unknown 15330 1726882265.73212: variable 'ansible_timeout' from source: unknown 15330 1726882265.73220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882265.73428: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882265.73445: variable 'omit' from source: magic vars 15330 1726882265.73455: starting attempt loop 15330 1726882265.73461: running the handler 15330 1726882265.73480: variable 'ansible_facts' from source: unknown 15330 1726882265.73512: _low_level_execute_command(): starting 15330 1726882265.73527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882265.74716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882265.74821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882265.74835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882265.74847: stderr chunk (state=3): >>>debug2: match found <<< 15330 1726882265.74859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.75191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.75276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.76978: stdout chunk (state=3): >>>/root <<< 15330 1726882265.77273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.77277: stdout chunk (state=3): >>><<< 15330 1726882265.77279: stderr chunk (state=3): >>><<< 15330 1726882265.77283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882265.77285: _low_level_execute_command(): starting 15330 1726882265.77290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763 `" && echo ansible-tmp-1726882265.7718127-16072-196703946105763="` echo /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763 `" ) && sleep 0' 15330 1726882265.77848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882265.77862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882265.77876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882265.77977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.78021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.78090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.79957: stdout chunk (state=3): >>>ansible-tmp-1726882265.7718127-16072-196703946105763=/root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763 <<< 15330 1726882265.80111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.80197: stderr chunk (state=3): >>><<< 15330 1726882265.80201: stdout chunk (state=3): >>><<< 15330 1726882265.80411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882265.7718127-16072-196703946105763=/root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882265.80415: variable 'ansible_module_compression' from source: unknown 15330 1726882265.80417: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882265.80451: variable 'ansible_facts' from source: unknown 15330 1726882265.80765: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py 15330 1726882265.81021: Sending initial data 15330 1726882265.81024: Sent initial data (154 bytes) 15330 1726882265.82320: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882265.82373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.82399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.82535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.84268: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882265.84290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882265.84321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp1ken5v84 /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py <<< 15330 1726882265.84325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py" <<< 15330 1726882265.84385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp1ken5v84" to remote "/root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py" <<< 15330 1726882265.87051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.87319: stderr chunk (state=3): >>><<< 15330 1726882265.87322: stdout chunk (state=3): >>><<< 15330 1726882265.87325: done transferring module to remote 15330 1726882265.87326: _low_level_execute_command(): starting 15330 1726882265.87329: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/ /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py && sleep 0' 15330 1726882265.88748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882265.88866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.89117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.89188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882265.90948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882265.90952: stdout chunk (state=3): >>><<< 15330 1726882265.90954: stderr chunk (state=3): >>><<< 15330 1726882265.90971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882265.90985: _low_level_execute_command(): starting 15330 1726882265.91001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/AnsiballZ_setup.py && sleep 0' 15330 1726882265.92314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882265.92523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882265.92554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882265.92637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882266.55984: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_loadavg": {"1m": 1.025390625, "5m": 0.48291015625, "15m": 
0.22314453125}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec23ea4468ccc875d6f6db60ff64318a", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "06", "epoch": "1726882266", "epoch_int": "1726882266", "date": "2024-09-20", "time": "21:31:06", "iso8601_micro": "2024-09-21T01:31:06.198217Z", "iso8601": "2024-09-21T01:31:06Z", "iso8601_basic": "20240920T213106198217", "iso8601_basic_short": 
"20240920T213106", "tz": "EDT", "tz_dst": "<<< 15330 1726882266.56031: stdout chunk (state=3): >>>EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2971, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 560, "free": 2971}, "nocache": {"free": 3286, "used": 245}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 573, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805170688, "block_size": 4096, "block_total": 65519099, "block_available": 63917278, "block_used": 1601821, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 
41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": fals<<< 15330 1726882266.56041: stdout chunk (state=3): >>>e, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "26:84:62:af:e1:90", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", 
"tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off<<< 15330 1726882266.56064: stdout chunk (state=3): >>> [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": 
"off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": 
"10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882266.58401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882266.58405: stdout chunk (state=3): >>><<< 15330 1726882266.58408: stderr chunk (state=3): >>><<< 15330 1726882266.58417: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_loadavg": {"1m": 1.025390625, "5m": 0.48291015625, "15m": 0.22314453125}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "06", "epoch": "1726882266", "epoch_int": "1726882266", "date": "2024-09-20", "time": "21:31:06", "iso8601_micro": "2024-09-21T01:31:06.198217Z", "iso8601": "2024-09-21T01:31:06Z", "iso8601_basic": "20240920T213106198217", "iso8601_basic_short": "20240920T213106", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2971, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 560, "free": 2971}, "nocache": {"free": 3286, "used": 245}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": 
"ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 573, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805170688, "block_size": 4096, "block_total": 65519099, "block_available": 63917278, "block_used": 1601821, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "26:84:62:af:e1:90", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": 
["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
15330 1726882266.59044: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882266.59066: _low_level_execute_command(): starting 15330 1726882266.59070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882265.7718127-16072-196703946105763/ > /dev/null 2>&1 && sleep 0' 15330 1726882266.60444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882266.60605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882266.60660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882266.60738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882266.62549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882266.62618: stderr chunk (state=3): >>><<< 15330 1726882266.62631: stdout chunk (state=3): >>><<< 15330 1726882266.62656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882266.62679: handler run complete 15330 1726882266.62858: variable 'ansible_facts' from source: unknown 15330 1726882266.63101: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.63366: variable 'ansible_facts' from source: unknown 15330 1726882266.63476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.63651: attempt loop complete, returning result 15330 1726882266.63664: _execute() done 15330 1726882266.63670: dumping result to json 15330 1726882266.63717: done dumping result, returning 15330 1726882266.63730: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-000000000219] 15330 1726882266.63756: sending task result for task 12673a56-9f93-e4fe-1358-000000000219 ok: [managed_node3] 15330 1726882266.64923: no more pending results, returning what we have 15330 1726882266.64926: results queue empty 15330 1726882266.64927: checking for any_errors_fatal 15330 1726882266.64929: done checking for any_errors_fatal 15330 1726882266.64930: checking for max_fail_percentage 15330 1726882266.64931: done checking for max_fail_percentage 15330 1726882266.64932: checking to see if all hosts have failed and the running result is not ok 15330 1726882266.64933: done checking to see if all hosts have failed 15330 1726882266.64934: getting the remaining hosts for this loop 15330 1726882266.64935: done getting the remaining hosts for this loop 15330 1726882266.64938: getting the next task for host managed_node3 15330 1726882266.64943: done getting next task for host managed_node3 15330 1726882266.64954: ^ task is: TASK: meta (flush_handlers) 15330 1726882266.64956: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882266.64960: getting variables 15330 1726882266.64962: in VariableManager get_vars() 15330 1726882266.64984: Calling all_inventory to load vars for managed_node3 15330 1726882266.64986: Calling groups_inventory to load vars for managed_node3 15330 1726882266.64992: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.65001: done sending task result for task 12673a56-9f93-e4fe-1358-000000000219 15330 1726882266.65004: WORKER PROCESS EXITING 15330 1726882266.65013: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.65016: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.65019: Calling groups_plugins_play to load vars for managed_node3 15330 1726882266.66530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.72624: done with get_vars() 15330 1726882266.72658: done getting variables 15330 1726882266.72723: in VariableManager get_vars() 15330 1726882266.72733: Calling all_inventory to load vars for managed_node3 15330 1726882266.72735: Calling groups_inventory to load vars for managed_node3 15330 1726882266.72737: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.72742: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.72745: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.72747: Calling groups_plugins_play to load vars for managed_node3 15330 1726882266.73994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.75698: done with get_vars() 15330 1726882266.75734: done queuing things up, now waiting for results queue to drain 15330 1726882266.75736: results queue empty 15330 1726882266.75737: checking for any_errors_fatal 15330 1726882266.75741: done checking for any_errors_fatal 15330 1726882266.75741: checking for max_fail_percentage 15330 
1726882266.75742: done checking for max_fail_percentage 15330 1726882266.75743: checking to see if all hosts have failed and the running result is not ok 15330 1726882266.75744: done checking to see if all hosts have failed 15330 1726882266.75749: getting the remaining hosts for this loop 15330 1726882266.75751: done getting the remaining hosts for this loop 15330 1726882266.75753: getting the next task for host managed_node3 15330 1726882266.75757: done getting next task for host managed_node3 15330 1726882266.75760: ^ task is: TASK: Include the task '{{ task }}' 15330 1726882266.75761: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882266.75763: getting variables 15330 1726882266.75764: in VariableManager get_vars() 15330 1726882266.75774: Calling all_inventory to load vars for managed_node3 15330 1726882266.75775: Calling groups_inventory to load vars for managed_node3 15330 1726882266.75777: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.75783: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.75785: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.75791: Calling groups_plugins_play to load vars for managed_node3 15330 1726882266.77105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.78826: done with get_vars() 15330 1726882266.78850: done getting variables 15330 1726882266.79014: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:31:06 
-0400 (0:00:01.102) 0:00:15.996 ****** 15330 1726882266.79050: entering _queue_task() for managed_node3/include_tasks 15330 1726882266.79449: worker is 1 (out of 1 available) 15330 1726882266.79461: exiting _queue_task() for managed_node3/include_tasks 15330 1726882266.79589: done queuing things up, now waiting for results queue to drain 15330 1726882266.79591: waiting for pending results... 15330 1726882266.79818: running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_device_present.yml' 15330 1726882266.79938: in run() - task 12673a56-9f93-e4fe-1358-00000000002d 15330 1726882266.79942: variable 'ansible_search_path' from source: unknown 15330 1726882266.80020: calling self._execute() 15330 1726882266.80089: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882266.80105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882266.80124: variable 'omit' from source: magic vars 15330 1726882266.80546: variable 'ansible_distribution_major_version' from source: facts 15330 1726882266.80592: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882266.80597: variable 'task' from source: play vars 15330 1726882266.80699: variable 'task' from source: play vars 15330 1726882266.80704: _execute() done 15330 1726882266.80706: dumping result to json 15330 1726882266.80712: done dumping result, returning 15330 1726882266.80724: done running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_device_present.yml' [12673a56-9f93-e4fe-1358-00000000002d] 15330 1726882266.80782: sending task result for task 12673a56-9f93-e4fe-1358-00000000002d 15330 1726882266.80870: done sending task result for task 12673a56-9f93-e4fe-1358-00000000002d 15330 1726882266.80873: WORKER PROCESS EXITING 15330 1726882266.80918: no more pending results, returning what we have 15330 1726882266.80924: in VariableManager get_vars() 15330 1726882266.80960: Calling all_inventory to 
load vars for managed_node3 15330 1726882266.80963: Calling groups_inventory to load vars for managed_node3 15330 1726882266.80967: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.80981: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.80986: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.80995: Calling groups_plugins_play to load vars for managed_node3 15330 1726882266.82722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.84429: done with get_vars() 15330 1726882266.84456: variable 'ansible_search_path' from source: unknown 15330 1726882266.84473: we have included files to process 15330 1726882266.84474: generating all_blocks data 15330 1726882266.84475: done generating all_blocks data 15330 1726882266.84476: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15330 1726882266.84477: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15330 1726882266.84480: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15330 1726882266.84670: in VariableManager get_vars() 15330 1726882266.84691: done with get_vars() 15330 1726882266.84815: done processing included file 15330 1726882266.84817: iterating over new_blocks loaded from include file 15330 1726882266.84818: in VariableManager get_vars() 15330 1726882266.84829: done with get_vars() 15330 1726882266.84831: filtering new block on tags 15330 1726882266.84848: done filtering new block on tags 15330 1726882266.84850: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 15330 1726882266.84861: extending task lists for all hosts with included blocks 15330 1726882266.84899: done extending task lists 15330 1726882266.84901: done processing included files 15330 1726882266.84902: results queue empty 15330 1726882266.84902: checking for any_errors_fatal 15330 1726882266.84904: done checking for any_errors_fatal 15330 1726882266.84904: checking for max_fail_percentage 15330 1726882266.84905: done checking for max_fail_percentage 15330 1726882266.84906: checking to see if all hosts have failed and the running result is not ok 15330 1726882266.84907: done checking to see if all hosts have failed 15330 1726882266.84907: getting the remaining hosts for this loop 15330 1726882266.84908: done getting the remaining hosts for this loop 15330 1726882266.84911: getting the next task for host managed_node3 15330 1726882266.84914: done getting next task for host managed_node3 15330 1726882266.84917: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15330 1726882266.84919: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882266.84921: getting variables 15330 1726882266.84922: in VariableManager get_vars() 15330 1726882266.84929: Calling all_inventory to load vars for managed_node3 15330 1726882266.84932: Calling groups_inventory to load vars for managed_node3 15330 1726882266.84934: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.84939: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.84941: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.84944: Calling groups_plugins_play to load vars for managed_node3 15330 1726882266.86273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.87429: done with get_vars() 15330 1726882266.87449: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:31:06 -0400 (0:00:00.084) 0:00:16.081 ****** 15330 1726882266.87510: entering _queue_task() for managed_node3/include_tasks 15330 1726882266.87767: worker is 1 (out of 1 available) 15330 1726882266.87781: exiting _queue_task() for managed_node3/include_tasks 15330 1726882266.87791: done queuing things up, now waiting for results queue to drain 15330 1726882266.87794: waiting for pending results... 
15330 1726882266.87972: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 15330 1726882266.88056: in run() - task 12673a56-9f93-e4fe-1358-00000000022a 15330 1726882266.88065: variable 'ansible_search_path' from source: unknown 15330 1726882266.88068: variable 'ansible_search_path' from source: unknown 15330 1726882266.88099: calling self._execute() 15330 1726882266.88167: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882266.88171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882266.88181: variable 'omit' from source: magic vars 15330 1726882266.88463: variable 'ansible_distribution_major_version' from source: facts 15330 1726882266.88474: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882266.88478: _execute() done 15330 1726882266.88481: dumping result to json 15330 1726882266.88484: done dumping result, returning 15330 1726882266.88488: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-e4fe-1358-00000000022a] 15330 1726882266.88501: sending task result for task 12673a56-9f93-e4fe-1358-00000000022a 15330 1726882266.88611: done sending task result for task 12673a56-9f93-e4fe-1358-00000000022a 15330 1726882266.88614: WORKER PROCESS EXITING 15330 1726882266.88647: no more pending results, returning what we have 15330 1726882266.88651: in VariableManager get_vars() 15330 1726882266.88682: Calling all_inventory to load vars for managed_node3 15330 1726882266.88685: Calling groups_inventory to load vars for managed_node3 15330 1726882266.88688: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.88741: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.88744: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.88748: Calling groups_plugins_play to load vars for managed_node3 15330 
1726882266.90088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.91587: done with get_vars() 15330 1726882266.91603: variable 'ansible_search_path' from source: unknown 15330 1726882266.91604: variable 'ansible_search_path' from source: unknown 15330 1726882266.91619: variable 'task' from source: play vars 15330 1726882266.91716: variable 'task' from source: play vars 15330 1726882266.91741: we have included files to process 15330 1726882266.91742: generating all_blocks data 15330 1726882266.91743: done generating all_blocks data 15330 1726882266.91745: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882266.91746: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882266.91748: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882266.91881: done processing included file 15330 1726882266.91883: iterating over new_blocks loaded from include file 15330 1726882266.91884: in VariableManager get_vars() 15330 1726882266.91895: done with get_vars() 15330 1726882266.91897: filtering new block on tags 15330 1726882266.91907: done filtering new block on tags 15330 1726882266.91908: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 15330 1726882266.91912: extending task lists for all hosts with included blocks 15330 1726882266.91973: done extending task lists 15330 1726882266.91974: done processing included files 15330 1726882266.91975: results queue empty 15330 1726882266.91975: checking for any_errors_fatal 15330 1726882266.91978: done checking 
for any_errors_fatal 15330 1726882266.91978: checking for max_fail_percentage 15330 1726882266.91979: done checking for max_fail_percentage 15330 1726882266.91980: checking to see if all hosts have failed and the running result is not ok 15330 1726882266.91980: done checking to see if all hosts have failed 15330 1726882266.91981: getting the remaining hosts for this loop 15330 1726882266.91981: done getting the remaining hosts for this loop 15330 1726882266.91983: getting the next task for host managed_node3 15330 1726882266.91986: done getting next task for host managed_node3 15330 1726882266.91988: ^ task is: TASK: Get stat for interface {{ interface }} 15330 1726882266.91990: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882266.91991: getting variables 15330 1726882266.91992: in VariableManager get_vars() 15330 1726882266.91999: Calling all_inventory to load vars for managed_node3 15330 1726882266.92001: Calling groups_inventory to load vars for managed_node3 15330 1726882266.92003: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882266.92007: Calling all_plugins_play to load vars for managed_node3 15330 1726882266.92008: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882266.92010: Calling groups_plugins_play to load vars for managed_node3 15330 1726882266.92653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882266.93575: done with get_vars() 15330 1726882266.93596: done getting variables 15330 1726882266.93722: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:31:06 -0400 (0:00:00.062) 0:00:16.143 ****** 15330 1726882266.93752: entering _queue_task() for managed_node3/stat 15330 1726882266.94102: worker is 1 (out of 1 available) 15330 1726882266.94113: exiting _queue_task() for managed_node3/stat 15330 1726882266.94124: done queuing things up, now waiting for results queue to drain 15330 1726882266.94125: waiting for pending results... 
15330 1726882266.94482: running TaskExecutor() for managed_node3/TASK: Get stat for interface LSR-TST-br31 15330 1726882266.94537: in run() - task 12673a56-9f93-e4fe-1358-000000000235 15330 1726882266.94541: variable 'ansible_search_path' from source: unknown 15330 1726882266.94544: variable 'ansible_search_path' from source: unknown 15330 1726882266.94567: calling self._execute() 15330 1726882266.94638: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882266.94642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882266.94651: variable 'omit' from source: magic vars 15330 1726882266.94930: variable 'ansible_distribution_major_version' from source: facts 15330 1726882266.94941: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882266.94945: variable 'omit' from source: magic vars 15330 1726882266.94977: variable 'omit' from source: magic vars 15330 1726882266.95047: variable 'interface' from source: set_fact 15330 1726882266.95061: variable 'omit' from source: magic vars 15330 1726882266.95095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882266.95125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882266.95140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882266.95153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882266.95164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882266.95186: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882266.95192: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882266.95196: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882266.95267: Set connection var ansible_pipelining to False 15330 1726882266.95278: Set connection var ansible_timeout to 10 15330 1726882266.95281: Set connection var ansible_connection to ssh 15330 1726882266.95283: Set connection var ansible_shell_type to sh 15330 1726882266.95286: Set connection var ansible_shell_executable to /bin/sh 15330 1726882266.95291: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882266.95308: variable 'ansible_shell_executable' from source: unknown 15330 1726882266.95311: variable 'ansible_connection' from source: unknown 15330 1726882266.95314: variable 'ansible_module_compression' from source: unknown 15330 1726882266.95317: variable 'ansible_shell_type' from source: unknown 15330 1726882266.95319: variable 'ansible_shell_executable' from source: unknown 15330 1726882266.95321: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882266.95323: variable 'ansible_pipelining' from source: unknown 15330 1726882266.95326: variable 'ansible_timeout' from source: unknown 15330 1726882266.95331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882266.95476: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882266.95484: variable 'omit' from source: magic vars 15330 1726882266.95494: starting attempt loop 15330 1726882266.95498: running the handler 15330 1726882266.95507: _low_level_execute_command(): starting 15330 1726882266.95514: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882266.95998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882266.96031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882266.96036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882266.96039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882266.96084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882266.96087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882266.96160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882266.97820: stdout chunk (state=3): >>>/root <<< 15330 1726882266.97924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882266.97955: stderr chunk (state=3): >>><<< 15330 1726882266.97958: stdout chunk (state=3): >>><<< 15330 1726882266.97980: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882266.97992: _low_level_execute_command(): starting 15330 1726882266.97999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896 `" && echo ansible-tmp-1726882266.9797986-16114-76560207850896="` echo /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896 `" ) && sleep 0' 15330 1726882266.98669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882266.98672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882266.98675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882266.98686: stderr chunk (state=3): >>>debug1: configuration requests final Match 
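The `( umask 77 && mkdir -p ... && mkdir ... )` one-liner Ansible runs above is how it creates a private per-task temp directory on the remote host: `umask 77` strips group/other bits so the directory comes out mode `0700`, and the nested `echo` backtick substitution just expands the path. A minimal standalone sketch of the same idiom (the path here is illustrative, not the log's actual tmpdir):

```shell
# Reproduce Ansible's remote tmpdir idiom: umask 77 forces mode 0700.
tmp_root="/tmp/ansible-demo-$$"
( umask 77 && mkdir -p "$tmp_root" && mkdir "$tmp_root/ansible-tmp-example" )
stat -c '%a' "$tmp_root/ansible-tmp-example"   # prints 700
rm -rf "$tmp_root"
```

The subshell parentheses matter: the `umask` change is confined to the subshell, so the caller's umask is untouched.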
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882266.98688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882266.98757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882266.98777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882266.98854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.00704: stdout chunk (state=3): >>>ansible-tmp-1726882266.9797986-16114-76560207850896=/root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896 <<< 15330 1726882267.00828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.00871: stderr chunk (state=3): >>><<< 15330 1726882267.00874: stdout chunk (state=3): >>><<< 15330 1726882267.00902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882266.9797986-16114-76560207850896=/root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882267.00950: variable 'ansible_module_compression' from source: unknown 15330 1726882267.01003: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15330 1726882267.01035: variable 'ansible_facts' from source: unknown 15330 1726882267.01118: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py 15330 1726882267.01244: Sending initial data 15330 1726882267.01248: Sent initial data (152 bytes) 15330 1726882267.01743: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882267.01747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882267.01749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.01759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882267.01762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.01812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882267.01835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.01891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.03402: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15330 1726882267.03407: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882267.03446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882267.03498: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp6kej6jy1 /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py <<< 15330 1726882267.03504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py" <<< 15330 1726882267.03540: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp6kej6jy1" to remote "/root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py" <<< 15330 1726882267.03543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py" <<< 15330 1726882267.04070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.04116: stderr chunk (state=3): >>><<< 15330 1726882267.04119: stdout chunk (state=3): >>><<< 15330 1726882267.04140: done transferring module to remote 15330 1726882267.04148: _low_level_execute_command(): starting 15330 1726882267.04153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/ /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py && sleep 0' 15330 1726882267.04582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882267.04589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882267.04620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882267.04623: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882267.04625: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882267.04631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.04679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882267.04682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.04742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.06499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.06506: stderr chunk (state=3): >>><<< 15330 1726882267.06508: stdout chunk (state=3): >>><<< 15330 1726882267.06523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882267.06526: _low_level_execute_command(): starting 15330 1726882267.06531: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/AnsiballZ_stat.py && sleep 0' 15330 1726882267.07157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.07202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882267.07216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.07291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.22390: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29797, "dev": 23, "nlink": 1, "atime": 1726882264.3567126, "mtime": 1726882264.3567126, "ctime": 1726882264.3567126, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15330 1726882267.23627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882267.23689: stderr chunk (state=3): >>><<< 15330 1726882267.23692: stdout chunk (state=3): >>><<< 15330 1726882267.23707: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29797, "dev": 23, "nlink": 1, "atime": 1726882264.3567126, "mtime": 1726882264.3567126, "ctime": 1726882264.3567126, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
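The `stat` module result above reports `islnk: true` with separate `lnk_source` (fully resolved) and `lnk_target` (raw link text) fields, because `/sys/class/net/<iface>` entries are symlinks into `/sys/devices/...`. A small sketch of what those fields correspond to at the shell level, demonstrated on a symlink created locally rather than the log's sysfs path:

```shell
# Show the islnk / lnk_source / lnk_target distinction on a local symlink.
d=$(mktemp -d)
mkdir "$d/real"
ln -s "$d/real" "$d/link"
test -L "$d/link" && echo "islnk=true"      # -L uses lstat: true for the link itself
echo "lnk_source=$(readlink -f "$d/link")"  # fully resolved, like the module's lnk_source
echo "lnk_target=$(readlink "$d/link")"     # raw link text, like lnk_target
rm -rf "$d"
```

Note the module sets `isdir: false` for the same reason `test -L` succeeds: it stats the link itself, not what it points to.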
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882267.23745: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882267.23753: _low_level_execute_command(): starting 15330 1726882267.23758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882266.9797986-16114-76560207850896/ > /dev/null 2>&1 && sleep 0' 15330 1726882267.24261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882267.24264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882267.24267: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.24269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882267.24271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.24363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.24414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.26216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.26260: stderr chunk (state=3): >>><<< 15330 1726882267.26263: stdout chunk (state=3): >>><<< 15330 1726882267.26274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882267.26281: handler run complete 15330 1726882267.26315: attempt loop complete, returning result 15330 1726882267.26318: _execute() done 15330 1726882267.26325: dumping result to json 15330 1726882267.26334: done dumping result, returning 15330 1726882267.26357: done running TaskExecutor() for managed_node3/TASK: Get stat for interface LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000235] 15330 1726882267.26362: sending task result for task 12673a56-9f93-e4fe-1358-000000000235 15330 1726882267.26457: done sending task result for task 12673a56-9f93-e4fe-1358-000000000235 15330 1726882267.26459: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882264.3567126, "block_size": 4096, "blocks": 0, "ctime": 1726882264.3567126, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29797, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1726882264.3567126, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15330 1726882267.26564: no more pending results, returning what we have 
15330 1726882267.26568: results queue empty 15330 1726882267.26571: checking for any_errors_fatal 15330 1726882267.26572: done checking for any_errors_fatal 15330 1726882267.26573: checking for max_fail_percentage 15330 1726882267.26574: done checking for max_fail_percentage 15330 1726882267.26575: checking to see if all hosts have failed and the running result is not ok 15330 1726882267.26576: done checking to see if all hosts have failed 15330 1726882267.26577: getting the remaining hosts for this loop 15330 1726882267.26578: done getting the remaining hosts for this loop 15330 1726882267.26583: getting the next task for host managed_node3 15330 1726882267.26592: done getting next task for host managed_node3 15330 1726882267.26596: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15330 1726882267.26599: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882267.26603: getting variables 15330 1726882267.26604: in VariableManager get_vars() 15330 1726882267.26629: Calling all_inventory to load vars for managed_node3 15330 1726882267.26632: Calling groups_inventory to load vars for managed_node3 15330 1726882267.26634: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882267.26643: Calling all_plugins_play to load vars for managed_node3 15330 1726882267.26646: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882267.26648: Calling groups_plugins_play to load vars for managed_node3 15330 1726882267.27541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882267.28401: done with get_vars() 15330 1726882267.28417: done getting variables 15330 1726882267.28462: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882267.28546: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:31:07 -0400 (0:00:00.348) 0:00:16.491 ****** 15330 1726882267.28570: entering _queue_task() for managed_node3/assert 15330 1726882267.28821: worker is 1 (out of 1 available) 15330 1726882267.28834: exiting _queue_task() for managed_node3/assert 15330 1726882267.28846: done queuing things up, now waiting for results queue to drain 15330 1726882267.28847: waiting for pending results... 
15330 1726882267.29048: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'LSR-TST-br31' 15330 1726882267.29120: in run() - task 12673a56-9f93-e4fe-1358-00000000022b 15330 1726882267.29131: variable 'ansible_search_path' from source: unknown 15330 1726882267.29134: variable 'ansible_search_path' from source: unknown 15330 1726882267.29161: calling self._execute() 15330 1726882267.29226: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882267.29230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882267.29238: variable 'omit' from source: magic vars 15330 1726882267.29703: variable 'ansible_distribution_major_version' from source: facts 15330 1726882267.29707: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882267.29714: variable 'omit' from source: magic vars 15330 1726882267.29758: variable 'omit' from source: magic vars 15330 1726882267.29841: variable 'interface' from source: set_fact 15330 1726882267.29852: variable 'omit' from source: magic vars 15330 1726882267.29883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882267.29914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882267.29930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882267.29945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882267.29955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882267.29979: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882267.29982: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882267.29984: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882267.30059: Set connection var ansible_pipelining to False 15330 1726882267.30273: Set connection var ansible_timeout to 10 15330 1726882267.30280: Set connection var ansible_connection to ssh 15330 1726882267.30284: Set connection var ansible_shell_type to sh 15330 1726882267.30288: Set connection var ansible_shell_executable to /bin/sh 15330 1726882267.30291: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882267.30312: variable 'ansible_shell_executable' from source: unknown 15330 1726882267.30316: variable 'ansible_connection' from source: unknown 15330 1726882267.30318: variable 'ansible_module_compression' from source: unknown 15330 1726882267.30320: variable 'ansible_shell_type' from source: unknown 15330 1726882267.30322: variable 'ansible_shell_executable' from source: unknown 15330 1726882267.30324: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882267.30326: variable 'ansible_pipelining' from source: unknown 15330 1726882267.30332: variable 'ansible_timeout' from source: unknown 15330 1726882267.30335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882267.30499: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882267.30503: variable 'omit' from source: magic vars 15330 1726882267.30505: starting attempt loop 15330 1726882267.30511: running the handler 15330 1726882267.30563: variable 'interface_stat' from source: set_fact 15330 1726882267.30605: Evaluated conditional (interface_stat.stat.exists): True 15330 1726882267.30620: handler run complete 15330 1726882267.30652: attempt loop complete, returning result 15330 
1726882267.30663: _execute() done 15330 1726882267.30675: dumping result to json 15330 1726882267.30738: done dumping result, returning 15330 1726882267.30741: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'LSR-TST-br31' [12673a56-9f93-e4fe-1358-00000000022b] 15330 1726882267.30744: sending task result for task 12673a56-9f93-e4fe-1358-00000000022b 15330 1726882267.30809: done sending task result for task 12673a56-9f93-e4fe-1358-00000000022b 15330 1726882267.30811: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882267.30863: no more pending results, returning what we have 15330 1726882267.30867: results queue empty 15330 1726882267.30869: checking for any_errors_fatal 15330 1726882267.30880: done checking for any_errors_fatal 15330 1726882267.30880: checking for max_fail_percentage 15330 1726882267.30882: done checking for max_fail_percentage 15330 1726882267.30883: checking to see if all hosts have failed and the running result is not ok 15330 1726882267.30884: done checking to see if all hosts have failed 15330 1726882267.30885: getting the remaining hosts for this loop 15330 1726882267.30886: done getting the remaining hosts for this loop 15330 1726882267.30891: getting the next task for host managed_node3 15330 1726882267.30906: done getting next task for host managed_node3 15330 1726882267.30909: ^ task is: TASK: meta (flush_handlers) 15330 1726882267.30911: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
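The assert task that just passed evaluates the conditional `interface_stat.stat.exists`, i.e. it reduces to an existence check on the sysfs entry gathered by the earlier `stat` task. A sketch of the equivalent check, simulated on a temporary path rather than the real `/sys/class/net` entry:

```shell
# The assertion boils down to: does the interface's sysfs entry exist?
# (path simulated with a temp file; the real check would target /sys/class/net/<iface>)
entry="$(mktemp -d)/LSR-TST-br31"
touch "$entry"
if [ -e "$entry" ]; then
    echo "All assertions passed"
fi
rm -rf "$(dirname "$entry")"
```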
False 15330 1726882267.30915: getting variables 15330 1726882267.30917: in VariableManager get_vars() 15330 1726882267.30950: Calling all_inventory to load vars for managed_node3 15330 1726882267.30953: Calling groups_inventory to load vars for managed_node3 15330 1726882267.30957: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882267.30969: Calling all_plugins_play to load vars for managed_node3 15330 1726882267.30973: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882267.30976: Calling groups_plugins_play to load vars for managed_node3 15330 1726882267.32133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882267.33773: done with get_vars() 15330 1726882267.33795: done getting variables 15330 1726882267.33844: in VariableManager get_vars() 15330 1726882267.33851: Calling all_inventory to load vars for managed_node3 15330 1726882267.33853: Calling groups_inventory to load vars for managed_node3 15330 1726882267.33854: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882267.33858: Calling all_plugins_play to load vars for managed_node3 15330 1726882267.33860: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882267.33862: Calling groups_plugins_play to load vars for managed_node3 15330 1726882267.34602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882267.35564: done with get_vars() 15330 1726882267.35590: done queuing things up, now waiting for results queue to drain 15330 1726882267.35592: results queue empty 15330 1726882267.35595: checking for any_errors_fatal 15330 1726882267.35597: done checking for any_errors_fatal 15330 1726882267.35598: checking for max_fail_percentage 15330 1726882267.35599: done checking for max_fail_percentage 15330 1726882267.35600: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882267.35600: done checking to see if all hosts have failed 15330 1726882267.35606: getting the remaining hosts for this loop 15330 1726882267.35607: done getting the remaining hosts for this loop 15330 1726882267.35610: getting the next task for host managed_node3 15330 1726882267.35614: done getting next task for host managed_node3 15330 1726882267.35616: ^ task is: TASK: meta (flush_handlers) 15330 1726882267.35617: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882267.35620: getting variables 15330 1726882267.35621: in VariableManager get_vars() 15330 1726882267.35629: Calling all_inventory to load vars for managed_node3 15330 1726882267.35631: Calling groups_inventory to load vars for managed_node3 15330 1726882267.35633: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882267.35638: Calling all_plugins_play to load vars for managed_node3 15330 1726882267.35640: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882267.35643: Calling groups_plugins_play to load vars for managed_node3 15330 1726882267.36721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882267.38260: done with get_vars() 15330 1726882267.38283: done getting variables 15330 1726882267.38335: in VariableManager get_vars() 15330 1726882267.38344: Calling all_inventory to load vars for managed_node3 15330 1726882267.38346: Calling groups_inventory to load vars for managed_node3 15330 1726882267.38348: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882267.38352: Calling all_plugins_play to load vars for managed_node3 15330 1726882267.38355: Calling groups_plugins_inventory to load vars for 
managed_node3 15330 1726882267.38357: Calling groups_plugins_play to load vars for managed_node3 15330 1726882267.39526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882267.41020: done with get_vars() 15330 1726882267.41047: done queuing things up, now waiting for results queue to drain 15330 1726882267.41049: results queue empty 15330 1726882267.41050: checking for any_errors_fatal 15330 1726882267.41051: done checking for any_errors_fatal 15330 1726882267.41052: checking for max_fail_percentage 15330 1726882267.41053: done checking for max_fail_percentage 15330 1726882267.41053: checking to see if all hosts have failed and the running result is not ok 15330 1726882267.41054: done checking to see if all hosts have failed 15330 1726882267.41055: getting the remaining hosts for this loop 15330 1726882267.41056: done getting the remaining hosts for this loop 15330 1726882267.41061: getting the next task for host managed_node3 15330 1726882267.41065: done getting next task for host managed_node3 15330 1726882267.41065: ^ task is: None 15330 1726882267.41067: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882267.41068: done queuing things up, now waiting for results queue to drain 15330 1726882267.41069: results queue empty 15330 1726882267.41070: checking for any_errors_fatal 15330 1726882267.41071: done checking for any_errors_fatal 15330 1726882267.41071: checking for max_fail_percentage 15330 1726882267.41072: done checking for max_fail_percentage 15330 1726882267.41073: checking to see if all hosts have failed and the running result is not ok 15330 1726882267.41074: done checking to see if all hosts have failed 15330 1726882267.41075: getting the next task for host managed_node3 15330 1726882267.41077: done getting next task for host managed_node3 15330 1726882267.41078: ^ task is: None 15330 1726882267.41079: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882267.41124: in VariableManager get_vars() 15330 1726882267.41140: done with get_vars() 15330 1726882267.41146: in VariableManager get_vars() 15330 1726882267.41156: done with get_vars() 15330 1726882267.41161: variable 'omit' from source: magic vars 15330 1726882267.41277: variable 'task' from source: play vars 15330 1726882267.41314: in VariableManager get_vars() 15330 1726882267.41325: done with get_vars() 15330 1726882267.41344: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15330 1726882267.41569: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882267.41596: getting the remaining hosts for this loop 15330 1726882267.41598: done getting the remaining hosts for this loop 15330 1726882267.41601: getting the next task for host managed_node3 15330 1726882267.41603: done getting next task for host managed_node3 15330 1726882267.41605: ^ task is: TASK: Gathering Facts 15330 1726882267.41607: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882267.41609: getting variables 15330 1726882267.41609: in VariableManager get_vars() 15330 1726882267.41618: Calling all_inventory to load vars for managed_node3 15330 1726882267.41620: Calling groups_inventory to load vars for managed_node3 15330 1726882267.41623: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882267.41628: Calling all_plugins_play to load vars for managed_node3 15330 1726882267.41630: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882267.41633: Calling groups_plugins_play to load vars for managed_node3 15330 1726882267.42795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882267.44370: done with get_vars() 15330 1726882267.44391: done getting variables 15330 1726882267.44436: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:31:07 -0400 (0:00:00.158) 0:00:16.650 ****** 15330 1726882267.44461: entering _queue_task() for managed_node3/gather_facts 15330 1726882267.44811: worker is 1 (out of 1 available) 15330 1726882267.44823: exiting _queue_task() for managed_node3/gather_facts 15330 1726882267.44836: done queuing things up, now waiting for results queue to drain 15330 1726882267.44837: waiting for pending results... 
15330 1726882267.45309: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882267.45314: in run() - task 12673a56-9f93-e4fe-1358-00000000024e 15330 1726882267.45317: variable 'ansible_search_path' from source: unknown 15330 1726882267.45320: calling self._execute() 15330 1726882267.45365: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882267.45379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882267.45398: variable 'omit' from source: magic vars 15330 1726882267.45780: variable 'ansible_distribution_major_version' from source: facts 15330 1726882267.45804: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882267.45814: variable 'omit' from source: magic vars 15330 1726882267.45845: variable 'omit' from source: magic vars 15330 1726882267.45891: variable 'omit' from source: magic vars 15330 1726882267.45938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882267.45980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882267.46011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882267.46034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882267.46052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882267.46096: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882267.46105: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882267.46113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882267.46220: Set connection var ansible_pipelining to False 15330 1726882267.46239: Set 
connection var ansible_timeout to 10 15330 1726882267.46300: Set connection var ansible_connection to ssh 15330 1726882267.46303: Set connection var ansible_shell_type to sh 15330 1726882267.46305: Set connection var ansible_shell_executable to /bin/sh 15330 1726882267.46307: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882267.46309: variable 'ansible_shell_executable' from source: unknown 15330 1726882267.46310: variable 'ansible_connection' from source: unknown 15330 1726882267.46312: variable 'ansible_module_compression' from source: unknown 15330 1726882267.46315: variable 'ansible_shell_type' from source: unknown 15330 1726882267.46317: variable 'ansible_shell_executable' from source: unknown 15330 1726882267.46321: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882267.46330: variable 'ansible_pipelining' from source: unknown 15330 1726882267.46337: variable 'ansible_timeout' from source: unknown 15330 1726882267.46345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882267.46538: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882267.46552: variable 'omit' from source: magic vars 15330 1726882267.46562: starting attempt loop 15330 1726882267.46625: running the handler 15330 1726882267.46629: variable 'ansible_facts' from source: unknown 15330 1726882267.46631: _low_level_execute_command(): starting 15330 1726882267.46634: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882267.47373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882267.47398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
15330 1726882267.47503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882267.47525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.47619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.49289: stdout chunk (state=3): >>>/root <<< 15330 1726882267.49391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.49440: stderr chunk (state=3): >>><<< 15330 1726882267.49459: stdout chunk (state=3): >>><<< 15330 1726882267.49498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882267.49516: _low_level_execute_command(): starting 15330 1726882267.49525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517 `" && echo ansible-tmp-1726882267.4950428-16126-91565553318517="` echo /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517 `" ) && sleep 0' 15330 1726882267.50200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882267.50216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882267.50230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882267.50259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882267.50277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882267.50297: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882267.50377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.50424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882267.50448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882267.50482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.50556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.52407: stdout chunk (state=3): >>>ansible-tmp-1726882267.4950428-16126-91565553318517=/root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517 <<< 15330 1726882267.52548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.52560: stderr chunk (state=3): >>><<< 15330 1726882267.52567: stdout chunk (state=3): >>><<< 15330 1726882267.52595: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882267.4950428-16126-91565553318517=/root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882267.52632: variable 'ansible_module_compression' from source: unknown 15330 1726882267.52798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882267.52801: variable 'ansible_facts' from source: unknown 15330 1726882267.52971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py 15330 1726882267.53208: Sending initial data 15330 1726882267.53218: Sent initial data (153 bytes) 15330 1726882267.53736: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882267.53750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882267.53764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882267.53780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882267.53806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.53901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882267.53927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.53992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.55524: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882267.55595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882267.55640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmptyjkp7tj /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py <<< 15330 1726882267.55652: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py" <<< 15330 1726882267.55701: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15330 1726882267.55715: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmptyjkp7tj" to remote "/root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py" <<< 15330 1726882267.57361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.57365: stderr chunk (state=3): >>><<< 15330 1726882267.57367: stdout chunk (state=3): >>><<< 15330 1726882267.57370: done transferring module to remote 15330 1726882267.57381: _low_level_execute_command(): starting 15330 1726882267.57390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/ /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py && sleep 0' 15330 1726882267.58109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.58159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882267.58175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882267.58192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882267.58268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882267.60009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882267.60033: stderr chunk (state=3): >>><<< 15330 1726882267.60060: stdout chunk (state=3): >>><<< 15330 1726882267.60082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882267.60167: _low_level_execute_command(): starting 15330 1726882267.60171: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/AnsiballZ_setup.py && sleep 0' 15330 1726882267.60766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882267.60808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882267.60821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882267.60867: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882267.60932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882267.60994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
15330 1726882267.61037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882268.24536: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "07", "epoch": "1726882267", "epoch_int": "1726882267", "date": "2024-09-20", "time": "21:31:07", "iso8601_micro": "2024-09-21T01:31:07.872018Z", "iso8601": "2024-09-21T01:31:07Z", "iso8601_basic": "20240920T213107872018", "iso8601_basic_short": "20240920T213107", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_loadavg": {"1m": 1.025390625, "5m": 0.48291015625, "15m": 0.22314453125}, 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 575, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805170688, "block_size": 4096, "block_total": 65519099, "block_available": 63917278, "block_used": 1601821, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "26:84:62:af:e1:90", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on 
[fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": 
"off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882268.27003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882268.27008: stdout chunk (state=3): >>><<< 15330 1726882268.27010: stderr chunk (state=3): >>><<< 15330 1726882268.27013: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "07", "epoch": "1726882267", "epoch_int": "1726882267", "date": "2024-09-20", "time": "21:31:07", "iso8601_micro": "2024-09-21T01:31:07.872018Z", "iso8601": "2024-09-21T01:31:07Z", "iso8601_basic": "20240920T213107872018", "iso8601_basic_short": "20240920T213107", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_loadavg": {"1m": 1.025390625, "5m": 0.48291015625, "15m": 0.22314453125}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 575, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805170688, "block_size": 4096, "block_total": 65519099, "block_available": 63917278, "block_used": 1601821, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "26:84:62:af:e1:90", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": 
"8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882268.28059: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882268.28090: _low_level_execute_command(): starting 15330 1726882268.28130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882267.4950428-16126-91565553318517/ > /dev/null 2>&1 && sleep 0' 15330 1726882268.29011: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882268.29051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882268.29070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882268.29100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882268.29246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882268.31110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882268.31114: stdout chunk (state=3): >>><<< 15330 1726882268.31116: stderr chunk (state=3): >>><<< 15330 1726882268.31298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882268.31302: handler run complete 15330 1726882268.31313: variable 'ansible_facts' from source: unknown 15330 1726882268.31430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.31804: variable 'ansible_facts' from source: unknown 15330 1726882268.31910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.32071: attempt loop complete, returning result 15330 1726882268.32087: _execute() done 15330 1726882268.32095: dumping result to json 15330 1726882268.32133: done dumping result, returning 15330 1726882268.32144: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-00000000024e] 15330 1726882268.32152: sending task result for task 12673a56-9f93-e4fe-1358-00000000024e 15330 1726882268.33101: done sending task result for task 12673a56-9f93-e4fe-1358-00000000024e 15330 1726882268.33113: WORKER PROCESS EXITING ok: [managed_node3] 15330 1726882268.33533: no more pending results, returning what we have 15330 1726882268.33536: results queue empty 
15330 1726882268.33537: checking for any_errors_fatal 15330 1726882268.33547: done checking for any_errors_fatal 15330 1726882268.33548: checking for max_fail_percentage 15330 1726882268.33550: done checking for max_fail_percentage 15330 1726882268.33551: checking to see if all hosts have failed and the running result is not ok 15330 1726882268.33552: done checking to see if all hosts have failed 15330 1726882268.33552: getting the remaining hosts for this loop 15330 1726882268.33553: done getting the remaining hosts for this loop 15330 1726882268.33557: getting the next task for host managed_node3 15330 1726882268.33561: done getting next task for host managed_node3 15330 1726882268.33563: ^ task is: TASK: meta (flush_handlers) 15330 1726882268.33565: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882268.33569: getting variables 15330 1726882268.33570: in VariableManager get_vars() 15330 1726882268.33591: Calling all_inventory to load vars for managed_node3 15330 1726882268.33595: Calling groups_inventory to load vars for managed_node3 15330 1726882268.33598: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.33608: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.33611: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.33615: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.35044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.37306: done with get_vars() 15330 1726882268.37331: done getting variables 15330 1726882268.37417: in VariableManager get_vars() 15330 1726882268.37427: Calling all_inventory to load vars for managed_node3 15330 1726882268.37429: Calling groups_inventory to load vars for managed_node3 15330 1726882268.37432: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.37436: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.37439: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.37441: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.38207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.39207: done with get_vars() 15330 1726882268.39230: done queuing things up, now waiting for results queue to drain 15330 1726882268.39232: results queue empty 15330 1726882268.39232: checking for any_errors_fatal 15330 1726882268.39235: done checking for any_errors_fatal 15330 1726882268.39236: checking for max_fail_percentage 15330 1726882268.39237: done checking for max_fail_percentage 15330 1726882268.39238: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882268.39239: done checking to see if all hosts have failed 15330 1726882268.39243: getting the remaining hosts for this loop 15330 1726882268.39244: done getting the remaining hosts for this loop 15330 1726882268.39247: getting the next task for host managed_node3 15330 1726882268.39250: done getting next task for host managed_node3 15330 1726882268.39252: ^ task is: TASK: Include the task '{{ task }}' 15330 1726882268.39254: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882268.39257: getting variables 15330 1726882268.39258: in VariableManager get_vars() 15330 1726882268.39270: Calling all_inventory to load vars for managed_node3 15330 1726882268.39274: Calling groups_inventory to load vars for managed_node3 15330 1726882268.39276: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.39281: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.39283: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.39286: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.40431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.42191: done with get_vars() 15330 1726882268.42215: done getting variables 15330 1726882268.42418: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:31:08 -0400 (0:00:00.979) 0:00:17.630 ****** 15330 1726882268.42458: entering _queue_task() for managed_node3/include_tasks 15330 1726882268.43027: worker is 
1 (out of 1 available) 15330 1726882268.43039: exiting _queue_task() for managed_node3/include_tasks 15330 1726882268.43050: done queuing things up, now waiting for results queue to drain 15330 1726882268.43052: waiting for pending results... 15330 1726882268.43243: running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_profile_present.yml' 15330 1726882268.43313: in run() - task 12673a56-9f93-e4fe-1358-000000000031 15330 1726882268.43331: variable 'ansible_search_path' from source: unknown 15330 1726882268.43354: calling self._execute() 15330 1726882268.43454: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.43498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.43509: variable 'omit' from source: magic vars 15330 1726882268.44008: variable 'ansible_distribution_major_version' from source: facts 15330 1726882268.44017: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882268.44024: variable 'task' from source: play vars 15330 1726882268.44073: variable 'task' from source: play vars 15330 1726882268.44080: _execute() done 15330 1726882268.44083: dumping result to json 15330 1726882268.44086: done dumping result, returning 15330 1726882268.44097: done running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_profile_present.yml' [12673a56-9f93-e4fe-1358-000000000031] 15330 1726882268.44107: sending task result for task 12673a56-9f93-e4fe-1358-000000000031 15330 1726882268.44385: done sending task result for task 12673a56-9f93-e4fe-1358-000000000031 15330 1726882268.44391: WORKER PROCESS EXITING 15330 1726882268.44433: no more pending results, returning what we have 15330 1726882268.44442: in VariableManager get_vars() 15330 1726882268.44474: Calling all_inventory to load vars for managed_node3 15330 1726882268.44477: Calling groups_inventory to load vars for managed_node3 15330 1726882268.44481: Calling 
all_plugins_inventory to load vars for managed_node3 15330 1726882268.44498: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.44504: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.44514: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.46625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.48073: done with get_vars() 15330 1726882268.48089: variable 'ansible_search_path' from source: unknown 15330 1726882268.48101: we have included files to process 15330 1726882268.48102: generating all_blocks data 15330 1726882268.48103: done generating all_blocks data 15330 1726882268.48103: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15330 1726882268.48104: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15330 1726882268.48106: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15330 1726882268.48234: in VariableManager get_vars() 15330 1726882268.48245: done with get_vars() 15330 1726882268.48421: done processing included file 15330 1726882268.48422: iterating over new_blocks loaded from include file 15330 1726882268.48423: in VariableManager get_vars() 15330 1726882268.48431: done with get_vars() 15330 1726882268.48432: filtering new block on tags 15330 1726882268.48444: done filtering new block on tags 15330 1726882268.48445: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 15330 1726882268.48449: extending task lists for all hosts with included blocks 15330 
1726882268.48468: done extending task lists 15330 1726882268.48468: done processing included files 15330 1726882268.48469: results queue empty 15330 1726882268.48469: checking for any_errors_fatal 15330 1726882268.48470: done checking for any_errors_fatal 15330 1726882268.48471: checking for max_fail_percentage 15330 1726882268.48471: done checking for max_fail_percentage 15330 1726882268.48472: checking to see if all hosts have failed and the running result is not ok 15330 1726882268.48472: done checking to see if all hosts have failed 15330 1726882268.48473: getting the remaining hosts for this loop 15330 1726882268.48474: done getting the remaining hosts for this loop 15330 1726882268.48475: getting the next task for host managed_node3 15330 1726882268.48477: done getting next task for host managed_node3 15330 1726882268.48479: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15330 1726882268.48481: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882268.48482: getting variables 15330 1726882268.48482: in VariableManager get_vars() 15330 1726882268.48488: Calling all_inventory to load vars for managed_node3 15330 1726882268.48489: Calling groups_inventory to load vars for managed_node3 15330 1726882268.48491: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.48496: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.48497: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.48499: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.49187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.50551: done with get_vars() 15330 1726882268.50569: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:31:08 -0400 (0:00:00.082) 0:00:17.712 ****** 15330 1726882268.50682: entering _queue_task() for managed_node3/include_tasks 15330 1726882268.51088: worker is 1 (out of 1 available) 15330 1726882268.51101: exiting _queue_task() for managed_node3/include_tasks 15330 1726882268.51113: done queuing things up, now waiting for results queue to drain 15330 1726882268.51115: waiting for pending results... 
15330 1726882268.51626: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 15330 1726882268.51632: in run() - task 12673a56-9f93-e4fe-1358-00000000025f 15330 1726882268.51636: variable 'ansible_search_path' from source: unknown 15330 1726882268.51638: variable 'ansible_search_path' from source: unknown 15330 1726882268.51641: calling self._execute() 15330 1726882268.51764: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.51775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.51801: variable 'omit' from source: magic vars 15330 1726882268.52273: variable 'ansible_distribution_major_version' from source: facts 15330 1726882268.52277: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882268.52280: _execute() done 15330 1726882268.52283: dumping result to json 15330 1726882268.52285: done dumping result, returning 15330 1726882268.52287: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-e4fe-1358-00000000025f] 15330 1726882268.52289: sending task result for task 12673a56-9f93-e4fe-1358-00000000025f 15330 1726882268.52408: no more pending results, returning what we have 15330 1726882268.52415: in VariableManager get_vars() 15330 1726882268.52447: Calling all_inventory to load vars for managed_node3 15330 1726882268.52449: Calling groups_inventory to load vars for managed_node3 15330 1726882268.52452: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.52465: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.52468: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.52470: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.53237: done sending task result for task 12673a56-9f93-e4fe-1358-00000000025f 15330 1726882268.53240: WORKER PROCESS EXITING 15330 
1726882268.54492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.56909: done with get_vars() 15330 1726882268.56931: variable 'ansible_search_path' from source: unknown 15330 1726882268.56933: variable 'ansible_search_path' from source: unknown 15330 1726882268.56944: variable 'task' from source: play vars 15330 1726882268.57082: variable 'task' from source: play vars 15330 1726882268.57127: we have included files to process 15330 1726882268.57129: generating all_blocks data 15330 1726882268.57130: done generating all_blocks data 15330 1726882268.57132: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15330 1726882268.57133: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15330 1726882268.57135: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15330 1726882268.58522: done processing included file 15330 1726882268.58564: iterating over new_blocks loaded from include file 15330 1726882268.58566: in VariableManager get_vars() 15330 1726882268.58584: done with get_vars() 15330 1726882268.58590: filtering new block on tags 15330 1726882268.58616: done filtering new block on tags 15330 1726882268.58622: in VariableManager get_vars() 15330 1726882268.58640: done with get_vars() 15330 1726882268.58642: filtering new block on tags 15330 1726882268.58663: done filtering new block on tags 15330 1726882268.58665: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 15330 1726882268.58671: extending task lists for all hosts with included blocks 15330 1726882268.58898: done extending 
task lists 15330 1726882268.58899: done processing included files 15330 1726882268.58900: results queue empty 15330 1726882268.58901: checking for any_errors_fatal 15330 1726882268.58904: done checking for any_errors_fatal 15330 1726882268.58905: checking for max_fail_percentage 15330 1726882268.58906: done checking for max_fail_percentage 15330 1726882268.58907: checking to see if all hosts have failed and the running result is not ok 15330 1726882268.58908: done checking to see if all hosts have failed 15330 1726882268.58908: getting the remaining hosts for this loop 15330 1726882268.58909: done getting the remaining hosts for this loop 15330 1726882268.58912: getting the next task for host managed_node3 15330 1726882268.58916: done getting next task for host managed_node3 15330 1726882268.58918: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15330 1726882268.58921: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882268.58929: getting variables 15330 1726882268.58930: in VariableManager get_vars() 15330 1726882268.59085: Calling all_inventory to load vars for managed_node3 15330 1726882268.59091: Calling groups_inventory to load vars for managed_node3 15330 1726882268.59095: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.59100: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.59102: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.59105: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.65149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.67761: done with get_vars() 15330 1726882268.67791: done getting variables 15330 1726882268.67833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:31:08 -0400 (0:00:00.172) 0:00:17.885 ****** 15330 1726882268.67976: entering _queue_task() for managed_node3/set_fact 15330 1726882268.68779: worker is 1 (out of 1 available) 15330 1726882268.68941: exiting _queue_task() for managed_node3/set_fact 15330 1726882268.68952: done queuing things up, now waiting for results queue to drain 15330 1726882268.68953: waiting for pending results... 
15330 1726882268.69451: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 15330 1726882268.69625: in run() - task 12673a56-9f93-e4fe-1358-00000000026c 15330 1726882268.69648: variable 'ansible_search_path' from source: unknown 15330 1726882268.69657: variable 'ansible_search_path' from source: unknown 15330 1726882268.69716: calling self._execute() 15330 1726882268.69836: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.69850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.69866: variable 'omit' from source: magic vars 15330 1726882268.70336: variable 'ansible_distribution_major_version' from source: facts 15330 1726882268.70464: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882268.70471: variable 'omit' from source: magic vars 15330 1726882268.70474: variable 'omit' from source: magic vars 15330 1726882268.70477: variable 'omit' from source: magic vars 15330 1726882268.70525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882268.70576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882268.70614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882268.70638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882268.70655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882268.70714: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882268.70727: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.70736: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 15330 1726882268.70868: Set connection var ansible_pipelining to False 15330 1726882268.70900: Set connection var ansible_timeout to 10 15330 1726882268.70999: Set connection var ansible_connection to ssh 15330 1726882268.71002: Set connection var ansible_shell_type to sh 15330 1726882268.71012: Set connection var ansible_shell_executable to /bin/sh 15330 1726882268.71015: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882268.71018: variable 'ansible_shell_executable' from source: unknown 15330 1726882268.71023: variable 'ansible_connection' from source: unknown 15330 1726882268.71029: variable 'ansible_module_compression' from source: unknown 15330 1726882268.71031: variable 'ansible_shell_type' from source: unknown 15330 1726882268.71033: variable 'ansible_shell_executable' from source: unknown 15330 1726882268.71035: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.71037: variable 'ansible_pipelining' from source: unknown 15330 1726882268.71043: variable 'ansible_timeout' from source: unknown 15330 1726882268.71046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.71271: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882268.71276: variable 'omit' from source: magic vars 15330 1726882268.71279: starting attempt loop 15330 1726882268.71281: running the handler 15330 1726882268.71283: handler run complete 15330 1726882268.71295: attempt loop complete, returning result 15330 1726882268.71304: _execute() done 15330 1726882268.71311: dumping result to json 15330 1726882268.71320: done dumping result, returning 15330 1726882268.71340: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-e4fe-1358-00000000026c] 15330 1726882268.71379: sending task result for task 12673a56-9f93-e4fe-1358-00000000026c 15330 1726882268.71582: done sending task result for task 12673a56-9f93-e4fe-1358-00000000026c 15330 1726882268.71586: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15330 1726882268.71656: no more pending results, returning what we have 15330 1726882268.71666: results queue empty 15330 1726882268.71668: checking for any_errors_fatal 15330 1726882268.71670: done checking for any_errors_fatal 15330 1726882268.71671: checking for max_fail_percentage 15330 1726882268.71673: done checking for max_fail_percentage 15330 1726882268.71674: checking to see if all hosts have failed and the running result is not ok 15330 1726882268.71674: done checking to see if all hosts have failed 15330 1726882268.71675: getting the remaining hosts for this loop 15330 1726882268.71677: done getting the remaining hosts for this loop 15330 1726882268.71681: getting the next task for host managed_node3 15330 1726882268.71692: done getting next task for host managed_node3 15330 1726882268.71800: ^ task is: TASK: Stat profile file 15330 1726882268.71807: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882268.71812: getting variables 15330 1726882268.71813: in VariableManager get_vars() 15330 1726882268.71843: Calling all_inventory to load vars for managed_node3 15330 1726882268.71846: Calling groups_inventory to load vars for managed_node3 15330 1726882268.71850: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882268.71861: Calling all_plugins_play to load vars for managed_node3 15330 1726882268.71864: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882268.71867: Calling groups_plugins_play to load vars for managed_node3 15330 1726882268.74924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882268.77718: done with get_vars() 15330 1726882268.77744: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:31:08 -0400 (0:00:00.098) 0:00:17.984 ****** 15330 1726882268.77871: entering _queue_task() for managed_node3/stat 15330 1726882268.78354: worker is 1 (out of 1 available) 15330 1726882268.78369: exiting _queue_task() for managed_node3/stat 15330 1726882268.78380: done queuing things up, now waiting for results queue to drain 15330 1726882268.78381: waiting for pending results... 
15330 1726882268.78773: running TaskExecutor() for managed_node3/TASK: Stat profile file 15330 1726882268.78816: in run() - task 12673a56-9f93-e4fe-1358-00000000026d 15330 1726882268.78837: variable 'ansible_search_path' from source: unknown 15330 1726882268.78845: variable 'ansible_search_path' from source: unknown 15330 1726882268.78902: calling self._execute() 15330 1726882268.79037: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.79055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.79095: variable 'omit' from source: magic vars 15330 1726882268.79637: variable 'ansible_distribution_major_version' from source: facts 15330 1726882268.79641: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882268.79644: variable 'omit' from source: magic vars 15330 1726882268.79699: variable 'omit' from source: magic vars 15330 1726882268.79809: variable 'profile' from source: play vars 15330 1726882268.79818: variable 'interface' from source: set_fact 15330 1726882268.79904: variable 'interface' from source: set_fact 15330 1726882268.79935: variable 'omit' from source: magic vars 15330 1726882268.80007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882268.80040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882268.80065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882268.80106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882268.80127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882268.80188: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 
1726882268.80192: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.80197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.80490: Set connection var ansible_pipelining to False 15330 1726882268.80600: Set connection var ansible_timeout to 10 15330 1726882268.80603: Set connection var ansible_connection to ssh 15330 1726882268.80605: Set connection var ansible_shell_type to sh 15330 1726882268.80607: Set connection var ansible_shell_executable to /bin/sh 15330 1726882268.80609: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882268.80611: variable 'ansible_shell_executable' from source: unknown 15330 1726882268.80613: variable 'ansible_connection' from source: unknown 15330 1726882268.80614: variable 'ansible_module_compression' from source: unknown 15330 1726882268.80616: variable 'ansible_shell_type' from source: unknown 15330 1726882268.80618: variable 'ansible_shell_executable' from source: unknown 15330 1726882268.80619: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882268.80621: variable 'ansible_pipelining' from source: unknown 15330 1726882268.80623: variable 'ansible_timeout' from source: unknown 15330 1726882268.80625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882268.81151: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882268.81167: variable 'omit' from source: magic vars 15330 1726882268.81178: starting attempt loop 15330 1726882268.81184: running the handler 15330 1726882268.81203: _low_level_execute_command(): starting 15330 1726882268.81216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882268.83283: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882268.83340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882268.83543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882268.83669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882268.85346: stdout chunk (state=3): >>>/root <<< 15330 1726882268.85475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882268.85486: stdout chunk (state=3): >>><<< 15330 1726882268.85502: stderr chunk (state=3): >>><<< 15330 1726882268.85531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882268.85631: _low_level_execute_command(): starting 15330 1726882268.85635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927 `" && echo ansible-tmp-1726882268.8553886-16180-115042505927927="` echo /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927 `" ) && sleep 0' 15330 1726882268.86183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882268.86198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882268.86218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882268.86239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882268.86311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882268.86371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882268.86389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882268.86412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882268.86498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882268.88400: stdout chunk (state=3): >>>ansible-tmp-1726882268.8553886-16180-115042505927927=/root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927 <<< 15330 1726882268.88568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882268.88572: stdout chunk (state=3): >>><<< 15330 1726882268.88574: stderr chunk (state=3): >>><<< 15330 1726882268.88913: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882268.8553886-16180-115042505927927=/root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882268.88916: variable 'ansible_module_compression' from source: unknown 15330 1726882268.88922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15330 1726882268.88962: variable 'ansible_facts' from source: unknown 15330 1726882268.89162: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py 15330 1726882268.89488: Sending initial data 15330 1726882268.89491: Sent initial data (153 bytes) 15330 1726882268.90219: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882268.90228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882268.90252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882268.90256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882268.90313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882268.90323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882268.90369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882268.91933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882268.92247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882268.92292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmprsmxr_tq /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py <<< 15330 1726882268.92298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py" <<< 15330 1726882268.92358: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmprsmxr_tq" to remote "/root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py" <<< 15330 1726882268.93540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882268.93590: stderr chunk (state=3): >>><<< 15330 1726882268.93598: stdout chunk (state=3): >>><<< 15330 1726882268.93622: done transferring module to remote 15330 1726882268.93631: _low_level_execute_command(): starting 15330 1726882268.93637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/ /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py && sleep 0' 15330 1726882268.94229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882268.94239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882268.94249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882268.94263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882268.94275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 
1726882268.94284: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882268.94304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882268.94310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882268.94399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882268.94409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882268.94421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882268.94486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882268.96564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882268.96569: stdout chunk (state=3): >>><<< 15330 1726882268.96572: stderr chunk (state=3): >>><<< 15330 1726882268.96576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882268.96579: _low_level_execute_command(): starting 15330 1726882268.96581: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/AnsiballZ_stat.py && sleep 0' 15330 1726882268.97658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882268.98001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882268.98232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882268.98285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.13376: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15330 1726882269.14819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882269.14824: stderr chunk (state=3): >>><<< 15330 1726882269.14839: stdout chunk (state=3): >>><<< 15330 1726882269.14857: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882269.14883: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882269.14892: _low_level_execute_command(): starting 15330 1726882269.14900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882268.8553886-16180-115042505927927/ > /dev/null 2>&1 && sleep 0' 15330 1726882269.15559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.15565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882269.15600: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.15604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.15645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882269.15649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.15711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.17587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882269.17600: stdout chunk (state=3): >>><<< 15330 1726882269.17603: stderr chunk (state=3): >>><<< 15330 1726882269.17634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882269.17640: handler run complete 15330 1726882269.17669: attempt loop complete, returning result 15330 1726882269.17672: _execute() done 15330 1726882269.17675: dumping result to json 15330 1726882269.17677: done dumping result, returning 15330 1726882269.17682: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-e4fe-1358-00000000026d] 15330 1726882269.17703: sending task result for task 12673a56-9f93-e4fe-1358-00000000026d 15330 1726882269.17945: done sending task result for task 12673a56-9f93-e4fe-1358-00000000026d 15330 1726882269.17948: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15330 1726882269.18021: no more pending results, returning what we have 15330 1726882269.18025: results queue empty 15330 1726882269.18026: checking for any_errors_fatal 15330 1726882269.18034: done checking for any_errors_fatal 15330 1726882269.18035: checking for max_fail_percentage 15330 1726882269.18037: done checking for max_fail_percentage 15330 1726882269.18038: checking to see if all hosts have failed and the running result is not ok 15330 1726882269.18039: done checking to see if all hosts have failed 15330 1726882269.18040: getting the remaining hosts for this loop 15330 1726882269.18041: done getting the remaining hosts for this loop 15330 1726882269.18045: getting the next task for host managed_node3 15330 1726882269.18169: done getting next task for host managed_node3 15330 
1726882269.18173: ^ task is: TASK: Set NM profile exist flag based on the profile files 15330 1726882269.18176: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882269.18183: getting variables 15330 1726882269.18185: in VariableManager get_vars() 15330 1726882269.18230: Calling all_inventory to load vars for managed_node3 15330 1726882269.18233: Calling groups_inventory to load vars for managed_node3 15330 1726882269.18237: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882269.18251: Calling all_plugins_play to load vars for managed_node3 15330 1726882269.18257: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882269.18261: Calling groups_plugins_play to load vars for managed_node3 15330 1726882269.21185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882269.23226: done with get_vars() 15330 1726882269.23251: done getting variables 15330 1726882269.23322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:31:09 -0400 (0:00:00.454) 0:00:18.439 ****** 15330 1726882269.23359: entering _queue_task() for managed_node3/set_fact 15330 1726882269.23836: worker is 1 (out of 1 available) 15330 1726882269.23847: exiting _queue_task() for managed_node3/set_fact 15330 1726882269.23857: done queuing things up, now waiting for results queue to drain 15330 1726882269.23858: waiting for pending results... 15330 1726882269.24311: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 15330 1726882269.24318: in run() - task 12673a56-9f93-e4fe-1358-00000000026e 15330 1726882269.24322: variable 'ansible_search_path' from source: unknown 15330 1726882269.24325: variable 'ansible_search_path' from source: unknown 15330 1726882269.24328: calling self._execute() 15330 1726882269.24346: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.24353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.24363: variable 'omit' from source: magic vars 15330 1726882269.24792: variable 'ansible_distribution_major_version' from source: facts 15330 1726882269.24853: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882269.25168: variable 'profile_stat' from source: set_fact 15330 1726882269.25220: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882269.25234: when evaluation is False, skipping this task 15330 1726882269.25245: _execute() done 15330 1726882269.25333: dumping result to json 15330 1726882269.25337: done dumping 
result, returning 15330 1726882269.25340: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-e4fe-1358-00000000026e] 15330 1726882269.25345: sending task result for task 12673a56-9f93-e4fe-1358-00000000026e skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15330 1726882269.25600: no more pending results, returning what we have 15330 1726882269.25604: results queue empty 15330 1726882269.25605: checking for any_errors_fatal 15330 1726882269.25622: done checking for any_errors_fatal 15330 1726882269.25624: checking for max_fail_percentage 15330 1726882269.25625: done checking for max_fail_percentage 15330 1726882269.25626: checking to see if all hosts have failed and the running result is not ok 15330 1726882269.25627: done checking to see if all hosts have failed 15330 1726882269.25628: getting the remaining hosts for this loop 15330 1726882269.25629: done getting the remaining hosts for this loop 15330 1726882269.25636: getting the next task for host managed_node3 15330 1726882269.25646: done getting next task for host managed_node3 15330 1726882269.25649: ^ task is: TASK: Get NM profile info 15330 1726882269.25654: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882269.25657: getting variables 15330 1726882269.25659: in VariableManager get_vars() 15330 1726882269.25704: Calling all_inventory to load vars for managed_node3 15330 1726882269.25709: Calling groups_inventory to load vars for managed_node3 15330 1726882269.25713: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882269.25726: Calling all_plugins_play to load vars for managed_node3 15330 1726882269.25729: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882269.25732: Calling groups_plugins_play to load vars for managed_node3 15330 1726882269.26311: done sending task result for task 12673a56-9f93-e4fe-1358-00000000026e 15330 1726882269.26315: WORKER PROCESS EXITING 15330 1726882269.27924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882269.29632: done with get_vars() 15330 1726882269.29657: done getting variables 15330 1726882269.29778: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:31:09 -0400 (0:00:00.064) 0:00:18.504 ****** 15330 1726882269.29813: entering _queue_task() for managed_node3/shell 15330 1726882269.29815: Creating lock for shell 15330 1726882269.30243: worker is 1 (out of 1 available) 15330 1726882269.30257: exiting _queue_task() for managed_node3/shell 15330 
1726882269.30269: done queuing things up, now waiting for results queue to drain 15330 1726882269.30271: waiting for pending results... 15330 1726882269.30606: running TaskExecutor() for managed_node3/TASK: Get NM profile info 15330 1726882269.30811: in run() - task 12673a56-9f93-e4fe-1358-00000000026f 15330 1726882269.30815: variable 'ansible_search_path' from source: unknown 15330 1726882269.30819: variable 'ansible_search_path' from source: unknown 15330 1726882269.30828: calling self._execute() 15330 1726882269.30928: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.30941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.30956: variable 'omit' from source: magic vars 15330 1726882269.31425: variable 'ansible_distribution_major_version' from source: facts 15330 1726882269.31444: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882269.31461: variable 'omit' from source: magic vars 15330 1726882269.31527: variable 'omit' from source: magic vars 15330 1726882269.31702: variable 'profile' from source: play vars 15330 1726882269.31711: variable 'interface' from source: set_fact 15330 1726882269.31825: variable 'interface' from source: set_fact 15330 1726882269.31857: variable 'omit' from source: magic vars 15330 1726882269.31920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882269.32015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882269.32066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882269.32091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882269.32133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 15330 1726882269.32155: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882269.32198: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.32201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.32360: Set connection var ansible_pipelining to False 15330 1726882269.32364: Set connection var ansible_timeout to 10 15330 1726882269.32366: Set connection var ansible_connection to ssh 15330 1726882269.32368: Set connection var ansible_shell_type to sh 15330 1726882269.32375: Set connection var ansible_shell_executable to /bin/sh 15330 1726882269.32421: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882269.32447: variable 'ansible_shell_executable' from source: unknown 15330 1726882269.32464: variable 'ansible_connection' from source: unknown 15330 1726882269.32566: variable 'ansible_module_compression' from source: unknown 15330 1726882269.32569: variable 'ansible_shell_type' from source: unknown 15330 1726882269.32572: variable 'ansible_shell_executable' from source: unknown 15330 1726882269.32575: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.32582: variable 'ansible_pipelining' from source: unknown 15330 1726882269.32586: variable 'ansible_timeout' from source: unknown 15330 1726882269.32594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.32830: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882269.32927: variable 'omit' from source: magic vars 15330 1726882269.32931: starting attempt loop 15330 1726882269.32933: running the handler 15330 
1726882269.32937: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882269.32940: _low_level_execute_command(): starting 15330 1726882269.32942: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882269.33817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.33919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882269.33939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882269.34018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.34205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.36050: stdout chunk (state=3): >>>/root <<< 15330 1726882269.36053: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 15330 1726882269.36055: stdout chunk (state=3): >>><<< 15330 1726882269.36057: stderr chunk (state=3): >>><<< 15330 1726882269.36062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882269.36065: _low_level_execute_command(): starting 15330 1726882269.36068: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511 `" && echo ansible-tmp-1726882269.3596706-16206-89966096309511="` echo /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511 `" ) && sleep 0' 15330 1726882269.36925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 15330 1726882269.36935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882269.36937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882269.36940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.36942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.36945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.37001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882269.37005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.37065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.39399: stdout chunk (state=3): >>>ansible-tmp-1726882269.3596706-16206-89966096309511=/root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511 <<< 15330 1726882269.39403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882269.39405: stdout chunk (state=3): >>><<< 15330 1726882269.39408: stderr chunk (state=3): >>><<< 15330 1726882269.39599: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882269.3596706-16206-89966096309511=/root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882269.39603: variable 'ansible_module_compression' from source: unknown 15330 1726882269.39606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15330 1726882269.39608: variable 'ansible_facts' from source: unknown 15330 1726882269.39634: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py 15330 1726882269.39933: Sending initial data 15330 1726882269.40108: Sent initial data (155 bytes) 15330 1726882269.40965: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.40981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.41109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.41152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882269.41173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882269.41209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.41290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.42816: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882269.42885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882269.42927: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp9o__yeco /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py <<< 15330 1726882269.42936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py" <<< 15330 1726882269.43001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp9o__yeco" to remote "/root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py" <<< 15330 1726882269.44870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882269.44932: stderr chunk (state=3): >>><<< 15330 1726882269.44940: stdout chunk (state=3): >>><<< 15330 1726882269.45104: done transferring module to remote 15330 1726882269.45119: _low_level_execute_command(): starting 15330 1726882269.45130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/ /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py && sleep 0' 15330 1726882269.46363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882269.46478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882269.46490: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.46509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.46564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882269.46606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882269.46698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.46902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.48503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882269.48545: stderr chunk (state=3): >>><<< 15330 1726882269.48549: stdout chunk (state=3): >>><<< 15330 1726882269.48571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882269.48581: _low_level_execute_command(): starting 15330 1726882269.48590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/AnsiballZ_command.py && sleep 0' 15330 1726882269.49856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882269.49870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882269.49981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882269.50106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882269.50128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.50266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.66950: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:31:09.650849", "end": "2024-09-20 21:31:09.667789", "delta": "0:00:00.016940", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15330 1726882269.68501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882269.68652: stdout chunk (state=3): >>><<< 15330 1726882269.68656: stderr chunk (state=3): >>><<< 15330 1726882269.68659: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:31:09.650849", "end": "2024-09-20 21:31:09.667789", "delta": "0:00:00.016940", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882269.68662: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882269.68665: _low_level_execute_command(): starting 15330 1726882269.68667: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882269.3596706-16206-89966096309511/ > /dev/null 2>&1 && sleep 0' 15330 1726882269.69440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882269.69456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882269.69470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.69508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 15330 1726882269.69521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882269.69542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882269.69627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882269.69653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882269.69671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882269.69748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882269.71599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882269.71602: stdout chunk (state=3): >>><<< 15330 1726882269.71605: stderr chunk (state=3): >>><<< 15330 1726882269.71607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882269.71609: handler run complete 15330 1726882269.71900: Evaluated conditional (False): False 15330 1726882269.71904: attempt loop complete, returning result 15330 1726882269.71906: _execute() done 15330 1726882269.71908: dumping result to json 15330 1726882269.71910: done dumping result, returning 15330 1726882269.71912: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-e4fe-1358-00000000026f] 15330 1726882269.71914: sending task result for task 12673a56-9f93-e4fe-1358-00000000026f 15330 1726882269.71980: done sending task result for task 12673a56-9f93-e4fe-1358-00000000026f 15330 1726882269.71984: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.016940", "end": "2024-09-20 21:31:09.667789", "rc": 0, "start": "2024-09-20 21:31:09.650849" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15330 1726882269.72055: no more pending results, returning what we have 15330 1726882269.72058: results queue empty 15330 1726882269.72059: checking for any_errors_fatal 15330 1726882269.72066: done checking for any_errors_fatal 15330 1726882269.72067: checking for max_fail_percentage 15330 1726882269.72069: done checking for max_fail_percentage 15330 1726882269.72069: checking to see if all hosts have failed and the running result is not ok 15330 1726882269.72070: done checking to see if all hosts have failed 15330 1726882269.72071: getting the remaining hosts for this loop 15330 1726882269.72072: done getting the remaining hosts for this loop 15330 
1726882269.72075: getting the next task for host managed_node3 15330 1726882269.72082: done getting next task for host managed_node3 15330 1726882269.72084: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15330 1726882269.72091: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882269.72096: getting variables 15330 1726882269.72098: in VariableManager get_vars() 15330 1726882269.72127: Calling all_inventory to load vars for managed_node3 15330 1726882269.72129: Calling groups_inventory to load vars for managed_node3 15330 1726882269.72132: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882269.72143: Calling all_plugins_play to load vars for managed_node3 15330 1726882269.72146: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882269.72148: Calling groups_plugins_play to load vars for managed_node3 15330 1726882269.74779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882269.78118: done with get_vars() 15330 1726882269.78146: done getting variables 15330 1726882269.78518: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:31:09 -0400 (0:00:00.487) 0:00:18.991 ****** 15330 1726882269.78552: entering _queue_task() for managed_node3/set_fact 15330 1726882269.79526: worker is 1 (out of 1 available) 15330 1726882269.79536: exiting _queue_task() for managed_node3/set_fact 15330 1726882269.79547: done queuing things up, now waiting for results queue to drain 15330 1726882269.79548: waiting for pending results... 
15330 1726882269.79833: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15330 1726882269.80031: in run() - task 12673a56-9f93-e4fe-1358-000000000270 15330 1726882269.80035: variable 'ansible_search_path' from source: unknown 15330 1726882269.80039: variable 'ansible_search_path' from source: unknown 15330 1726882269.80299: calling self._execute() 15330 1726882269.80331: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.80346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.80363: variable 'omit' from source: magic vars 15330 1726882269.81200: variable 'ansible_distribution_major_version' from source: facts 15330 1726882269.81204: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882269.81523: variable 'nm_profile_exists' from source: set_fact 15330 1726882269.81546: Evaluated conditional (nm_profile_exists.rc == 0): True 15330 1726882269.81575: variable 'omit' from source: magic vars 15330 1726882269.81648: variable 'omit' from source: magic vars 15330 1726882269.81822: variable 'omit' from source: magic vars 15330 1726882269.81868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882269.82099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882269.82104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882269.82106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882269.82109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882269.82333: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
15330 1726882269.82336: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.82338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.82339: Set connection var ansible_pipelining to False 15330 1726882269.82410: Set connection var ansible_timeout to 10 15330 1726882269.82449: Set connection var ansible_connection to ssh 15330 1726882269.82458: Set connection var ansible_shell_type to sh 15330 1726882269.82470: Set connection var ansible_shell_executable to /bin/sh 15330 1726882269.82481: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882269.82574: variable 'ansible_shell_executable' from source: unknown 15330 1726882269.82582: variable 'ansible_connection' from source: unknown 15330 1726882269.82591: variable 'ansible_module_compression' from source: unknown 15330 1726882269.82601: variable 'ansible_shell_type' from source: unknown 15330 1726882269.82607: variable 'ansible_shell_executable' from source: unknown 15330 1726882269.82613: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.82620: variable 'ansible_pipelining' from source: unknown 15330 1726882269.82626: variable 'ansible_timeout' from source: unknown 15330 1726882269.82634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.82923: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882269.82938: variable 'omit' from source: magic vars 15330 1726882269.82948: starting attempt loop 15330 1726882269.82999: running the handler 15330 1726882269.83308: handler run complete 15330 1726882269.83312: attempt loop complete, returning result 15330 1726882269.83315: _execute() done 
15330 1726882269.83317: dumping result to json 15330 1726882269.83319: done dumping result, returning 15330 1726882269.83321: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-e4fe-1358-000000000270] 15330 1726882269.83323: sending task result for task 12673a56-9f93-e4fe-1358-000000000270 15330 1726882269.83389: done sending task result for task 12673a56-9f93-e4fe-1358-000000000270 15330 1726882269.83392: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15330 1726882269.83467: no more pending results, returning what we have 15330 1726882269.83471: results queue empty 15330 1726882269.83472: checking for any_errors_fatal 15330 1726882269.83481: done checking for any_errors_fatal 15330 1726882269.83481: checking for max_fail_percentage 15330 1726882269.83483: done checking for max_fail_percentage 15330 1726882269.83484: checking to see if all hosts have failed and the running result is not ok 15330 1726882269.83485: done checking to see if all hosts have failed 15330 1726882269.83486: getting the remaining hosts for this loop 15330 1726882269.83491: done getting the remaining hosts for this loop 15330 1726882269.83497: getting the next task for host managed_node3 15330 1726882269.83509: done getting next task for host managed_node3 15330 1726882269.83511: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15330 1726882269.83516: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882269.83520: getting variables 15330 1726882269.83522: in VariableManager get_vars() 15330 1726882269.83551: Calling all_inventory to load vars for managed_node3 15330 1726882269.83554: Calling groups_inventory to load vars for managed_node3 15330 1726882269.83558: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882269.83570: Calling all_plugins_play to load vars for managed_node3 15330 1726882269.83573: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882269.83577: Calling groups_plugins_play to load vars for managed_node3 15330 1726882269.87091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882269.90451: done with get_vars() 15330 1726882269.90477: done getting variables 15330 1726882269.90647: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882269.90946: variable 'profile' from source: play vars 15330 1726882269.90950: variable 'interface' from source: set_fact 15330 1726882269.91013: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:31:09 -0400 (0:00:00.125) 0:00:19.117 ****** 15330 1726882269.91138: entering _queue_task() for managed_node3/command 15330 1726882269.91942: worker is 1 (out of 1 available) 15330 1726882269.91953: exiting _queue_task() for managed_node3/command 15330 1726882269.91965: done queuing things up, now waiting for results queue to drain 15330 1726882269.91966: waiting for pending results... 15330 1726882269.92306: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15330 1726882269.92684: in run() - task 12673a56-9f93-e4fe-1358-000000000272 15330 1726882269.92705: variable 'ansible_search_path' from source: unknown 15330 1726882269.92709: variable 'ansible_search_path' from source: unknown 15330 1726882269.92741: calling self._execute() 15330 1726882269.93000: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882269.93004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882269.93007: variable 'omit' from source: magic vars 15330 1726882269.93713: variable 'ansible_distribution_major_version' from source: facts 15330 1726882269.93723: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882269.93945: variable 'profile_stat' from source: set_fact 15330 1726882269.93957: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882269.93960: when evaluation is False, skipping this task 15330 1726882269.93963: _execute() done 15330 1726882269.93965: dumping result to json 15330 1726882269.93967: done dumping result, returning 15330 1726882269.93998: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000272] 15330 1726882269.94001: sending task result for task 
12673a56-9f93-e4fe-1358-000000000272 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15330 1726882269.94364: no more pending results, returning what we have 15330 1726882269.94367: results queue empty 15330 1726882269.94368: checking for any_errors_fatal 15330 1726882269.94376: done checking for any_errors_fatal 15330 1726882269.94377: checking for max_fail_percentage 15330 1726882269.94378: done checking for max_fail_percentage 15330 1726882269.94379: checking to see if all hosts have failed and the running result is not ok 15330 1726882269.94380: done checking to see if all hosts have failed 15330 1726882269.94380: getting the remaining hosts for this loop 15330 1726882269.94381: done getting the remaining hosts for this loop 15330 1726882269.94385: getting the next task for host managed_node3 15330 1726882269.94396: done getting next task for host managed_node3 15330 1726882269.94399: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15330 1726882269.94404: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882269.94408: getting variables 15330 1726882269.94409: in VariableManager get_vars() 15330 1726882269.94442: Calling all_inventory to load vars for managed_node3 15330 1726882269.94445: Calling groups_inventory to load vars for managed_node3 15330 1726882269.94449: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882269.94460: Calling all_plugins_play to load vars for managed_node3 15330 1726882269.94463: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882269.94465: Calling groups_plugins_play to load vars for managed_node3 15330 1726882269.95501: done sending task result for task 12673a56-9f93-e4fe-1358-000000000272 15330 1726882269.95504: WORKER PROCESS EXITING 15330 1726882269.97013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.00748: done with get_vars() 15330 1726882270.00771: done getting variables 15330 1726882270.00833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882270.01231: variable 'profile' from source: play vars 15330 1726882270.01235: variable 'interface' from source: set_fact 15330 1726882270.01410: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:31:10 -0400 (0:00:00.103) 0:00:19.220 ****** 15330 1726882270.01442: entering _queue_task() for managed_node3/set_fact 15330 1726882270.02142: worker is 1 (out of 1 available) 15330 1726882270.02158: exiting _queue_task() for managed_node3/set_fact 15330 
1726882270.02170: done queuing things up, now waiting for results queue to drain 15330 1726882270.02171: waiting for pending results... 15330 1726882270.02573: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15330 1726882270.02675: in run() - task 12673a56-9f93-e4fe-1358-000000000273 15330 1726882270.02692: variable 'ansible_search_path' from source: unknown 15330 1726882270.02696: variable 'ansible_search_path' from source: unknown 15330 1726882270.03199: calling self._execute() 15330 1726882270.03202: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.03205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.03207: variable 'omit' from source: magic vars 15330 1726882270.03872: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.03884: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.04273: variable 'profile_stat' from source: set_fact 15330 1726882270.04287: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882270.04291: when evaluation is False, skipping this task 15330 1726882270.04295: _execute() done 15330 1726882270.04422: dumping result to json 15330 1726882270.04426: done dumping result, returning 15330 1726882270.04433: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000273] 15330 1726882270.04436: sending task result for task 12673a56-9f93-e4fe-1358-000000000273 15330 1726882270.04713: done sending task result for task 12673a56-9f93-e4fe-1358-000000000273 15330 1726882270.04716: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15330 1726882270.04765: no more pending results, returning what we have 15330 1726882270.04768: results 
queue empty 15330 1726882270.04769: checking for any_errors_fatal 15330 1726882270.04775: done checking for any_errors_fatal 15330 1726882270.04776: checking for max_fail_percentage 15330 1726882270.04778: done checking for max_fail_percentage 15330 1726882270.04779: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.04780: done checking to see if all hosts have failed 15330 1726882270.04780: getting the remaining hosts for this loop 15330 1726882270.04782: done getting the remaining hosts for this loop 15330 1726882270.04786: getting the next task for host managed_node3 15330 1726882270.04794: done getting next task for host managed_node3 15330 1726882270.04797: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15330 1726882270.04801: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882270.04805: getting variables 15330 1726882270.04806: in VariableManager get_vars() 15330 1726882270.04837: Calling all_inventory to load vars for managed_node3 15330 1726882270.04839: Calling groups_inventory to load vars for managed_node3 15330 1726882270.04843: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.04856: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.04858: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.04861: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.07349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.10576: done with get_vars() 15330 1726882270.10810: done getting variables 15330 1726882270.10878: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882270.10997: variable 'profile' from source: play vars 15330 1726882270.11002: variable 'interface' from source: set_fact 15330 1726882270.11060: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:31:10 -0400 (0:00:00.096) 0:00:19.316 ****** 15330 1726882270.11294: entering _queue_task() for managed_node3/command 15330 1726882270.12127: worker is 1 (out of 1 available) 15330 1726882270.12138: exiting _queue_task() for managed_node3/command 15330 1726882270.12148: done queuing things up, now waiting for results queue to drain 15330 1726882270.12149: waiting for pending results... 
15330 1726882270.12530: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15330 1726882270.12919: in run() - task 12673a56-9f93-e4fe-1358-000000000274 15330 1726882270.12923: variable 'ansible_search_path' from source: unknown 15330 1726882270.12926: variable 'ansible_search_path' from source: unknown 15330 1726882270.12929: calling self._execute() 15330 1726882270.13042: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.13498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.13501: variable 'omit' from source: magic vars 15330 1726882270.13861: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.13880: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.14064: variable 'profile_stat' from source: set_fact 15330 1726882270.14097: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882270.14111: when evaluation is False, skipping this task 15330 1726882270.14125: _execute() done 15330 1726882270.14138: dumping result to json 15330 1726882270.14145: done dumping result, returning 15330 1726882270.14154: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000274] 15330 1726882270.14161: sending task result for task 12673a56-9f93-e4fe-1358-000000000274 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15330 1726882270.14339: no more pending results, returning what we have 15330 1726882270.14342: results queue empty 15330 1726882270.14344: checking for any_errors_fatal 15330 1726882270.14351: done checking for any_errors_fatal 15330 1726882270.14351: checking for max_fail_percentage 15330 1726882270.14353: done checking for max_fail_percentage 15330 1726882270.14354: checking to see if all hosts 
have failed and the running result is not ok 15330 1726882270.14355: done checking to see if all hosts have failed 15330 1726882270.14356: getting the remaining hosts for this loop 15330 1726882270.14357: done getting the remaining hosts for this loop 15330 1726882270.14361: getting the next task for host managed_node3 15330 1726882270.14369: done getting next task for host managed_node3 15330 1726882270.14372: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15330 1726882270.14377: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882270.14381: getting variables 15330 1726882270.14383: in VariableManager get_vars() 15330 1726882270.14415: Calling all_inventory to load vars for managed_node3 15330 1726882270.14418: Calling groups_inventory to load vars for managed_node3 15330 1726882270.14422: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.14436: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.14439: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.14442: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.15399: done sending task result for task 12673a56-9f93-e4fe-1358-000000000274 15330 1726882270.15403: WORKER PROCESS EXITING 15330 1726882270.17682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.21087: done with get_vars() 15330 1726882270.21121: done getting variables 15330 1726882270.21228: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882270.21457: variable 'profile' from source: play vars 15330 1726882270.21462: variable 'interface' from source: set_fact 15330 1726882270.21634: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:31:10 -0400 (0:00:00.105) 0:00:19.422 ****** 15330 1726882270.21662: entering _queue_task() for managed_node3/set_fact 15330 1726882270.22332: worker is 1 (out of 1 available) 15330 1726882270.22343: exiting _queue_task() for managed_node3/set_fact 15330 
1726882270.22575: done queuing things up, now waiting for results queue to drain 15330 1726882270.22577: waiting for pending results... 15330 1726882270.22866: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15330 1726882270.23120: in run() - task 12673a56-9f93-e4fe-1358-000000000275 15330 1726882270.23140: variable 'ansible_search_path' from source: unknown 15330 1726882270.23147: variable 'ansible_search_path' from source: unknown 15330 1726882270.23184: calling self._execute() 15330 1726882270.23276: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.23287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.23301: variable 'omit' from source: magic vars 15330 1726882270.23663: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.23680: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.23974: variable 'profile_stat' from source: set_fact 15330 1726882270.24301: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882270.24304: when evaluation is False, skipping this task 15330 1726882270.24306: _execute() done 15330 1726882270.24309: dumping result to json 15330 1726882270.24311: done dumping result, returning 15330 1726882270.24313: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000275] 15330 1726882270.24315: sending task result for task 12673a56-9f93-e4fe-1358-000000000275 15330 1726882270.24381: done sending task result for task 12673a56-9f93-e4fe-1358-000000000275 15330 1726882270.24383: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15330 1726882270.24450: no more pending results, returning what we have 15330 1726882270.24455: results queue empty 
15330 1726882270.24456: checking for any_errors_fatal 15330 1726882270.24462: done checking for any_errors_fatal 15330 1726882270.24462: checking for max_fail_percentage 15330 1726882270.24464: done checking for max_fail_percentage 15330 1726882270.24465: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.24466: done checking to see if all hosts have failed 15330 1726882270.24466: getting the remaining hosts for this loop 15330 1726882270.24468: done getting the remaining hosts for this loop 15330 1726882270.24472: getting the next task for host managed_node3 15330 1726882270.24481: done getting next task for host managed_node3 15330 1726882270.24484: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15330 1726882270.24488: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882270.24494: getting variables 15330 1726882270.24496: in VariableManager get_vars() 15330 1726882270.24524: Calling all_inventory to load vars for managed_node3 15330 1726882270.24526: Calling groups_inventory to load vars for managed_node3 15330 1726882270.24529: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.24542: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.24544: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.24547: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.27206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.30155: done with get_vars() 15330 1726882270.30184: done getting variables 15330 1726882270.30450: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882270.30770: variable 'profile' from source: play vars 15330 1726882270.30774: variable 'interface' from source: set_fact 15330 1726882270.30835: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:31:10 -0400 (0:00:00.092) 0:00:19.514 ****** 15330 1726882270.30867: entering _queue_task() for managed_node3/assert 15330 1726882270.31536: worker is 1 (out of 1 available) 15330 1726882270.31548: exiting _queue_task() for managed_node3/assert 15330 1726882270.31561: done queuing things up, now waiting for results queue to drain 15330 1726882270.31562: waiting for pending results... 
15330 1726882270.31774: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'LSR-TST-br31' 15330 1726882270.31883: in run() - task 12673a56-9f93-e4fe-1358-000000000260 15330 1726882270.31905: variable 'ansible_search_path' from source: unknown 15330 1726882270.31913: variable 'ansible_search_path' from source: unknown 15330 1726882270.31963: calling self._execute() 15330 1726882270.32060: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.32097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.32100: variable 'omit' from source: magic vars 15330 1726882270.32442: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.32458: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.32468: variable 'omit' from source: magic vars 15330 1726882270.32520: variable 'omit' from source: magic vars 15330 1726882270.32720: variable 'profile' from source: play vars 15330 1726882270.32724: variable 'interface' from source: set_fact 15330 1726882270.32726: variable 'interface' from source: set_fact 15330 1726882270.32731: variable 'omit' from source: magic vars 15330 1726882270.32776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882270.32819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882270.32855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882270.32876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.32892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.32928: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 15330 1726882270.32945: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.32957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.33068: Set connection var ansible_pipelining to False 15330 1726882270.33169: Set connection var ansible_timeout to 10 15330 1726882270.33172: Set connection var ansible_connection to ssh 15330 1726882270.33174: Set connection var ansible_shell_type to sh 15330 1726882270.33176: Set connection var ansible_shell_executable to /bin/sh 15330 1726882270.33179: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882270.33181: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.33183: variable 'ansible_connection' from source: unknown 15330 1726882270.33184: variable 'ansible_module_compression' from source: unknown 15330 1726882270.33186: variable 'ansible_shell_type' from source: unknown 15330 1726882270.33188: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.33190: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.33191: variable 'ansible_pipelining' from source: unknown 15330 1726882270.33196: variable 'ansible_timeout' from source: unknown 15330 1726882270.33198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.33336: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882270.33351: variable 'omit' from source: magic vars 15330 1726882270.33386: starting attempt loop 15330 1726882270.33390: running the handler 15330 1726882270.33490: variable 'lsr_net_profile_exists' from source: set_fact 15330 1726882270.33506: Evaluated conditional 
(lsr_net_profile_exists): True 15330 1726882270.33530: handler run complete 15330 1726882270.33540: attempt loop complete, returning result 15330 1726882270.33548: _execute() done 15330 1726882270.33608: dumping result to json 15330 1726882270.33612: done dumping result, returning 15330 1726882270.33616: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'LSR-TST-br31' [12673a56-9f93-e4fe-1358-000000000260] 15330 1726882270.33618: sending task result for task 12673a56-9f93-e4fe-1358-000000000260 15330 1726882270.33684: done sending task result for task 12673a56-9f93-e4fe-1358-000000000260 15330 1726882270.33687: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882270.33777: no more pending results, returning what we have 15330 1726882270.33781: results queue empty 15330 1726882270.33782: checking for any_errors_fatal 15330 1726882270.33795: done checking for any_errors_fatal 15330 1726882270.33796: checking for max_fail_percentage 15330 1726882270.33798: done checking for max_fail_percentage 15330 1726882270.33799: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.33800: done checking to see if all hosts have failed 15330 1726882270.33800: getting the remaining hosts for this loop 15330 1726882270.33802: done getting the remaining hosts for this loop 15330 1726882270.33805: getting the next task for host managed_node3 15330 1726882270.33813: done getting next task for host managed_node3 15330 1726882270.33817: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15330 1726882270.33820: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882270.33825: getting variables 15330 1726882270.33827: in VariableManager get_vars() 15330 1726882270.33858: Calling all_inventory to load vars for managed_node3 15330 1726882270.33861: Calling groups_inventory to load vars for managed_node3 15330 1726882270.33866: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.33878: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.33882: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.33885: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.36397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.38077: done with get_vars() 15330 1726882270.38105: done getting variables 15330 1726882270.38169: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882270.38290: variable 'profile' from source: play vars 15330 1726882270.38296: variable 'interface' from source: set_fact 15330 1726882270.38355: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:31:10 -0400 
(0:00:00.076) 0:00:19.591 ****** 15330 1726882270.38509: entering _queue_task() for managed_node3/assert 15330 1726882270.39182: worker is 1 (out of 1 available) 15330 1726882270.39196: exiting _queue_task() for managed_node3/assert 15330 1726882270.39208: done queuing things up, now waiting for results queue to drain 15330 1726882270.39209: waiting for pending results... 15330 1726882270.39757: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15330 1726882270.39901: in run() - task 12673a56-9f93-e4fe-1358-000000000261 15330 1726882270.39905: variable 'ansible_search_path' from source: unknown 15330 1726882270.39908: variable 'ansible_search_path' from source: unknown 15330 1726882270.39910: calling self._execute() 15330 1726882270.40013: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.40030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.40044: variable 'omit' from source: magic vars 15330 1726882270.40415: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.40441: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.40453: variable 'omit' from source: magic vars 15330 1726882270.40498: variable 'omit' from source: magic vars 15330 1726882270.40659: variable 'profile' from source: play vars 15330 1726882270.40664: variable 'interface' from source: set_fact 15330 1726882270.40698: variable 'interface' from source: set_fact 15330 1726882270.40721: variable 'omit' from source: magic vars 15330 1726882270.40767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882270.40815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882270.40840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 
1726882270.40861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.40883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.40985: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882270.40989: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.40991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.41043: Set connection var ansible_pipelining to False 15330 1726882270.41061: Set connection var ansible_timeout to 10 15330 1726882270.41068: Set connection var ansible_connection to ssh 15330 1726882270.41075: Set connection var ansible_shell_type to sh 15330 1726882270.41098: Set connection var ansible_shell_executable to /bin/sh 15330 1726882270.41117: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882270.41141: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.41149: variable 'ansible_connection' from source: unknown 15330 1726882270.41156: variable 'ansible_module_compression' from source: unknown 15330 1726882270.41163: variable 'ansible_shell_type' from source: unknown 15330 1726882270.41169: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.41176: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.41200: variable 'ansible_pipelining' from source: unknown 15330 1726882270.41203: variable 'ansible_timeout' from source: unknown 15330 1726882270.41205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.41354: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882270.41416: variable 'omit' from source: magic vars 15330 1726882270.41419: starting attempt loop 15330 1726882270.41422: running the handler 15330 1726882270.41526: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15330 1726882270.41536: Evaluated conditional (lsr_net_profile_ansible_managed): True 15330 1726882270.41548: handler run complete 15330 1726882270.41566: attempt loop complete, returning result 15330 1726882270.41660: _execute() done 15330 1726882270.41669: dumping result to json 15330 1726882270.41677: done dumping result, returning 15330 1726882270.41698: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [12673a56-9f93-e4fe-1358-000000000261] 15330 1726882270.41700: sending task result for task 12673a56-9f93-e4fe-1358-000000000261 15330 1726882270.41809: done sending task result for task 12673a56-9f93-e4fe-1358-000000000261 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882270.41914: no more pending results, returning what we have 15330 1726882270.41918: results queue empty 15330 1726882270.41919: checking for any_errors_fatal 15330 1726882270.41926: done checking for any_errors_fatal 15330 1726882270.41927: checking for max_fail_percentage 15330 1726882270.41929: done checking for max_fail_percentage 15330 1726882270.41931: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.41932: done checking to see if all hosts have failed 15330 1726882270.41932: getting the remaining hosts for this loop 15330 1726882270.41933: done getting the remaining hosts for this loop 15330 1726882270.41937: getting the next task for host managed_node3 15330 1726882270.41944: done getting next task for host managed_node3 15330 
1726882270.41947: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15330 1726882270.41951: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882270.42199: getting variables 15330 1726882270.42201: in VariableManager get_vars() 15330 1726882270.42227: Calling all_inventory to load vars for managed_node3 15330 1726882270.42229: Calling groups_inventory to load vars for managed_node3 15330 1726882270.42232: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.42242: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.42244: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.42247: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.42809: WORKER PROCESS EXITING 15330 1726882270.44701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.45565: done with get_vars() 15330 1726882270.45583: done getting variables 15330 1726882270.45641: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882270.45764: variable 'profile' from source: play vars 15330 1726882270.45767: variable 'interface' 
from source: set_fact 15330 1726882270.45838: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:31:10 -0400 (0:00:00.073) 0:00:19.664 ****** 15330 1726882270.45882: entering _queue_task() for managed_node3/assert 15330 1726882270.46235: worker is 1 (out of 1 available) 15330 1726882270.46246: exiting _queue_task() for managed_node3/assert 15330 1726882270.46258: done queuing things up, now waiting for results queue to drain 15330 1726882270.46259: waiting for pending results... 15330 1726882270.46565: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15330 1726882270.46722: in run() - task 12673a56-9f93-e4fe-1358-000000000262 15330 1726882270.46753: variable 'ansible_search_path' from source: unknown 15330 1726882270.46762: variable 'ansible_search_path' from source: unknown 15330 1726882270.46812: calling self._execute() 15330 1726882270.46934: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.46965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.46980: variable 'omit' from source: magic vars 15330 1726882270.47598: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.47614: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.47619: variable 'omit' from source: magic vars 15330 1726882270.47700: variable 'omit' from source: magic vars 15330 1726882270.47760: variable 'profile' from source: play vars 15330 1726882270.47764: variable 'interface' from source: set_fact 15330 1726882270.47830: variable 'interface' from source: set_fact 15330 1726882270.47900: variable 'omit' from source: magic vars 15330 1726882270.47903: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882270.47918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882270.47942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882270.47958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.47970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.48004: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882270.48007: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.48010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.48121: Set connection var ansible_pipelining to False 15330 1726882270.48124: Set connection var ansible_timeout to 10 15330 1726882270.48126: Set connection var ansible_connection to ssh 15330 1726882270.48129: Set connection var ansible_shell_type to sh 15330 1726882270.48189: Set connection var ansible_shell_executable to /bin/sh 15330 1726882270.48195: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882270.48198: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.48201: variable 'ansible_connection' from source: unknown 15330 1726882270.48203: variable 'ansible_module_compression' from source: unknown 15330 1726882270.48205: variable 'ansible_shell_type' from source: unknown 15330 1726882270.48207: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.48209: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.48211: variable 'ansible_pipelining' from source: unknown 15330 1726882270.48213: variable 'ansible_timeout' from 
source: unknown 15330 1726882270.48215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.48358: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882270.48362: variable 'omit' from source: magic vars 15330 1726882270.48368: starting attempt loop 15330 1726882270.48371: running the handler 15330 1726882270.48452: variable 'lsr_net_profile_fingerprint' from source: set_fact 15330 1726882270.48456: Evaluated conditional (lsr_net_profile_fingerprint): True 15330 1726882270.48461: handler run complete 15330 1726882270.48476: attempt loop complete, returning result 15330 1726882270.48479: _execute() done 15330 1726882270.48481: dumping result to json 15330 1726882270.48484: done dumping result, returning 15330 1726882270.48498: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000262] 15330 1726882270.48501: sending task result for task 12673a56-9f93-e4fe-1358-000000000262 15330 1726882270.48576: done sending task result for task 12673a56-9f93-e4fe-1358-000000000262 15330 1726882270.48578: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882270.48635: no more pending results, returning what we have 15330 1726882270.48637: results queue empty 15330 1726882270.48638: checking for any_errors_fatal 15330 1726882270.48644: done checking for any_errors_fatal 15330 1726882270.48645: checking for max_fail_percentage 15330 1726882270.48646: done checking for max_fail_percentage 15330 1726882270.48648: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.48649: done checking to see if all 
hosts have failed 15330 1726882270.48649: getting the remaining hosts for this loop 15330 1726882270.48650: done getting the remaining hosts for this loop 15330 1726882270.48654: getting the next task for host managed_node3 15330 1726882270.48663: done getting next task for host managed_node3 15330 1726882270.48664: ^ task is: TASK: meta (flush_handlers) 15330 1726882270.48666: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882270.48671: getting variables 15330 1726882270.48672: in VariableManager get_vars() 15330 1726882270.48705: Calling all_inventory to load vars for managed_node3 15330 1726882270.48707: Calling groups_inventory to load vars for managed_node3 15330 1726882270.48714: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.48724: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.48727: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.48729: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.49830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.51644: done with get_vars() 15330 1726882270.51660: done getting variables 15330 1726882270.51713: in VariableManager get_vars() 15330 1726882270.51720: Calling all_inventory to load vars for managed_node3 15330 1726882270.51721: Calling groups_inventory to load vars for managed_node3 15330 1726882270.51723: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.51726: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.51727: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.51729: Calling 
groups_plugins_play to load vars for managed_node3 15330 1726882270.52370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.53878: done with get_vars() 15330 1726882270.53914: done queuing things up, now waiting for results queue to drain 15330 1726882270.53917: results queue empty 15330 1726882270.53917: checking for any_errors_fatal 15330 1726882270.53920: done checking for any_errors_fatal 15330 1726882270.53920: checking for max_fail_percentage 15330 1726882270.53921: done checking for max_fail_percentage 15330 1726882270.53926: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.53927: done checking to see if all hosts have failed 15330 1726882270.53928: getting the remaining hosts for this loop 15330 1726882270.53929: done getting the remaining hosts for this loop 15330 1726882270.53932: getting the next task for host managed_node3 15330 1726882270.53937: done getting next task for host managed_node3 15330 1726882270.53939: ^ task is: TASK: meta (flush_handlers) 15330 1726882270.53941: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882270.53943: getting variables 15330 1726882270.53944: in VariableManager get_vars() 15330 1726882270.53959: Calling all_inventory to load vars for managed_node3 15330 1726882270.53961: Calling groups_inventory to load vars for managed_node3 15330 1726882270.53965: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.53973: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.53975: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.53982: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.55569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.57078: done with get_vars() 15330 1726882270.57110: done getting variables 15330 1726882270.57211: in VariableManager get_vars() 15330 1726882270.57222: Calling all_inventory to load vars for managed_node3 15330 1726882270.57224: Calling groups_inventory to load vars for managed_node3 15330 1726882270.57226: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.57230: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.57235: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.57238: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.58602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.59592: done with get_vars() 15330 1726882270.59617: done queuing things up, now waiting for results queue to drain 15330 1726882270.59619: results queue empty 15330 1726882270.59619: checking for any_errors_fatal 15330 1726882270.59620: done checking for any_errors_fatal 15330 1726882270.59620: checking for max_fail_percentage 15330 1726882270.59621: done checking for max_fail_percentage 15330 1726882270.59622: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882270.59622: done checking to see if all hosts have failed 15330 1726882270.59623: getting the remaining hosts for this loop 15330 1726882270.59623: done getting the remaining hosts for this loop 15330 1726882270.59625: getting the next task for host managed_node3 15330 1726882270.59628: done getting next task for host managed_node3 15330 1726882270.59628: ^ task is: None 15330 1726882270.59629: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882270.59630: done queuing things up, now waiting for results queue to drain 15330 1726882270.59630: results queue empty 15330 1726882270.59631: checking for any_errors_fatal 15330 1726882270.59631: done checking for any_errors_fatal 15330 1726882270.59632: checking for max_fail_percentage 15330 1726882270.59633: done checking for max_fail_percentage 15330 1726882270.59633: checking to see if all hosts have failed and the running result is not ok 15330 1726882270.59633: done checking to see if all hosts have failed 15330 1726882270.59634: getting the next task for host managed_node3 15330 1726882270.59636: done getting next task for host managed_node3 15330 1726882270.59636: ^ task is: None 15330 1726882270.59637: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882270.59673: in VariableManager get_vars() 15330 1726882270.59690: done with get_vars() 15330 1726882270.59697: in VariableManager get_vars() 15330 1726882270.59705: done with get_vars() 15330 1726882270.59708: variable 'omit' from source: magic vars 15330 1726882270.59791: variable 'profile' from source: play vars 15330 1726882270.59874: in VariableManager get_vars() 15330 1726882270.59884: done with get_vars() 15330 1726882270.59901: variable 'omit' from source: magic vars 15330 1726882270.59944: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15330 1726882270.60351: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882270.60372: getting the remaining hosts for this loop 15330 1726882270.60373: done getting the remaining hosts for this loop 15330 1726882270.60374: getting the next task for host managed_node3 15330 1726882270.60376: done getting next task for host managed_node3 15330 1726882270.60378: ^ task is: TASK: Gathering Facts 15330 1726882270.60378: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882270.60380: getting variables 15330 1726882270.60380: in VariableManager get_vars() 15330 1726882270.60432: Calling all_inventory to load vars for managed_node3 15330 1726882270.60435: Calling groups_inventory to load vars for managed_node3 15330 1726882270.60436: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882270.60441: Calling all_plugins_play to load vars for managed_node3 15330 1726882270.60442: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882270.60444: Calling groups_plugins_play to load vars for managed_node3 15330 1726882270.61305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882270.63034: done with get_vars() 15330 1726882270.63057: done getting variables 15330 1726882270.63119: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:31:10 -0400 (0:00:00.172) 0:00:19.837 ****** 15330 1726882270.63153: entering _queue_task() for managed_node3/gather_facts 15330 1726882270.63649: worker is 1 (out of 1 available) 15330 1726882270.63663: exiting _queue_task() for managed_node3/gather_facts 15330 1726882270.63676: done queuing things up, now waiting for results queue to drain 15330 1726882270.63678: waiting for pending results... 
15330 1726882270.63981: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882270.63997: in run() - task 12673a56-9f93-e4fe-1358-0000000002b5 15330 1726882270.64007: variable 'ansible_search_path' from source: unknown 15330 1726882270.64066: calling self._execute() 15330 1726882270.64285: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.64294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.64298: variable 'omit' from source: magic vars 15330 1726882270.64609: variable 'ansible_distribution_major_version' from source: facts 15330 1726882270.64613: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882270.64616: variable 'omit' from source: magic vars 15330 1726882270.64627: variable 'omit' from source: magic vars 15330 1726882270.64666: variable 'omit' from source: magic vars 15330 1726882270.64784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882270.64788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882270.64819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882270.64843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.64846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882270.64865: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882270.64868: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.64871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.64958: Set connection var ansible_pipelining to False 15330 1726882270.64990: Set 
connection var ansible_timeout to 10 15330 1726882270.64997: Set connection var ansible_connection to ssh 15330 1726882270.65000: Set connection var ansible_shell_type to sh 15330 1726882270.65005: Set connection var ansible_shell_executable to /bin/sh 15330 1726882270.65008: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882270.65042: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.65046: variable 'ansible_connection' from source: unknown 15330 1726882270.65051: variable 'ansible_module_compression' from source: unknown 15330 1726882270.65054: variable 'ansible_shell_type' from source: unknown 15330 1726882270.65056: variable 'ansible_shell_executable' from source: unknown 15330 1726882270.65058: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882270.65061: variable 'ansible_pipelining' from source: unknown 15330 1726882270.65063: variable 'ansible_timeout' from source: unknown 15330 1726882270.65065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882270.65192: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882270.65266: variable 'omit' from source: magic vars 15330 1726882270.65270: starting attempt loop 15330 1726882270.65272: running the handler 15330 1726882270.65275: variable 'ansible_facts' from source: unknown 15330 1726882270.65278: _low_level_execute_command(): starting 15330 1726882270.65334: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882270.65984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15330 1726882270.66099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882270.66115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882270.66132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882270.66148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882270.66200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882270.67885: stdout chunk (state=3): >>>/root <<< 15330 1726882270.68007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882270.68041: stderr chunk (state=3): >>><<< 15330 1726882270.68044: stdout chunk (state=3): >>><<< 15330 1726882270.68076: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882270.68113: _low_level_execute_command(): starting 15330 1726882270.68117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957 `" && echo ansible-tmp-1726882270.6807451-16271-35469273935957="` echo /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957 `" ) && sleep 0' 15330 1726882270.68816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882270.68923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882270.68927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882270.69000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882270.70805: stdout chunk (state=3): >>>ansible-tmp-1726882270.6807451-16271-35469273935957=/root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957 <<< 15330 1726882270.70911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882270.70936: stderr chunk (state=3): >>><<< 15330 1726882270.70939: stdout chunk (state=3): >>><<< 15330 1726882270.70962: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882270.6807451-16271-35469273935957=/root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882270.71026: variable 'ansible_module_compression' from source: unknown 15330 1726882270.71061: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882270.71128: variable 'ansible_facts' from source: unknown 15330 1726882270.71268: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py 15330 1726882270.71372: Sending initial data 15330 1726882270.71375: Sent initial data (153 bytes) 15330 1726882270.71838: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882270.71841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882270.71843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882270.71848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882270.71850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882270.71903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882270.71906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882270.71911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882270.71955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882270.73443: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15330 1726882270.73447: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882270.73489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
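The remote command a few chunks back — `( umask 77 && mkdir -p "$(echo /root/.ansible/tmp)" && mkdir ".../ansible-tmp-<epoch>-<pid>-<rand>" ... )` — creates a per-task working directory readable only by the connecting user, and the sftp `put` traffic here uploads `AnsiballZ_setup.py` into it. A hedged Python sketch of that directory pattern (the base path and suffix format are approximated from the log, not taken from Ansible's own code):

```python
import os
import random
import time

# Approximation of the remote tmpdir step seen in the log:
#   umask 77 && mkdir -p <base> && mkdir <base>/ansible-tmp-<epoch>-<pid>-<rand>
base = os.path.expanduser("~/.ansible/tmp")
old_umask = os.umask(0o077)            # umask 77: owner-only permissions
try:
    os.makedirs(base, exist_ok=True)   # mkdir -p, idempotent
    name = "ansible-tmp-{}-{}-{}".format(time.time(), os.getpid(),
                                         random.randint(0, 2**48))
    tmpdir = os.path.join(base, name)
    os.mkdir(tmpdir)                   # no -p: fails loudly on a collision
finally:
    os.umask(old_umask)
print(tmpdir)
```

The trailing `echo ansible-tmp-...=<path>` in the original command is how the controller learns the resolved directory name from stdout, which is what the `done: rc=0, stdout=ansible-tmp-...` line below reports.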
<<< 15330 1726882270.73530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp6bv02t04 /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py <<< 15330 1726882270.73537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py" <<< 15330 1726882270.73574: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp6bv02t04" to remote "/root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py" <<< 15330 1726882270.74741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882270.74787: stderr chunk (state=3): >>><<< 15330 1726882270.74797: stdout chunk (state=3): >>><<< 15330 1726882270.74841: done transferring module to remote 15330 1726882270.74844: _low_level_execute_command(): starting 15330 1726882270.74847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/ /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py && sleep 0' 15330 1726882270.75423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882270.75426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882270.75428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 
1726882270.75430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882270.75432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882270.75508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882270.75514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882270.75570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882270.77288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882270.77304: stderr chunk (state=3): >>><<< 15330 1726882270.77338: stdout chunk (state=3): >>><<< 15330 1726882270.77369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882270.77376: _low_level_execute_command(): starting 15330 1726882270.77379: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/AnsiballZ_setup.py && sleep 0' 15330 1726882270.78049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882270.78052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882270.78055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882270.78057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882270.78059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882270.78061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 
1726882270.78104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882270.78151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882271.40853: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 
SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": 
"4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 578, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805199360, "block_size": 4096, "block_total": 65519099, "block_available": 63917285, "block_used": 1601814, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 1.02294921875, "5m": 0.49169921875, "15m": 0.2275390625}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "11", "epoch": "1726882271", "epoch_int": "1726882271", "date": "2024-09-20", "time": "21:31:11", "iso8601_micro": "2024-09-21T01:31:11.358873Z", "iso8601": "2024-09-21T01:31:11Z", "iso8601_basic": "20240920T213111358873", "iso8601_basic_short": "20240920T213111", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "26:84:62:af:e1:90", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": 
"on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": 
{"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882271.42736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882271.42764: stderr chunk (state=3): >>><<< 15330 1726882271.42767: stdout chunk (state=3): >>><<< 15330 1726882271.42818: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, 
"model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 578, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805199360, "block_size": 4096, "block_total": 65519099, "block_available": 63917285, "block_used": 1601814, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 1.02294921875, "5m": 0.49169921875, "15m": 0.2275390625}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "11", "epoch": "1726882271", "epoch_int": "1726882271", "date": "2024-09-20", "time": "21:31:11", "iso8601_micro": "2024-09-21T01:31:11.358873Z", "iso8601": "2024-09-21T01:31:11Z", "iso8601_basic": "20240920T213111358873", "iso8601_basic_short": "20240920T213111", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", 
"10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "26:84:62:af:e1:90", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", 
"tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", 
"macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.10.229 closed. 15330 1726882271.43152: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882271.43173: _low_level_execute_command(): starting 15330 1726882271.43176: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882270.6807451-16271-35469273935957/ > /dev/null 2>&1 && sleep 0' 15330 1726882271.43842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882271.43845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882271.43847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882271.43869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882271.43928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882271.43935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882271.43985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882271.45755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882271.45819: stderr chunk (state=3): >>><<< 15330 1726882271.45822: stdout chunk (state=3): >>><<< 15330 1726882271.45860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15330 1726882271.45863: handler run complete 15330 1726882271.45963: variable 'ansible_facts' from source: unknown 15330 1726882271.46032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882271.46269: variable 'ansible_facts' from source: unknown 15330 1726882271.46334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882271.46429: attempt loop complete, returning result 15330 1726882271.46432: _execute() done 15330 1726882271.46434: dumping result to json 15330 1726882271.46471: done dumping result, returning 15330 1726882271.46477: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-0000000002b5] 15330 1726882271.46512: sending task result for task 12673a56-9f93-e4fe-1358-0000000002b5 ok: [managed_node3] 15330 1726882271.47186: no more pending results, returning what we have 15330 1726882271.47189: results queue empty 15330 1726882271.47190: checking for any_errors_fatal 15330 1726882271.47191: done checking for any_errors_fatal 15330 1726882271.47191: checking for max_fail_percentage 15330 1726882271.47192: done checking for max_fail_percentage 15330 1726882271.47194: checking to see if all hosts have failed and the running result is not ok 15330 1726882271.47195: done checking to see if all hosts have failed 15330 1726882271.47195: getting the remaining hosts for this loop 15330 1726882271.47196: done getting the remaining hosts for this loop 15330 1726882271.47198: getting the next task for host managed_node3 15330 1726882271.47201: done getting next task for host managed_node3 15330 1726882271.47202: ^ task is: TASK: meta (flush_handlers) 15330 1726882271.47204: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882271.47207: getting variables 15330 1726882271.47208: in VariableManager get_vars() 15330 1726882271.47229: Calling all_inventory to load vars for managed_node3 15330 1726882271.47231: Calling groups_inventory to load vars for managed_node3 15330 1726882271.47233: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882271.47241: Calling all_plugins_play to load vars for managed_node3 15330 1726882271.47243: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882271.47246: Calling groups_plugins_play to load vars for managed_node3 15330 1726882271.47764: done sending task result for task 12673a56-9f93-e4fe-1358-0000000002b5 15330 1726882271.47768: WORKER PROCESS EXITING 15330 1726882271.47965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882271.48964: done with get_vars() 15330 1726882271.48988: done getting variables 15330 1726882271.49063: in VariableManager get_vars() 15330 1726882271.49072: Calling all_inventory to load vars for managed_node3 15330 1726882271.49074: Calling groups_inventory to load vars for managed_node3 15330 1726882271.49076: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882271.49080: Calling all_plugins_play to load vars for managed_node3 15330 1726882271.49081: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882271.49083: Calling groups_plugins_play to load vars for managed_node3 15330 1726882271.53426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882271.54268: done with get_vars() 15330 1726882271.54288: done queuing things up, now waiting for results queue to drain 15330 1726882271.54290: results queue empty 15330 1726882271.54290: checking for any_errors_fatal 15330 1726882271.54294: done 
checking for any_errors_fatal 15330 1726882271.54295: checking for max_fail_percentage 15330 1726882271.54296: done checking for max_fail_percentage 15330 1726882271.54300: checking to see if all hosts have failed and the running result is not ok 15330 1726882271.54300: done checking to see if all hosts have failed 15330 1726882271.54301: getting the remaining hosts for this loop 15330 1726882271.54301: done getting the remaining hosts for this loop 15330 1726882271.54303: getting the next task for host managed_node3 15330 1726882271.54306: done getting next task for host managed_node3 15330 1726882271.54308: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15330 1726882271.54309: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882271.54316: getting variables 15330 1726882271.54316: in VariableManager get_vars() 15330 1726882271.54326: Calling all_inventory to load vars for managed_node3 15330 1726882271.54328: Calling groups_inventory to load vars for managed_node3 15330 1726882271.54329: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882271.54332: Calling all_plugins_play to load vars for managed_node3 15330 1726882271.54334: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882271.54335: Calling groups_plugins_play to load vars for managed_node3 15330 1726882271.54957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882271.55883: done with get_vars() 15330 1726882271.55910: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:31:11 -0400 (0:00:00.928) 0:00:20.766 ****** 15330 1726882271.56015: entering _queue_task() for managed_node3/include_tasks 15330 1726882271.56364: worker is 1 (out of 1 available) 15330 1726882271.56377: exiting _queue_task() for managed_node3/include_tasks 15330 1726882271.56391: done queuing things up, now waiting for results queue to drain 15330 1726882271.56396: waiting for pending results... 
15330 1726882271.56710: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15330 1726882271.56850: in run() - task 12673a56-9f93-e4fe-1358-00000000003a 15330 1726882271.56876: variable 'ansible_search_path' from source: unknown 15330 1726882271.56881: variable 'ansible_search_path' from source: unknown 15330 1726882271.57014: calling self._execute() 15330 1726882271.57092: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882271.57108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882271.57120: variable 'omit' from source: magic vars 15330 1726882271.57572: variable 'ansible_distribution_major_version' from source: facts 15330 1726882271.57575: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882271.57581: _execute() done 15330 1726882271.57584: dumping result to json 15330 1726882271.57592: done dumping result, returning 15330 1726882271.57600: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-e4fe-1358-00000000003a] 15330 1726882271.57603: sending task result for task 12673a56-9f93-e4fe-1358-00000000003a 15330 1726882271.57724: no more pending results, returning what we have 15330 1726882271.57729: in VariableManager get_vars() 15330 1726882271.57766: Calling all_inventory to load vars for managed_node3 15330 1726882271.57768: Calling groups_inventory to load vars for managed_node3 15330 1726882271.57770: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882271.57781: Calling all_plugins_play to load vars for managed_node3 15330 1726882271.57784: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882271.57786: Calling groups_plugins_play to load vars for managed_node3 15330 1726882271.58406: done sending task result for task 12673a56-9f93-e4fe-1358-00000000003a 15330 
1726882271.58409: WORKER PROCESS EXITING 15330 1726882271.59605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882271.60474: done with get_vars() 15330 1726882271.60489: variable 'ansible_search_path' from source: unknown 15330 1726882271.60490: variable 'ansible_search_path' from source: unknown 15330 1726882271.60511: we have included files to process 15330 1726882271.60512: generating all_blocks data 15330 1726882271.60512: done generating all_blocks data 15330 1726882271.60513: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882271.60514: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882271.60515: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882271.60886: done processing included file 15330 1726882271.60891: iterating over new_blocks loaded from include file 15330 1726882271.60892: in VariableManager get_vars() 15330 1726882271.60906: done with get_vars() 15330 1726882271.60908: filtering new block on tags 15330 1726882271.60917: done filtering new block on tags 15330 1726882271.60919: in VariableManager get_vars() 15330 1726882271.60929: done with get_vars() 15330 1726882271.60930: filtering new block on tags 15330 1726882271.60940: done filtering new block on tags 15330 1726882271.60942: in VariableManager get_vars() 15330 1726882271.60953: done with get_vars() 15330 1726882271.60954: filtering new block on tags 15330 1726882271.60962: done filtering new block on tags 15330 1726882271.60963: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 15330 1726882271.60967: extending task lists for all hosts 
with included blocks 15330 1726882271.61201: done extending task lists 15330 1726882271.61203: done processing included files 15330 1726882271.61203: results queue empty 15330 1726882271.61204: checking for any_errors_fatal 15330 1726882271.61205: done checking for any_errors_fatal 15330 1726882271.61206: checking for max_fail_percentage 15330 1726882271.61207: done checking for max_fail_percentage 15330 1726882271.61208: checking to see if all hosts have failed and the running result is not ok 15330 1726882271.61209: done checking to see if all hosts have failed 15330 1726882271.61209: getting the remaining hosts for this loop 15330 1726882271.61210: done getting the remaining hosts for this loop 15330 1726882271.61212: getting the next task for host managed_node3 15330 1726882271.61216: done getting next task for host managed_node3 15330 1726882271.61219: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15330 1726882271.61221: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15330 1726882271.61229: getting variables
15330 1726882271.61230: in VariableManager get_vars()
15330 1726882271.61242: Calling all_inventory to load vars for managed_node3
15330 1726882271.61245: Calling groups_inventory to load vars for managed_node3
15330 1726882271.61247: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882271.61251: Calling all_plugins_play to load vars for managed_node3
15330 1726882271.61254: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882271.61257: Calling groups_plugins_play to load vars for managed_node3
15330 1726882271.62351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882271.63969: done with get_vars()
15330 1726882271.63990: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:31:11 -0400 (0:00:00.080) 0:00:20.846 ******
15330 1726882271.64059: entering _queue_task() for managed_node3/setup
15330 1726882271.64370: worker is 1 (out of 1 available)
15330 1726882271.64382: exiting _queue_task() for managed_node3/setup
15330 1726882271.64598: done queuing things up, now waiting for results queue to drain
15330 1726882271.64599: waiting for pending results...
15330 1726882271.64669: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15330 1726882271.64807: in run() - task 12673a56-9f93-e4fe-1358-0000000002f6 15330 1726882271.64831: variable 'ansible_search_path' from source: unknown 15330 1726882271.64934: variable 'ansible_search_path' from source: unknown 15330 1726882271.64937: calling self._execute() 15330 1726882271.64979: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882271.64997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882271.65014: variable 'omit' from source: magic vars 15330 1726882271.65381: variable 'ansible_distribution_major_version' from source: facts 15330 1726882271.65405: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882271.65640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882271.67895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882271.67977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882271.68024: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882271.68062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882271.68102: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882271.68189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882271.68226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882271.68296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882271.68308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882271.68328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882271.68386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882271.68423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882271.68453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882271.68508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882271.68598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882271.68681: variable '__network_required_facts' from source: role 
'' defaults
15330 1726882271.68702: variable 'ansible_facts' from source: unknown
15330 1726882271.69440: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
15330 1726882271.69447: when evaluation is False, skipping this task
15330 1726882271.69453: _execute() done
15330 1726882271.69458: dumping result to json
15330 1726882271.69464: done dumping result, returning
15330 1726882271.69474: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-e4fe-1358-0000000002f6]
15330 1726882271.69490: sending task result for task 12673a56-9f93-e4fe-1358-0000000002f6
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15330 1726882271.69634: no more pending results, returning what we have
15330 1726882271.69638: results queue empty
15330 1726882271.69640: checking for any_errors_fatal
15330 1726882271.69642: done checking for any_errors_fatal
15330 1726882271.69642: checking for max_fail_percentage
15330 1726882271.69644: done checking for max_fail_percentage
15330 1726882271.69645: checking to see if all hosts have failed and the running result is not ok
15330 1726882271.69646: done checking to see if all hosts have failed
15330 1726882271.69646: getting the remaining hosts for this loop
15330 1726882271.69648: done getting the remaining hosts for this loop
15330 1726882271.69651: getting the next task for host managed_node3
15330 1726882271.69661: done getting next task for host managed_node3
15330 1726882271.69665: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
15330 1726882271.69668: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882271.69682: getting variables
15330 1726882271.69684: in VariableManager get_vars()
15330 1726882271.69730: Calling all_inventory to load vars for managed_node3
15330 1726882271.69733: Calling groups_inventory to load vars for managed_node3
15330 1726882271.69735: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882271.69747: Calling all_plugins_play to load vars for managed_node3
15330 1726882271.69751: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882271.69754: Calling groups_plugins_play to load vars for managed_node3
15330 1726882271.70606: done sending task result for task 12673a56-9f93-e4fe-1358-0000000002f6
15330 1726882271.70610: WORKER PROCESS EXITING
15330 1726882271.71407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882271.72964: done with get_vars()
15330 1726882271.72986: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:31:11 -0400 (0:00:00.090) 0:00:20.936 ******
15330 1726882271.73080: entering _queue_task() for managed_node3/stat
15330 1726882271.73381: worker is 1 (out of 1 available)
15330 1726882271.73396: exiting _queue_task() for managed_node3/stat
15330 1726882271.73408: done queuing things up, now waiting for results queue to drain
15330 1726882271.73409: waiting for pending results...
15330 1726882271.73677: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 15330 1726882271.73813: in run() - task 12673a56-9f93-e4fe-1358-0000000002f8 15330 1726882271.73833: variable 'ansible_search_path' from source: unknown 15330 1726882271.73840: variable 'ansible_search_path' from source: unknown 15330 1726882271.73876: calling self._execute() 15330 1726882271.73977: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882271.73997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882271.74020: variable 'omit' from source: magic vars 15330 1726882271.74392: variable 'ansible_distribution_major_version' from source: facts 15330 1726882271.74411: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882271.74576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882271.74850: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882271.74911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882271.74951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882271.75000: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882271.75141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882271.75172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882271.75212: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15330 1726882271.75398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15330 1726882271.75402: variable '__network_is_ostree' from source: set_fact
15330 1726882271.75404: Evaluated conditional (not __network_is_ostree is defined): False
15330 1726882271.75407: when evaluation is False, skipping this task
15330 1726882271.75409: _execute() done
15330 1726882271.75411: dumping result to json
15330 1726882271.75414: done dumping result, returning
15330 1726882271.75416: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-e4fe-1358-0000000002f8]
15330 1726882271.75419: sending task result for task 12673a56-9f93-e4fe-1358-0000000002f8
15330 1726882271.75486: done sending task result for task 12673a56-9f93-e4fe-1358-0000000002f8
15330 1726882271.75492: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15330 1726882271.75548: no more pending results, returning what we have
15330 1726882271.75552: results queue empty
15330 1726882271.75553: checking for any_errors_fatal
15330 1726882271.75560: done checking for any_errors_fatal
15330 1726882271.75560: checking for max_fail_percentage
15330 1726882271.75562: done checking for max_fail_percentage
15330 1726882271.75563: checking to see if all hosts have failed and the running result is not ok
15330 1726882271.75564: done checking to see if all hosts have failed
15330 1726882271.75565: getting the remaining hosts for this loop
15330 1726882271.75566: done getting the remaining hosts for this loop
15330
1726882271.75570: getting the next task for host managed_node3
15330 1726882271.75576: done getting next task for host managed_node3
15330 1726882271.75580: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
15330 1726882271.75583: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882271.75602: getting variables
15330 1726882271.75604: in VariableManager get_vars()
15330 1726882271.75644: Calling all_inventory to load vars for managed_node3
15330 1726882271.75646: Calling groups_inventory to load vars for managed_node3
15330 1726882271.75649: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882271.75659: Calling all_plugins_play to load vars for managed_node3
15330 1726882271.75663: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882271.75666: Calling groups_plugins_play to load vars for managed_node3
15330 1726882271.77420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882271.78996: done with get_vars()
15330 1726882271.79018: done getting variables
15330 1726882271.79075: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:31:11 -0400 (0:00:00.060) 0:00:20.997 ******
15330 1726882271.79112: entering _queue_task() for managed_node3/set_fact
15330 1726882271.79440: worker is 1 (out of 1 available)
15330 1726882271.79452: exiting _queue_task() for managed_node3/set_fact
15330 1726882271.79462: done queuing things up, now waiting for results queue to drain
15330 1726882271.79464: waiting for pending results...
15330 1726882271.79819: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
15330 1726882271.79880: in run() - task 12673a56-9f93-e4fe-1358-0000000002f9
15330 1726882271.79905: variable 'ansible_search_path' from source: unknown
15330 1726882271.79916: variable 'ansible_search_path' from source: unknown
15330 1726882271.79953: calling self._execute()
15330 1726882271.80098: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882271.80103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882271.80105: variable 'omit' from source: magic vars
15330 1726882271.80441: variable 'ansible_distribution_major_version' from source: facts
15330 1726882271.80460: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882271.80641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15330 1726882271.80926: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15330 1726882271.80976: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15330 1726882271.81024: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15330
1726882271.81299: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15330 1726882271.81303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15330 1726882271.81306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15330 1726882271.81309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15330 1726882271.81312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15330 1726882271.81378: variable '__network_is_ostree' from source: set_fact
15330 1726882271.81396: Evaluated conditional (not __network_is_ostree is defined): False
15330 1726882271.81405: when evaluation is False, skipping this task
15330 1726882271.81413: _execute() done
15330 1726882271.81422: dumping result to json
15330 1726882271.81498: done dumping result, returning
15330 1726882271.81502: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-e4fe-1358-0000000002f9]
15330 1726882271.81505: sending task result for task 12673a56-9f93-e4fe-1358-0000000002f9
15330 1726882271.81799: done sending task result for task 12673a56-9f93-e4fe-1358-0000000002f9
15330 1726882271.81802: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15330 1726882271.81839: no more pending results, returning what we
have 15330 1726882271.81843: results queue empty 15330 1726882271.81844: checking for any_errors_fatal 15330 1726882271.81849: done checking for any_errors_fatal 15330 1726882271.81850: checking for max_fail_percentage 15330 1726882271.81851: done checking for max_fail_percentage 15330 1726882271.81852: checking to see if all hosts have failed and the running result is not ok 15330 1726882271.81853: done checking to see if all hosts have failed 15330 1726882271.81853: getting the remaining hosts for this loop 15330 1726882271.81855: done getting the remaining hosts for this loop 15330 1726882271.81858: getting the next task for host managed_node3 15330 1726882271.81866: done getting next task for host managed_node3 15330 1726882271.81869: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15330 1726882271.81872: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15330 1726882271.81885: getting variables
15330 1726882271.81889: in VariableManager get_vars()
15330 1726882271.81924: Calling all_inventory to load vars for managed_node3
15330 1726882271.81927: Calling groups_inventory to load vars for managed_node3
15330 1726882271.81929: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882271.81937: Calling all_plugins_play to load vars for managed_node3
15330 1726882271.81940: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882271.81943: Calling groups_plugins_play to load vars for managed_node3
15330 1726882271.83248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882271.84931: done with get_vars()
15330 1726882271.84955: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:31:11 -0400 (0:00:00.059) 0:00:21.056 ******
15330 1726882271.85056: entering _queue_task() for managed_node3/service_facts
15330 1726882271.85355: worker is 1 (out of 1 available)
15330 1726882271.85367: exiting _queue_task() for managed_node3/service_facts
15330 1726882271.85379: done queuing things up, now waiting for results queue to drain
15330 1726882271.85380: waiting for pending results...
15330 1726882271.85658: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 15330 1726882271.85781: in run() - task 12673a56-9f93-e4fe-1358-0000000002fb 15330 1726882271.85812: variable 'ansible_search_path' from source: unknown 15330 1726882271.85825: variable 'ansible_search_path' from source: unknown 15330 1726882271.85867: calling self._execute() 15330 1726882271.85971: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882271.85982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882271.86002: variable 'omit' from source: magic vars 15330 1726882271.86370: variable 'ansible_distribution_major_version' from source: facts 15330 1726882271.86389: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882271.86402: variable 'omit' from source: magic vars 15330 1726882271.86469: variable 'omit' from source: magic vars 15330 1726882271.86500: variable 'omit' from source: magic vars 15330 1726882271.86577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882271.86580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882271.86607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882271.86629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882271.86645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882271.86681: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882271.86700: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882271.86796: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 15330 1726882271.86819: Set connection var ansible_pipelining to False 15330 1726882271.86838: Set connection var ansible_timeout to 10 15330 1726882271.86844: Set connection var ansible_connection to ssh 15330 1726882271.86849: Set connection var ansible_shell_type to sh 15330 1726882271.86858: Set connection var ansible_shell_executable to /bin/sh 15330 1726882271.86866: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882271.86895: variable 'ansible_shell_executable' from source: unknown 15330 1726882271.86908: variable 'ansible_connection' from source: unknown 15330 1726882271.86919: variable 'ansible_module_compression' from source: unknown 15330 1726882271.86926: variable 'ansible_shell_type' from source: unknown 15330 1726882271.86931: variable 'ansible_shell_executable' from source: unknown 15330 1726882271.86937: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882271.86943: variable 'ansible_pipelining' from source: unknown 15330 1726882271.86949: variable 'ansible_timeout' from source: unknown 15330 1726882271.86955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882271.87163: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882271.87198: variable 'omit' from source: magic vars 15330 1726882271.87201: starting attempt loop 15330 1726882271.87204: running the handler 15330 1726882271.87216: _low_level_execute_command(): starting 15330 1726882271.87241: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882271.87949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882271.87962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15330 1726882271.88006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882271.88101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882271.88105: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882271.88120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882271.88133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882271.88217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882271.89861: stdout chunk (state=3): >>>/root <<< 15330 1726882271.90009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882271.90024: stdout chunk (state=3): >>><<< 15330 1726882271.90060: stderr chunk (state=3): >>><<< 15330 1726882271.90091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882271.90114: _low_level_execute_command(): starting 15330 1726882271.90126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080 `" && echo ansible-tmp-1726882271.9010086-16298-217130053203080="` echo /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080 `" ) && sleep 0' 15330 1726882271.91384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882271.91610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882271.91755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882271.93589: stdout chunk (state=3): >>>ansible-tmp-1726882271.9010086-16298-217130053203080=/root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080 <<< 15330 1726882271.93722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882271.93732: stdout chunk (state=3): >>><<< 15330 1726882271.93743: stderr chunk (state=3): >>><<< 15330 1726882271.93811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882271.9010086-16298-217130053203080=/root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882271.93863: variable 'ansible_module_compression' from source: unknown 15330 1726882271.94046: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15330 1726882271.94109: variable 'ansible_facts' from source: unknown 15330 1726882271.94309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py 15330 1726882271.94813: Sending initial data 15330 1726882271.94823: Sent initial data (162 bytes) 15330 1726882271.95844: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882271.95857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882271.95869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882271.96019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882271.96023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882271.97516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15330 1726882271.97520: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882271.97556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882271.97641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp1mse8gtd /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py <<< 15330 1726882271.97644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py" <<< 15330 1726882271.97696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp1mse8gtd" to remote "/root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py" <<< 15330 1726882271.98596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882271.98600: stdout chunk (state=3): >>><<< 15330 1726882271.98602: stderr chunk (state=3): >>><<< 15330 1726882271.98611: done transferring module to remote 15330 1726882271.98625: _low_level_execute_command(): starting 15330 1726882271.98633: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/ /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py && sleep 0' 15330 1726882271.99258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882271.99273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882271.99286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882271.99366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882271.99408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882271.99430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882271.99446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882271.99531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882272.01219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882272.01243: stderr chunk (state=3): >>><<< 15330 1726882272.01299: stdout chunk (state=3): >>><<< 15330 1726882272.01303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882272.01306: _low_level_execute_command(): starting 15330 1726882272.01308: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/AnsiballZ_service_facts.py && sleep 0' 15330 1726882272.01683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882272.01688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882272.01709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882272.01757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882272.01764: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882272.01766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882272.01817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882273.49519: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 15330 1726882273.49583: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15330 1726882273.51400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882273.51404: stdout chunk (state=3): >>><<< 15330 1726882273.51406: stderr chunk (state=3): >>><<< 15330 1726882273.51410: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": 
"dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": 
{"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
15330 1726882273.52506: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882273.52529: _low_level_execute_command(): starting 15330 1726882273.52538: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882271.9010086-16298-217130053203080/ > /dev/null 2>&1 && sleep 0' 15330 1726882273.53306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882273.53378: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882273.53407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882273.53430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882273.53518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882273.55374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882273.55384: stdout chunk (state=3): >>><<< 15330 1726882273.55442: stderr chunk (state=3): >>><<< 15330 1726882273.55465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882273.55482: handler run complete 15330 1726882273.55905: variable 'ansible_facts' from source: unknown 15330 1726882273.56167: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882273.57100: variable 'ansible_facts' from source: unknown 15330 1726882273.57371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882273.58005: attempt loop complete, returning result 15330 1726882273.58102: _execute() done 15330 1726882273.58106: dumping result to json 15330 1726882273.58109: done dumping result, returning 15330 1726882273.58112: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-e4fe-1358-0000000002fb] 15330 1726882273.58124: sending task result for task 12673a56-9f93-e4fe-1358-0000000002fb 15330 1726882273.59871: done sending task result for task 12673a56-9f93-e4fe-1358-0000000002fb 15330 1726882273.59874: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882273.59983: no more pending results, returning what we have 15330 1726882273.59985: results queue empty 15330 1726882273.59986: checking for any_errors_fatal 15330 1726882273.59989: done checking for any_errors_fatal 15330 1726882273.59989: checking for max_fail_percentage 15330 1726882273.59991: done checking for max_fail_percentage 15330 1726882273.59991: checking to see if all hosts have failed and the running result is not ok 15330 1726882273.59992: done checking to see if all hosts have failed 15330 1726882273.59994: getting the remaining hosts for this loop 15330 1726882273.59995: done getting the remaining hosts for this loop 15330 1726882273.59998: getting the next task for host managed_node3 15330 1726882273.60002: done getting next task for host managed_node3 15330 1726882273.60005: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15330 
1726882273.60007: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882273.60015: getting variables 15330 1726882273.60017: in VariableManager get_vars() 15330 1726882273.60041: Calling all_inventory to load vars for managed_node3 15330 1726882273.60043: Calling groups_inventory to load vars for managed_node3 15330 1726882273.60045: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882273.60166: Calling all_plugins_play to load vars for managed_node3 15330 1726882273.60170: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882273.60174: Calling groups_plugins_play to load vars for managed_node3 15330 1726882273.61950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882273.63625: done with get_vars() 15330 1726882273.63648: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:31:13 -0400 (0:00:01.786) 0:00:22.843 ****** 15330 1726882273.63748: entering _queue_task() for managed_node3/package_facts 15330 1726882273.64069: worker is 1 (out of 1 available) 15330 1726882273.64083: exiting _queue_task() for managed_node3/package_facts 15330 1726882273.64300: done queuing things up, now waiting for results queue to drain 15330 
1726882273.64302: waiting for pending results... 15330 1726882273.64430: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 15330 1726882273.64639: in run() - task 12673a56-9f93-e4fe-1358-0000000002fc 15330 1726882273.64644: variable 'ansible_search_path' from source: unknown 15330 1726882273.64647: variable 'ansible_search_path' from source: unknown 15330 1726882273.64689: calling self._execute() 15330 1726882273.64854: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882273.64858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882273.64861: variable 'omit' from source: magic vars 15330 1726882273.65309: variable 'ansible_distribution_major_version' from source: facts 15330 1726882273.65328: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882273.65339: variable 'omit' from source: magic vars 15330 1726882273.65406: variable 'omit' from source: magic vars 15330 1726882273.65500: variable 'omit' from source: magic vars 15330 1726882273.65504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882273.65534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882273.65561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882273.65584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882273.65618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882273.65661: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882273.65672: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882273.65689: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882273.65898: Set connection var ansible_pipelining to False 15330 1726882273.65901: Set connection var ansible_timeout to 10 15330 1726882273.65904: Set connection var ansible_connection to ssh 15330 1726882273.65906: Set connection var ansible_shell_type to sh 15330 1726882273.65908: Set connection var ansible_shell_executable to /bin/sh 15330 1726882273.65910: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882273.65921: variable 'ansible_shell_executable' from source: unknown 15330 1726882273.65929: variable 'ansible_connection' from source: unknown 15330 1726882273.65947: variable 'ansible_module_compression' from source: unknown 15330 1726882273.65956: variable 'ansible_shell_type' from source: unknown 15330 1726882273.65967: variable 'ansible_shell_executable' from source: unknown 15330 1726882273.65975: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882273.66048: variable 'ansible_pipelining' from source: unknown 15330 1726882273.66052: variable 'ansible_timeout' from source: unknown 15330 1726882273.66055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882273.66271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882273.66297: variable 'omit' from source: magic vars 15330 1726882273.66312: starting attempt loop 15330 1726882273.66319: running the handler 15330 1726882273.66347: _low_level_execute_command(): starting 15330 1726882273.66374: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882273.67262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882273.67367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882273.67568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882273.67572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882273.69150: stdout chunk (state=3): >>>/root <<< 15330 1726882273.69305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882273.69309: stdout chunk (state=3): >>><<< 15330 1726882273.69311: stderr chunk (state=3): >>><<< 15330 1726882273.69446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882273.69449: _low_level_execute_command(): starting 15330 1726882273.69453: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426 `" && echo ansible-tmp-1726882273.6934252-16365-231085902283426="` echo /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426 `" ) && sleep 0' 15330 1726882273.70112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882273.70144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882273.70161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882273.70174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882273.70257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882273.72100: stdout chunk (state=3): >>>ansible-tmp-1726882273.6934252-16365-231085902283426=/root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426 <<< 15330 1726882273.72206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882273.72256: stderr chunk (state=3): >>><<< 15330 1726882273.72266: stdout chunk (state=3): >>><<< 15330 1726882273.72290: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882273.6934252-16365-231085902283426=/root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882273.72432: variable 'ansible_module_compression' from source: unknown 15330 1726882273.72438: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15330 1726882273.72499: variable 'ansible_facts' from source: unknown 15330 1726882273.72732: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py 15330 1726882273.72916: Sending initial data 15330 1726882273.73039: Sent initial data (162 bytes) 15330 1726882273.73678: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882273.73697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882273.73791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15330 1726882273.73843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882273.73859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882273.73896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882273.74016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882273.75509: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15330 1726882273.75524: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882273.75596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882273.75663: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdfx50v33 /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py <<< 15330 1726882273.75670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py" <<< 15330 1726882273.75714: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdfx50v33" to remote "/root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py" <<< 15330 1726882273.77432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882273.77526: stderr chunk (state=3): >>><<< 15330 1726882273.77529: stdout chunk (state=3): >>><<< 15330 1726882273.77531: done transferring module to remote 15330 1726882273.77533: _low_level_execute_command(): starting 15330 1726882273.77535: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/ /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py && sleep 0' 15330 1726882273.78161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882273.78197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882273.78313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882273.78335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882273.78403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882273.80175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882273.80225: stderr chunk (state=3): >>><<< 15330 1726882273.80284: stdout chunk (state=3): >>><<< 15330 1726882273.80437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882273.80441: _low_level_execute_command(): starting 15330 1726882273.80443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/AnsiballZ_package_facts.py && sleep 0' 15330 1726882273.81115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882273.81119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882273.81142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882273.81230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882274.24789: stdout 
chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": 
"xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 15330 1726882274.24869: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": 
"2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": 
"libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": 
[{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 15330 1726882274.24944: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": 
"c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": 
"1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", 
"version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", <<< 15330 1726882274.25001: stdout chunk (state=3): >>>"version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}],
"qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": 
[{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": 
"libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": 
[{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15330 1726882274.26895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882274.26898: stdout chunk (state=3): >>><<< 15330 1726882274.26901: stderr chunk (state=3): >>><<< 15330 1726882274.26940: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", 
"version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": 
[{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": 
[{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": 
"libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", 
"version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": 
"libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": 
"sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": 
"102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": 
"vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", 
"release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", 
"version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882274.29568: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882274.29659: _low_level_execute_command(): starting 15330 1726882274.29779: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882273.6934252-16365-231085902283426/ > /dev/null 2>&1 && sleep 0' 15330 1726882274.30389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882274.30392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882274.30402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882274.30404: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882274.30407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882274.30471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882274.30475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882274.30477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882274.30552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882274.32446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882274.32459: stdout chunk (state=3): >>><<< 15330 1726882274.32471: stderr chunk (state=3): >>><<< 15330 1726882274.32500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882274.32519: handler run complete 15330 1726882274.33452: variable 'ansible_facts' from source: unknown 15330 1726882274.34037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.36352: variable 'ansible_facts' from source: unknown 15330 1726882274.36808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.37632: attempt loop complete, returning result 15330 1726882274.37635: _execute() done 15330 1726882274.37637: dumping result to json 15330 1726882274.37827: done dumping result, returning 15330 1726882274.37844: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-e4fe-1358-0000000002fc] 15330 1726882274.37857: sending task result for task 12673a56-9f93-e4fe-1358-0000000002fc 15330 1726882274.40499: done sending task result for task 12673a56-9f93-e4fe-1358-0000000002fc 15330 1726882274.40502: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882274.40663: no more pending results, returning what we have 15330 1726882274.40666: results queue empty 15330 1726882274.40667: checking for any_errors_fatal 15330 1726882274.40672: done checking for any_errors_fatal 15330 1726882274.40673: checking for max_fail_percentage 15330 1726882274.40675: done checking for 
max_fail_percentage 15330 1726882274.40675: checking to see if all hosts have failed and the running result is not ok 15330 1726882274.40676: done checking to see if all hosts have failed 15330 1726882274.40677: getting the remaining hosts for this loop 15330 1726882274.40678: done getting the remaining hosts for this loop 15330 1726882274.40682: getting the next task for host managed_node3 15330 1726882274.40691: done getting next task for host managed_node3 15330 1726882274.40713: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15330 1726882274.40716: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882274.40726: getting variables 15330 1726882274.40728: in VariableManager get_vars() 15330 1726882274.40759: Calling all_inventory to load vars for managed_node3 15330 1726882274.40762: Calling groups_inventory to load vars for managed_node3 15330 1726882274.40764: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882274.40777: Calling all_plugins_play to load vars for managed_node3 15330 1726882274.40781: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882274.40788: Calling groups_plugins_play to load vars for managed_node3 15330 1726882274.42203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.43854: done with get_vars() 15330 1726882274.43888: done getting variables 15330 1726882274.43950: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:14 -0400 (0:00:00.802) 0:00:23.645 ****** 15330 1726882274.43983: entering _queue_task() for managed_node3/debug 15330 1726882274.44351: worker is 1 (out of 1 available) 15330 1726882274.44365: exiting _queue_task() for managed_node3/debug 15330 1726882274.44376: done queuing things up, now waiting for results queue to drain 15330 1726882274.44377: waiting for pending results... 15330 1726882274.44809: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 15330 1726882274.44814: in run() - task 12673a56-9f93-e4fe-1358-00000000003b 15330 1726882274.44817: variable 'ansible_search_path' from source: unknown 15330 1726882274.44820: variable 'ansible_search_path' from source: unknown 15330 1726882274.44831: calling self._execute() 15330 1726882274.44936: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.44949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.44964: variable 'omit' from source: magic vars 15330 1726882274.45412: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.45435: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882274.45448: variable 'omit' from source: magic vars 15330 1726882274.45502: variable 'omit' from source: magic vars 15330 1726882274.45627: variable 'network_provider' from source: set_fact 15330 1726882274.45651: variable 'omit' from source: magic vars 15330 1726882274.45707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882274.45749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 15330 1726882274.45810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882274.45817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882274.45835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882274.45871: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882274.45886: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.45915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.46023: Set connection var ansible_pipelining to False 15330 1726882274.46134: Set connection var ansible_timeout to 10 15330 1726882274.46137: Set connection var ansible_connection to ssh 15330 1726882274.46140: Set connection var ansible_shell_type to sh 15330 1726882274.46142: Set connection var ansible_shell_executable to /bin/sh 15330 1726882274.46145: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882274.46147: variable 'ansible_shell_executable' from source: unknown 15330 1726882274.46150: variable 'ansible_connection' from source: unknown 15330 1726882274.46152: variable 'ansible_module_compression' from source: unknown 15330 1726882274.46154: variable 'ansible_shell_type' from source: unknown 15330 1726882274.46156: variable 'ansible_shell_executable' from source: unknown 15330 1726882274.46157: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.46159: variable 'ansible_pipelining' from source: unknown 15330 1726882274.46160: variable 'ansible_timeout' from source: unknown 15330 1726882274.46162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.46260: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882274.46275: variable 'omit' from source: magic vars 15330 1726882274.46285: starting attempt loop 15330 1726882274.46295: running the handler 15330 1726882274.46357: handler run complete 15330 1726882274.46382: attempt loop complete, returning result 15330 1726882274.46395: _execute() done 15330 1726882274.46404: dumping result to json 15330 1726882274.46412: done dumping result, returning 15330 1726882274.46432: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-e4fe-1358-00000000003b] 15330 1726882274.46444: sending task result for task 12673a56-9f93-e4fe-1358-00000000003b 15330 1726882274.46639: done sending task result for task 12673a56-9f93-e4fe-1358-00000000003b 15330 1726882274.46642: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 15330 1726882274.46740: no more pending results, returning what we have 15330 1726882274.46744: results queue empty 15330 1726882274.46745: checking for any_errors_fatal 15330 1726882274.46757: done checking for any_errors_fatal 15330 1726882274.46757: checking for max_fail_percentage 15330 1726882274.46759: done checking for max_fail_percentage 15330 1726882274.46760: checking to see if all hosts have failed and the running result is not ok 15330 1726882274.46761: done checking to see if all hosts have failed 15330 1726882274.46762: getting the remaining hosts for this loop 15330 1726882274.46763: done getting the remaining hosts for this loop 15330 1726882274.46767: getting the next task for host managed_node3 15330 1726882274.46774: done getting next task for host managed_node3 15330 1726882274.46778: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15330 1726882274.46781: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882274.46798: getting variables 15330 1726882274.46800: in VariableManager get_vars() 15330 1726882274.46844: Calling all_inventory to load vars for managed_node3 15330 1726882274.46847: Calling groups_inventory to load vars for managed_node3 15330 1726882274.46849: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882274.46861: Calling all_plugins_play to load vars for managed_node3 15330 1726882274.46864: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882274.46867: Calling groups_plugins_play to load vars for managed_node3 15330 1726882274.48478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.50101: done with get_vars() 15330 1726882274.50130: done getting variables 15330 1726882274.50217: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:31:14 -0400 (0:00:00.062) 0:00:23.708 ****** 15330 1726882274.50259: entering _queue_task() 
for managed_node3/fail 15330 1726882274.50614: worker is 1 (out of 1 available) 15330 1726882274.50625: exiting _queue_task() for managed_node3/fail 15330 1726882274.50636: done queuing things up, now waiting for results queue to drain 15330 1726882274.50637: waiting for pending results... 15330 1726882274.50963: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15330 1726882274.51231: in run() - task 12673a56-9f93-e4fe-1358-00000000003c 15330 1726882274.51236: variable 'ansible_search_path' from source: unknown 15330 1726882274.51241: variable 'ansible_search_path' from source: unknown 15330 1726882274.51244: calling self._execute() 15330 1726882274.51326: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.51359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.51383: variable 'omit' from source: magic vars 15330 1726882274.51842: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.51857: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882274.52027: variable 'network_state' from source: role '' defaults 15330 1726882274.52048: Evaluated conditional (network_state != {}): False 15330 1726882274.52056: when evaluation is False, skipping this task 15330 1726882274.52064: _execute() done 15330 1726882274.52072: dumping result to json 15330 1726882274.52107: done dumping result, returning 15330 1726882274.52111: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-e4fe-1358-00000000003c] 15330 1726882274.52114: sending task result for task 12673a56-9f93-e4fe-1358-00000000003c skipping: [managed_node3] => { "changed": 
false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882274.52345: no more pending results, returning what we have 15330 1726882274.52349: results queue empty 15330 1726882274.52350: checking for any_errors_fatal 15330 1726882274.52358: done checking for any_errors_fatal 15330 1726882274.52359: checking for max_fail_percentage 15330 1726882274.52361: done checking for max_fail_percentage 15330 1726882274.52362: checking to see if all hosts have failed and the running result is not ok 15330 1726882274.52363: done checking to see if all hosts have failed 15330 1726882274.52363: getting the remaining hosts for this loop 15330 1726882274.52364: done getting the remaining hosts for this loop 15330 1726882274.52368: getting the next task for host managed_node3 15330 1726882274.52375: done getting next task for host managed_node3 15330 1726882274.52379: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15330 1726882274.52382: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882274.52399: getting variables 15330 1726882274.52401: in VariableManager get_vars() 15330 1726882274.52440: Calling all_inventory to load vars for managed_node3 15330 1726882274.52443: Calling groups_inventory to load vars for managed_node3 15330 1726882274.52446: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882274.52457: Calling all_plugins_play to load vars for managed_node3 15330 1726882274.52460: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882274.52464: Calling groups_plugins_play to load vars for managed_node3 15330 1726882274.53006: done sending task result for task 12673a56-9f93-e4fe-1358-00000000003c 15330 1726882274.53010: WORKER PROCESS EXITING 15330 1726882274.53848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.56297: done with get_vars() 15330 1726882274.56320: done getting variables 15330 1726882274.56375: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:31:14 -0400 (0:00:00.061) 0:00:23.770 ****** 15330 1726882274.56406: entering _queue_task() for managed_node3/fail 15330 1726882274.56808: worker is 1 (out of 1 available) 15330 1726882274.56820: exiting _queue_task() for managed_node3/fail 15330 1726882274.56830: done queuing things up, now waiting for results queue to drain 15330 1726882274.56832: waiting for pending results... 
15330 1726882274.57030: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15330 1726882274.57136: in run() - task 12673a56-9f93-e4fe-1358-00000000003d 15330 1726882274.57155: variable 'ansible_search_path' from source: unknown 15330 1726882274.57168: variable 'ansible_search_path' from source: unknown 15330 1726882274.57210: calling self._execute() 15330 1726882274.57299: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.57312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.57327: variable 'omit' from source: magic vars 15330 1726882274.57690: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.57714: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882274.57834: variable 'network_state' from source: role '' defaults 15330 1726882274.57850: Evaluated conditional (network_state != {}): False 15330 1726882274.57857: when evaluation is False, skipping this task 15330 1726882274.57864: _execute() done 15330 1726882274.57871: dumping result to json 15330 1726882274.57878: done dumping result, returning 15330 1726882274.57888: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-e4fe-1358-00000000003d] 15330 1726882274.57899: sending task result for task 12673a56-9f93-e4fe-1358-00000000003d 15330 1726882274.58206: done sending task result for task 12673a56-9f93-e4fe-1358-00000000003d 15330 1726882274.58209: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882274.58248: no more pending results, returning what we have 15330 
1726882274.58251: results queue empty 15330 1726882274.58252: checking for any_errors_fatal 15330 1726882274.58257: done checking for any_errors_fatal 15330 1726882274.58258: checking for max_fail_percentage 15330 1726882274.58259: done checking for max_fail_percentage 15330 1726882274.58260: checking to see if all hosts have failed and the running result is not ok 15330 1726882274.58261: done checking to see if all hosts have failed 15330 1726882274.58262: getting the remaining hosts for this loop 15330 1726882274.58263: done getting the remaining hosts for this loop 15330 1726882274.58266: getting the next task for host managed_node3 15330 1726882274.58270: done getting next task for host managed_node3 15330 1726882274.58274: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15330 1726882274.58276: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882274.58290: getting variables 15330 1726882274.58292: in VariableManager get_vars() 15330 1726882274.58323: Calling all_inventory to load vars for managed_node3 15330 1726882274.58326: Calling groups_inventory to load vars for managed_node3 15330 1726882274.58327: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882274.58335: Calling all_plugins_play to load vars for managed_node3 15330 1726882274.58337: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882274.58339: Calling groups_plugins_play to load vars for managed_node3 15330 1726882274.59636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.61134: done with get_vars() 15330 1726882274.61161: done getting variables 15330 1726882274.61225: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:31:14 -0400 (0:00:00.048) 0:00:23.818 ****** 15330 1726882274.61257: entering _queue_task() for managed_node3/fail 15330 1726882274.61599: worker is 1 (out of 1 available) 15330 1726882274.61612: exiting _queue_task() for managed_node3/fail 15330 1726882274.61625: done queuing things up, now waiting for results queue to drain 15330 1726882274.61626: waiting for pending results... 
15330 1726882274.61903: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15330 1726882274.62006: in run() - task 12673a56-9f93-e4fe-1358-00000000003e 15330 1726882274.62029: variable 'ansible_search_path' from source: unknown 15330 1726882274.62037: variable 'ansible_search_path' from source: unknown 15330 1726882274.62074: calling self._execute() 15330 1726882274.62170: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.62182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.62198: variable 'omit' from source: magic vars 15330 1726882274.62564: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.62580: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882274.62753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882274.64975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882274.65052: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882274.65169: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882274.65173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882274.65180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882274.65263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.65305: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882274.65331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.65367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.65382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.65485: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.65511: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15330 1726882274.65630: variable 'ansible_distribution' from source: facts 15330 1726882274.65638: variable '__network_rh_distros' from source: role '' defaults 15330 1726882274.65649: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15330 1726882274.66298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.66302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882274.66304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 
1726882274.66306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.66308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.66449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.66476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882274.66569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.66613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.66667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.66713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.66824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15330 1726882274.66851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.67200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.67203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.67623: variable 'network_connections' from source: play vars 15330 1726882274.67710: variable 'profile' from source: play vars 15330 1726882274.67889: variable 'profile' from source: play vars 15330 1726882274.67900: variable 'interface' from source: set_fact 15330 1726882274.67956: variable 'interface' from source: set_fact 15330 1726882274.67973: variable 'network_state' from source: role '' defaults 15330 1726882274.68135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882274.68496: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882274.68544: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882274.68648: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882274.68835: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882274.68845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882274.68881: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882274.68913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.68971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882274.69079: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15330 1726882274.69087: when evaluation is False, skipping this task 15330 1726882274.69097: _execute() done 15330 1726882274.69105: dumping result to json 15330 1726882274.69114: done dumping result, returning 15330 1726882274.69127: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-e4fe-1358-00000000003e] 15330 1726882274.69380: sending task result for task 12673a56-9f93-e4fe-1358-00000000003e 15330 1726882274.69450: done sending task result for task 12673a56-9f93-e4fe-1358-00000000003e 15330 1726882274.69453: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15330 
1726882274.69536: no more pending results, returning what we have 15330 1726882274.69540: results queue empty 15330 1726882274.69541: checking for any_errors_fatal 15330 1726882274.69549: done checking for any_errors_fatal 15330 1726882274.69550: checking for max_fail_percentage 15330 1726882274.69552: done checking for max_fail_percentage 15330 1726882274.69553: checking to see if all hosts have failed and the running result is not ok 15330 1726882274.69554: done checking to see if all hosts have failed 15330 1726882274.69555: getting the remaining hosts for this loop 15330 1726882274.69556: done getting the remaining hosts for this loop 15330 1726882274.69560: getting the next task for host managed_node3 15330 1726882274.69566: done getting next task for host managed_node3 15330 1726882274.69571: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15330 1726882274.69573: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882274.69587: getting variables 15330 1726882274.69589: in VariableManager get_vars() 15330 1726882274.69634: Calling all_inventory to load vars for managed_node3 15330 1726882274.69637: Calling groups_inventory to load vars for managed_node3 15330 1726882274.69639: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882274.69651: Calling all_plugins_play to load vars for managed_node3 15330 1726882274.69654: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882274.69658: Calling groups_plugins_play to load vars for managed_node3 15330 1726882274.72672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.76480: done with get_vars() 15330 1726882274.76510: done getting variables 15330 1726882274.76569: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:31:14 -0400 (0:00:00.154) 0:00:23.973 ****** 15330 1726882274.76715: entering _queue_task() for managed_node3/dnf 15330 1726882274.77369: worker is 1 (out of 1 available) 15330 1726882274.77382: exiting _queue_task() for managed_node3/dnf 15330 1726882274.77398: done queuing things up, now waiting for results queue to drain 15330 1726882274.77400: waiting for pending results... 
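The team-configuration abort task above is skipped because its Jinja2 conditional chain (`selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`) evaluates to False. As a minimal Python sketch of what that filter chain computes (function name and sample data are hypothetical; only the filter semantics are mirrored):

```python
import re

def has_team_connection(network_connections):
    """Mirror the Jinja2 chain: keep dicts where 'type' is defined,
    keep those whose type matches ^team$, and test for a non-empty list."""
    defined = (c for c in network_connections if "type" in c)   # selectattr("type", "defined")
    team = [c for c in defined if re.match(r"^team$", c["type"])]  # selectattr("type", "match", "^team$") | list
    return len(team) > 0                                        # length > 0

# In this run the test profile is not a team interface, so the
# conditional is False and the abort task is skipped:
print(has_team_connection([{"name": "eth0", "type": "ethernet"}]))  # False
print(has_team_connection([{"name": "team0", "type": "team"}]))     # True
```

Note that Jinja2's `match` test, like Python's `re.match`, anchors at the start of the string, so `^team$` only accepts the exact value `team`.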
15330 1726882274.77861: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15330 1726882274.78156: in run() - task 12673a56-9f93-e4fe-1358-00000000003f 15330 1726882274.78159: variable 'ansible_search_path' from source: unknown 15330 1726882274.78162: variable 'ansible_search_path' from source: unknown 15330 1726882274.78187: calling self._execute() 15330 1726882274.78276: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.78355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.78599: variable 'omit' from source: magic vars 15330 1726882274.79098: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.79181: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882274.79499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882274.83433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882274.83510: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882274.83554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882274.83599: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882274.83634: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882274.83715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.83751: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882274.83780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.83825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.83843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.83964: variable 'ansible_distribution' from source: facts 15330 1726882274.83975: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.83996: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15330 1726882274.84120: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882274.84251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.84280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882274.84312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.84358: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.84377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.84422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.84448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882274.84474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.84519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.84538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.84581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882274.84698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 
1726882274.84701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.84704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882274.84706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882274.84850: variable 'network_connections' from source: play vars 15330 1726882274.84866: variable 'profile' from source: play vars 15330 1726882274.84934: variable 'profile' from source: play vars 15330 1726882274.84944: variable 'interface' from source: set_fact 15330 1726882274.85008: variable 'interface' from source: set_fact 15330 1726882274.85080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882274.85269: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882274.85315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882274.85347: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882274.85379: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882274.85426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882274.85450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882274.85486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882274.85698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882274.85701: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882274.85780: variable 'network_connections' from source: play vars 15330 1726882274.85790: variable 'profile' from source: play vars 15330 1726882274.85854: variable 'profile' from source: play vars 15330 1726882274.85863: variable 'interface' from source: set_fact 15330 1726882274.85925: variable 'interface' from source: set_fact 15330 1726882274.85952: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882274.85959: when evaluation is False, skipping this task 15330 1726882274.85966: _execute() done 15330 1726882274.85972: dumping result to json 15330 1726882274.85978: done dumping result, returning 15330 1726882274.85989: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-00000000003f] 15330 1726882274.86000: sending task result for task 12673a56-9f93-e4fe-1358-00000000003f 15330 1726882274.86109: done sending task result for task 12673a56-9f93-e4fe-1358-00000000003f 15330 1726882274.86115: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15330 1726882274.86185: no more pending results, returning what we have 15330 1726882274.86191: results queue empty 15330 1726882274.86192: checking for any_errors_fatal 15330 1726882274.86201: done checking for any_errors_fatal 15330 1726882274.86202: checking for max_fail_percentage 15330 1726882274.86203: done checking for max_fail_percentage 15330 1726882274.86204: checking to see if all hosts have failed and the running result is not ok 15330 1726882274.86205: done checking to see if all hosts have failed 15330 1726882274.86206: getting the remaining hosts for this loop 15330 1726882274.86207: done getting the remaining hosts for this loop 15330 1726882274.86210: getting the next task for host managed_node3 15330 1726882274.86216: done getting next task for host managed_node3 15330 1726882274.86220: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15330 1726882274.86221: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882274.86234: getting variables 15330 1726882274.86236: in VariableManager get_vars() 15330 1726882274.86274: Calling all_inventory to load vars for managed_node3 15330 1726882274.86276: Calling groups_inventory to load vars for managed_node3 15330 1726882274.86278: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882274.86290: Calling all_plugins_play to load vars for managed_node3 15330 1726882274.86395: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882274.86401: Calling groups_plugins_play to load vars for managed_node3 15330 1726882274.89166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882274.92414: done with get_vars() 15330 1726882274.92449: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15330 1726882274.92540: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:14 -0400 (0:00:00.158) 0:00:24.131 ****** 15330 1726882274.92572: entering _queue_task() for managed_node3/yum 15330 1726882274.93557: worker is 1 (out of 1 available) 15330 1726882274.93570: exiting _queue_task() for managed_node3/yum 15330 1726882274.93582: done queuing things up, now waiting for results queue to drain 15330 1726882274.93583: waiting for pending results... 
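The DNF-check task above produces a skip result rather than running its action. A minimal sketch (assumed helper names) of how a False `when` gate short-circuits a task and records the failing expression, matching the `skipping:` JSON shown in the log:

```python
def run_guarded_task(conditional_expr, evaluate, action):
    """Run `action` only when the conditional evaluates True; otherwise
    return a skip result that records the expression that was False."""
    if not evaluate(conditional_expr):
        return {
            "changed": False,
            "false_condition": conditional_expr,
            "skip_reason": "Conditional result was False",
        }
    return action()

# Both flags are False in this run, so the task is skipped:
result = run_guarded_task(
    "__network_wireless_connections_defined or __network_team_connections_defined",
    evaluate=lambda expr: False,
    action=lambda: {"changed": True},
)
print(result["skip_reason"])  # Conditional result was False
```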
15330 1726882274.94078: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15330 1726882274.94296: in run() - task 12673a56-9f93-e4fe-1358-000000000040 15330 1726882274.94341: variable 'ansible_search_path' from source: unknown 15330 1726882274.94345: variable 'ansible_search_path' from source: unknown 15330 1726882274.94379: calling self._execute() 15330 1726882274.94621: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882274.94624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882274.94666: variable 'omit' from source: magic vars 15330 1726882274.95401: variable 'ansible_distribution_major_version' from source: facts 15330 1726882274.95522: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882274.95784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882275.00484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882275.00690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882275.00739: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882275.00839: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882275.00871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882275.01022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.01203: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.01227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.01263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.01327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.01540: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.01559: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15330 1726882275.01566: when evaluation is False, skipping this task 15330 1726882275.01573: _execute() done 15330 1726882275.01579: dumping result to json 15330 1726882275.01589: done dumping result, returning 15330 1726882275.01648: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000040] 15330 1726882275.01657: sending task result for task 12673a56-9f93-e4fe-1358-000000000040 15330 1726882275.01920: done sending task result for task 12673a56-9f93-e4fe-1358-000000000040 15330 1726882275.01923: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15330 1726882275.01975: no more pending results, returning 
what we have 15330 1726882275.01978: results queue empty 15330 1726882275.01979: checking for any_errors_fatal 15330 1726882275.01990: done checking for any_errors_fatal 15330 1726882275.01990: checking for max_fail_percentage 15330 1726882275.01992: done checking for max_fail_percentage 15330 1726882275.01995: checking to see if all hosts have failed and the running result is not ok 15330 1726882275.01996: done checking to see if all hosts have failed 15330 1726882275.01996: getting the remaining hosts for this loop 15330 1726882275.01998: done getting the remaining hosts for this loop 15330 1726882275.02010: getting the next task for host managed_node3 15330 1726882275.02020: done getting next task for host managed_node3 15330 1726882275.02025: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15330 1726882275.02027: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882275.02043: getting variables 15330 1726882275.02046: in VariableManager get_vars() 15330 1726882275.02089: Calling all_inventory to load vars for managed_node3 15330 1726882275.02495: Calling groups_inventory to load vars for managed_node3 15330 1726882275.02500: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882275.02509: Calling all_plugins_play to load vars for managed_node3 15330 1726882275.02512: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882275.02515: Calling groups_plugins_play to load vars for managed_node3 15330 1726882275.05420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882275.08907: done with get_vars() 15330 1726882275.08939: done getting variables 15330 1726882275.09214: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:15 -0400 (0:00:00.166) 0:00:24.298 ****** 15330 1726882275.09249: entering _queue_task() for managed_node3/fail 15330 1726882275.10126: worker is 1 (out of 1 available) 15330 1726882275.10136: exiting _queue_task() for managed_node3/fail 15330 1726882275.10145: done queuing things up, now waiting for results queue to drain 15330 1726882275.10146: waiting for pending results... 
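The YUM-based check above is gated on `ansible_distribution_major_version | int < 8`, which is False on this host, so the YUM task is skipped (and, as the redirect line shows, `ansible.builtin.yum` resolves to the `dnf` action anyway on modern systems). A sketch of that version gate (helper name is hypothetical, not the role's code):

```python
def package_check_backend(distribution_major_version):
    """Pick the package-manager path the way the role's conditional does:
    the YUM-era task only runs when the EL major version is below 8."""
    if int(distribution_major_version) < 8:
        return "yum"
    return "dnf"

print(package_check_backend("7"))  # yum
print(package_check_backend("9"))  # dnf -> the YUM task is skipped, as here
```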
15330 1726882275.10614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15330 1726882275.10730: in run() - task 12673a56-9f93-e4fe-1358-000000000041 15330 1726882275.10753: variable 'ansible_search_path' from source: unknown 15330 1726882275.10761: variable 'ansible_search_path' from source: unknown 15330 1726882275.10805: calling self._execute() 15330 1726882275.10997: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.11143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.11149: variable 'omit' from source: magic vars 15330 1726882275.11914: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.12143: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882275.12229: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882275.12757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882275.16902: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882275.16959: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882275.17047: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882275.17109: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882275.17232: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882275.17316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15330 1726882275.17385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.17422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.17471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.17498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.17699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.17702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.17725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.17899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.17989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.17998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.18001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.18200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.18204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.18207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.18562: variable 'network_connections' from source: play vars 15330 1726882275.18581: variable 'profile' from source: play vars 15330 1726882275.18708: variable 'profile' from source: play vars 15330 1726882275.18808: variable 'interface' from source: set_fact 15330 1726882275.18881: variable 'interface' from source: set_fact 15330 1726882275.19013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882275.19232: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882275.19274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882275.19319: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882275.19351: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882275.19405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882275.19431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882275.19461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.19495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882275.19555: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882275.19936: variable 'network_connections' from source: play vars 15330 1726882275.19939: variable 'profile' from source: play vars 15330 1726882275.19942: variable 'profile' from source: play vars 15330 1726882275.19944: variable 'interface' from source: set_fact 15330 1726882275.19991: variable 'interface' from source: set_fact 15330 1726882275.20023: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882275.20031: when evaluation is False, skipping this task 15330 1726882275.20040: _execute() done 15330 1726882275.20049: dumping result to json 15330 1726882275.20155: done dumping result, returning 15330 1726882275.20159: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000041] 15330 1726882275.20168: sending task result for task 12673a56-9f93-e4fe-1358-000000000041 15330 1726882275.20241: done sending task result for task 12673a56-9f93-e4fe-1358-000000000041 15330 1726882275.20244: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15330 1726882275.20315: no more pending results, returning what we have 15330 1726882275.20319: results queue empty 15330 1726882275.20320: checking for any_errors_fatal 15330 1726882275.20327: done checking for any_errors_fatal 15330 1726882275.20327: checking for max_fail_percentage 15330 1726882275.20330: done checking for max_fail_percentage 15330 1726882275.20331: checking to see if all hosts have failed and the running result is not ok 15330 1726882275.20332: done checking to see if all hosts have failed 15330 1726882275.20332: getting the remaining hosts for this loop 15330 1726882275.20334: done getting the remaining hosts for this loop 15330 1726882275.20338: getting the next task for host managed_node3 15330 1726882275.20343: done getting next task for host managed_node3 15330 1726882275.20347: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15330 1726882275.20350: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882275.20363: getting variables 15330 1726882275.20365: in VariableManager get_vars() 15330 1726882275.20510: Calling all_inventory to load vars for managed_node3 15330 1726882275.20513: Calling groups_inventory to load vars for managed_node3 15330 1726882275.20515: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882275.20528: Calling all_plugins_play to load vars for managed_node3 15330 1726882275.20530: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882275.20533: Calling groups_plugins_play to load vars for managed_node3 15330 1726882275.22531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882275.23923: done with get_vars() 15330 1726882275.23941: done getting variables 15330 1726882275.23983: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:15 -0400 (0:00:00.147) 0:00:24.446 ****** 15330 1726882275.24011: entering _queue_task() for managed_node3/package 15330 1726882275.24254: worker is 1 (out of 1 available) 15330 1726882275.24268: exiting _queue_task() for managed_node3/package 15330 1726882275.24281: done queuing things up, now waiting for results queue to drain 15330 1726882275.24283: waiting for pending results... 
15330 1726882275.24462: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 15330 1726882275.24527: in run() - task 12673a56-9f93-e4fe-1358-000000000042 15330 1726882275.24538: variable 'ansible_search_path' from source: unknown 15330 1726882275.24541: variable 'ansible_search_path' from source: unknown 15330 1726882275.24571: calling self._execute() 15330 1726882275.24648: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.24654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.24662: variable 'omit' from source: magic vars 15330 1726882275.24949: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.24958: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882275.25198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882275.25390: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882275.25445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882275.25487: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882275.25529: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882275.25638: variable 'network_packages' from source: role '' defaults 15330 1726882275.25744: variable '__network_provider_setup' from source: role '' defaults 15330 1726882275.25758: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882275.25823: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882275.25837: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882275.25942: variable 
'__network_packages_default_nm' from source: role '' defaults 15330 1726882275.26135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882275.32460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882275.32599: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882275.32603: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882275.32625: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882275.32654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882275.32729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.32771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.32800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.32826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.32837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 
1726882275.32880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.32901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.32924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.32960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.32970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.33178: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15330 1726882275.33309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.33397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.33401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.33404: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.33406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.33502: variable 'ansible_python' from source: facts 15330 1726882275.33531: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15330 1726882275.33619: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882275.33704: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882275.33836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.33865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.33900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.33949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.34000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.34024: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.34064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.34101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.34145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.34164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.34263: variable 'network_connections' from source: play vars 15330 1726882275.34270: variable 'profile' from source: play vars 15330 1726882275.34344: variable 'profile' from source: play vars 15330 1726882275.34348: variable 'interface' from source: set_fact 15330 1726882275.34402: variable 'interface' from source: set_fact 15330 1726882275.34450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882275.34473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882275.34498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.34518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882275.34545: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882275.34721: variable 'network_connections' from source: play vars 15330 1726882275.34724: variable 'profile' from source: play vars 15330 1726882275.34790: variable 'profile' from source: play vars 15330 1726882275.34800: variable 'interface' from source: set_fact 15330 1726882275.34849: variable 'interface' from source: set_fact 15330 1726882275.34873: variable '__network_packages_default_wireless' from source: role '' defaults 15330 1726882275.34932: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882275.35133: variable 'network_connections' from source: play vars 15330 1726882275.35138: variable 'profile' from source: play vars 15330 1726882275.35182: variable 'profile' from source: play vars 15330 1726882275.35186: variable 'interface' from source: set_fact 15330 1726882275.35257: variable 'interface' from source: set_fact 15330 1726882275.35276: variable '__network_packages_default_team' from source: role '' defaults 15330 1726882275.35333: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882275.35525: variable 'network_connections' from source: play vars 15330 1726882275.35528: variable 'profile' from source: play vars 15330 1726882275.35574: variable 'profile' from source: play vars 15330 1726882275.35578: variable 'interface' from source: set_fact 15330 1726882275.35648: variable 'interface' from source: set_fact 15330 1726882275.35685: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882275.35730: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882275.35736: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882275.35783: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882275.35988: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15330 1726882275.36451: variable 'network_connections' from source: play vars 15330 1726882275.36492: variable 'profile' from source: play vars 15330 1726882275.36529: variable 'profile' from source: play vars 15330 1726882275.36537: variable 'interface' from source: set_fact 15330 1726882275.36604: variable 'interface' from source: set_fact 15330 1726882275.36618: variable 'ansible_distribution' from source: facts 15330 1726882275.36626: variable '__network_rh_distros' from source: role '' defaults 15330 1726882275.36708: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.36712: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15330 1726882275.36831: variable 'ansible_distribution' from source: facts 15330 1726882275.36841: variable '__network_rh_distros' from source: role '' defaults 15330 1726882275.36851: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.36869: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15330 1726882275.37016: variable 'ansible_distribution' from source: facts 15330 1726882275.37019: variable '__network_rh_distros' from source: role '' defaults 15330 1726882275.37022: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.37051: variable 'network_provider' from source: set_fact 15330 1726882275.37062: variable 'ansible_facts' from source: unknown 15330 1726882275.37427: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15330 
1726882275.37431: when evaluation is False, skipping this task 15330 1726882275.37433: _execute() done 15330 1726882275.37436: dumping result to json 15330 1726882275.37438: done dumping result, returning 15330 1726882275.37444: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-e4fe-1358-000000000042] 15330 1726882275.37447: sending task result for task 12673a56-9f93-e4fe-1358-000000000042 15330 1726882275.37533: done sending task result for task 12673a56-9f93-e4fe-1358-000000000042 15330 1726882275.37536: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15330 1726882275.37618: no more pending results, returning what we have 15330 1726882275.37621: results queue empty 15330 1726882275.37622: checking for any_errors_fatal 15330 1726882275.37628: done checking for any_errors_fatal 15330 1726882275.37629: checking for max_fail_percentage 15330 1726882275.37630: done checking for max_fail_percentage 15330 1726882275.37631: checking to see if all hosts have failed and the running result is not ok 15330 1726882275.37632: done checking to see if all hosts have failed 15330 1726882275.37632: getting the remaining hosts for this loop 15330 1726882275.37634: done getting the remaining hosts for this loop 15330 1726882275.37637: getting the next task for host managed_node3 15330 1726882275.37642: done getting next task for host managed_node3 15330 1726882275.37647: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15330 1726882275.37649: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15330 1726882275.37661: getting variables 15330 1726882275.37663: in VariableManager get_vars() 15330 1726882275.37704: Calling all_inventory to load vars for managed_node3 15330 1726882275.37707: Calling groups_inventory to load vars for managed_node3 15330 1726882275.37709: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882275.37724: Calling all_plugins_play to load vars for managed_node3 15330 1726882275.37726: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882275.37729: Calling groups_plugins_play to load vars for managed_node3 15330 1726882275.43123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882275.44178: done with get_vars() 15330 1726882275.44211: done getting variables 15330 1726882275.44254: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:31:15 -0400 (0:00:00.202) 0:00:24.648 ****** 15330 1726882275.44273: entering _queue_task() for managed_node3/package 15330 1726882275.44598: worker is 1 (out of 1 available) 15330 1726882275.44610: exiting _queue_task() for managed_node3/package 15330 1726882275.44621: done queuing things up, now waiting for results queue to drain 15330 1726882275.44622: waiting for pending results... 
15330 1726882275.44814: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15330 1726882275.44887: in run() - task 12673a56-9f93-e4fe-1358-000000000043 15330 1726882275.44901: variable 'ansible_search_path' from source: unknown 15330 1726882275.44906: variable 'ansible_search_path' from source: unknown 15330 1726882275.44938: calling self._execute() 15330 1726882275.45011: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.45016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.45027: variable 'omit' from source: magic vars 15330 1726882275.45316: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.45325: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882275.45412: variable 'network_state' from source: role '' defaults 15330 1726882275.45420: Evaluated conditional (network_state != {}): False 15330 1726882275.45423: when evaluation is False, skipping this task 15330 1726882275.45426: _execute() done 15330 1726882275.45429: dumping result to json 15330 1726882275.45431: done dumping result, returning 15330 1726882275.45438: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-e4fe-1358-000000000043] 15330 1726882275.45444: sending task result for task 12673a56-9f93-e4fe-1358-000000000043 15330 1726882275.45529: done sending task result for task 12673a56-9f93-e4fe-1358-000000000043 15330 1726882275.45531: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882275.45574: no more pending results, returning what we have 15330 1726882275.45577: results queue empty 15330 1726882275.45578: checking 
for any_errors_fatal 15330 1726882275.45586: done checking for any_errors_fatal 15330 1726882275.45587: checking for max_fail_percentage 15330 1726882275.45588: done checking for max_fail_percentage 15330 1726882275.45589: checking to see if all hosts have failed and the running result is not ok 15330 1726882275.45590: done checking to see if all hosts have failed 15330 1726882275.45591: getting the remaining hosts for this loop 15330 1726882275.45592: done getting the remaining hosts for this loop 15330 1726882275.45597: getting the next task for host managed_node3 15330 1726882275.45603: done getting next task for host managed_node3 15330 1726882275.45606: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15330 1726882275.45608: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882275.45622: getting variables 15330 1726882275.45624: in VariableManager get_vars() 15330 1726882275.45660: Calling all_inventory to load vars for managed_node3 15330 1726882275.45662: Calling groups_inventory to load vars for managed_node3 15330 1726882275.45664: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882275.45674: Calling all_plugins_play to load vars for managed_node3 15330 1726882275.45677: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882275.45679: Calling groups_plugins_play to load vars for managed_node3 15330 1726882275.46468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882275.47649: done with get_vars() 15330 1726882275.47676: done getting variables 15330 1726882275.47750: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:31:15 -0400 (0:00:00.035) 0:00:24.683 ****** 15330 1726882275.47789: entering _queue_task() for managed_node3/package 15330 1726882275.48157: worker is 1 (out of 1 available) 15330 1726882275.48175: exiting _queue_task() for managed_node3/package 15330 1726882275.48189: done queuing things up, now waiting for results queue to drain 15330 1726882275.48190: waiting for pending results... 
15330 1726882275.48496: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15330 1726882275.48552: in run() - task 12673a56-9f93-e4fe-1358-000000000044 15330 1726882275.48565: variable 'ansible_search_path' from source: unknown 15330 1726882275.48569: variable 'ansible_search_path' from source: unknown 15330 1726882275.48674: calling self._execute() 15330 1726882275.48755: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.48764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.48769: variable 'omit' from source: magic vars 15330 1726882275.49185: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.49188: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882275.49262: variable 'network_state' from source: role '' defaults 15330 1726882275.49270: Evaluated conditional (network_state != {}): False 15330 1726882275.49273: when evaluation is False, skipping this task 15330 1726882275.49276: _execute() done 15330 1726882275.49279: dumping result to json 15330 1726882275.49281: done dumping result, returning 15330 1726882275.49291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-e4fe-1358-000000000044] 15330 1726882275.49300: sending task result for task 12673a56-9f93-e4fe-1358-000000000044 15330 1726882275.49507: done sending task result for task 12673a56-9f93-e4fe-1358-000000000044 15330 1726882275.49510: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882275.49553: no more pending results, returning what we have 15330 1726882275.49556: results queue empty 15330 1726882275.49557: checking for 
any_errors_fatal 15330 1726882275.49562: done checking for any_errors_fatal 15330 1726882275.49563: checking for max_fail_percentage 15330 1726882275.49564: done checking for max_fail_percentage 15330 1726882275.49565: checking to see if all hosts have failed and the running result is not ok 15330 1726882275.49565: done checking to see if all hosts have failed 15330 1726882275.49566: getting the remaining hosts for this loop 15330 1726882275.49567: done getting the remaining hosts for this loop 15330 1726882275.49574: getting the next task for host managed_node3 15330 1726882275.49578: done getting next task for host managed_node3 15330 1726882275.49582: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15330 1726882275.49585: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882275.49602: getting variables 15330 1726882275.49605: in VariableManager get_vars() 15330 1726882275.49645: Calling all_inventory to load vars for managed_node3 15330 1726882275.49647: Calling groups_inventory to load vars for managed_node3 15330 1726882275.49649: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882275.49657: Calling all_plugins_play to load vars for managed_node3 15330 1726882275.49660: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882275.49662: Calling groups_plugins_play to load vars for managed_node3 15330 1726882275.50568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882275.51630: done with get_vars() 15330 1726882275.51645: done getting variables 15330 1726882275.51692: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:31:15 -0400 (0:00:00.039) 0:00:24.723 ****** 15330 1726882275.51716: entering _queue_task() for managed_node3/service 15330 1726882275.51957: worker is 1 (out of 1 available) 15330 1726882275.51970: exiting _queue_task() for managed_node3/service 15330 1726882275.51981: done queuing things up, now waiting for results queue to drain 15330 1726882275.51982: waiting for pending results... 
15330 1726882275.52178: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15330 1726882275.52254: in run() - task 12673a56-9f93-e4fe-1358-000000000045 15330 1726882275.52265: variable 'ansible_search_path' from source: unknown 15330 1726882275.52269: variable 'ansible_search_path' from source: unknown 15330 1726882275.52302: calling self._execute() 15330 1726882275.52370: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.52376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.52384: variable 'omit' from source: magic vars 15330 1726882275.52741: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.52750: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882275.52853: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882275.53051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882275.54998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882275.55102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882275.55130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882275.55158: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882275.55177: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882275.55239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15330 1726882275.55263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.55281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.55311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.55322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.55354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.55371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.55392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.55419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.55429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.55456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.55473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.55580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.55584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.55588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.55643: variable 'network_connections' from source: play vars 15330 1726882275.55654: variable 'profile' from source: play vars 15330 1726882275.55716: variable 'profile' from source: play vars 15330 1726882275.55719: variable 'interface' from source: set_fact 15330 1726882275.55765: variable 'interface' from source: set_fact 15330 1726882275.55821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882275.55946: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882275.55973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882275.55997: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882275.56021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882275.56053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882275.56069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882275.56089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.56107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882275.56147: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882275.56302: variable 'network_connections' from source: play vars 15330 1726882275.56305: variable 'profile' from source: play vars 15330 1726882275.56351: variable 'profile' from source: play vars 15330 1726882275.56354: variable 'interface' from source: set_fact 15330 1726882275.56398: variable 'interface' from source: set_fact 15330 1726882275.56418: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882275.56421: when evaluation is False, skipping this task 15330 1726882275.56424: _execute() done 15330 1726882275.56426: dumping result to json 15330 1726882275.56429: done dumping result, returning 15330 1726882275.56437: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000045] 15330 1726882275.56448: sending task result for task 12673a56-9f93-e4fe-1358-000000000045 15330 1726882275.56543: done sending task result for task 12673a56-9f93-e4fe-1358-000000000045 15330 1726882275.56546: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15330 1726882275.56605: no more pending results, returning what we have 15330 1726882275.56608: results queue empty 15330 1726882275.56609: checking for any_errors_fatal 15330 1726882275.56618: done checking for any_errors_fatal 15330 1726882275.56619: checking for max_fail_percentage 15330 1726882275.56620: done checking for max_fail_percentage 15330 1726882275.56621: checking to see if all hosts have failed and the running result is not ok 15330 1726882275.56622: done checking to see if all hosts have failed 15330 1726882275.56622: getting the remaining hosts for this loop 15330 1726882275.56624: done getting the remaining hosts for this loop 15330 1726882275.56627: getting the next task for host managed_node3 15330 1726882275.56633: done getting next task for host managed_node3 15330 1726882275.56636: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15330 1726882275.56638: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882275.56651: getting variables 15330 1726882275.56652: in VariableManager get_vars() 15330 1726882275.56695: Calling all_inventory to load vars for managed_node3 15330 1726882275.56697: Calling groups_inventory to load vars for managed_node3 15330 1726882275.56699: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882275.56709: Calling all_plugins_play to load vars for managed_node3 15330 1726882275.56711: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882275.56713: Calling groups_plugins_play to load vars for managed_node3 15330 1726882275.57534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882275.58592: done with get_vars() 15330 1726882275.58609: done getting variables 15330 1726882275.58653: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:31:15 -0400 (0:00:00.069)       0:00:24.792 ******
15330 1726882275.58675: entering _queue_task() for managed_node3/service 15330 1726882275.58950: worker is 1 (out of 1 available) 15330 1726882275.58963: exiting _queue_task() for managed_node3/service 15330 1726882275.58979: done queuing things up, now waiting for results queue to drain 15330 1726882275.58981: waiting for pending results... 
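The task queued at this point ("Enable and start NetworkManager") goes through Ansible's `service` action module. For reference, on a systemd host the state it converges to can be sketched with plain `systemctl` commands. This is a non-authoritative sketch: the unit name `NetworkManager` is taken from the `network_service_name` role default resolved earlier in this log, and the script only prints the commands instead of running them, so it has no side effects.

```shell
# Sketch only: build the systemctl invocations the queued service task
# is roughly equivalent to on a systemd host, and print them rather
# than execute them.
svc="NetworkManager"                    # from network_service_name (role default)

cmd_enable="systemctl enable $svc"      # make the unit start at boot
cmd_start="systemctl start $svc"        # start it immediately

printf '%s\n' "$cmd_enable" "$cmd_start"
```

The service module is idempotent: it inspects the current unit state first and only performs the equivalent of these operations when the unit is not already enabled and running, which is why the task can report `changed: false`.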
15330 1726882275.59176: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15330 1726882275.59250: in run() - task 12673a56-9f93-e4fe-1358-000000000046 15330 1726882275.59260: variable 'ansible_search_path' from source: unknown 15330 1726882275.59264: variable 'ansible_search_path' from source: unknown 15330 1726882275.59295: calling self._execute() 15330 1726882275.59366: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.59369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.59379: variable 'omit' from source: magic vars 15330 1726882275.59656: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.59666: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882275.59778: variable 'network_provider' from source: set_fact 15330 1726882275.59783: variable 'network_state' from source: role '' defaults 15330 1726882275.59798: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15330 1726882275.59801: variable 'omit' from source: magic vars 15330 1726882275.59829: variable 'omit' from source: magic vars 15330 1726882275.59850: variable 'network_service_name' from source: role '' defaults 15330 1726882275.59901: variable 'network_service_name' from source: role '' defaults 15330 1726882275.59970: variable '__network_provider_setup' from source: role '' defaults 15330 1726882275.59974: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882275.60022: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882275.60029: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882275.60074: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882275.60234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15330 1726882275.61784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882275.61845: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882275.61876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882275.61913: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882275.61935: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882275.62066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.62070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.62072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.62090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.62223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.62227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15330 1726882275.62229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.62231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.62254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.62265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.62526: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15330 1726882275.62615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.62643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.62871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.62874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.62877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.62879: variable 'ansible_python' from source: facts 15330 1726882275.62899: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15330 1726882275.63060: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882275.63108: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882275.63201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.63220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.63237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.63264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.63283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.63376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882275.63390: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882275.63395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.63422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882275.63431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882275.63585: variable 'network_connections' from source: play vars 15330 1726882275.63600: variable 'profile' from source: play vars 15330 1726882275.63636: variable 'profile' from source: play vars 15330 1726882275.63641: variable 'interface' from source: set_fact 15330 1726882275.63689: variable 'interface' from source: set_fact 15330 1726882275.63761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882275.63947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882275.64011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882275.64046: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882275.64075: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882275.64122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882275.64142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882275.64166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882275.64191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882275.64227: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882275.64401: variable 'network_connections' from source: play vars 15330 1726882275.64406: variable 'profile' from source: play vars 15330 1726882275.64460: variable 'profile' from source: play vars 15330 1726882275.64463: variable 'interface' from source: set_fact 15330 1726882275.64509: variable 'interface' from source: set_fact 15330 1726882275.64533: variable '__network_packages_default_wireless' from source: role '' defaults 15330 1726882275.64586: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882275.64768: variable 'network_connections' from source: play vars 15330 1726882275.64771: variable 'profile' from source: play vars 15330 1726882275.64825: variable 'profile' from source: play vars 15330 1726882275.64828: variable 'interface' from source: set_fact 15330 1726882275.64880: variable 'interface' from source: set_fact 15330 1726882275.64902: variable '__network_packages_default_team' from source: role '' defaults 15330 1726882275.64956: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882275.65139: variable 
'network_connections' from source: play vars 15330 1726882275.65142: variable 'profile' from source: play vars 15330 1726882275.65192: variable 'profile' from source: play vars 15330 1726882275.65198: variable 'interface' from source: set_fact 15330 1726882275.65247: variable 'interface' from source: set_fact 15330 1726882275.65285: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882275.65330: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882275.65336: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882275.65377: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882275.65511: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15330 1726882275.65819: variable 'network_connections' from source: play vars 15330 1726882275.65822: variable 'profile' from source: play vars 15330 1726882275.65865: variable 'profile' from source: play vars 15330 1726882275.65868: variable 'interface' from source: set_fact 15330 1726882275.65919: variable 'interface' from source: set_fact 15330 1726882275.65926: variable 'ansible_distribution' from source: facts 15330 1726882275.65929: variable '__network_rh_distros' from source: role '' defaults 15330 1726882275.65935: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.65946: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15330 1726882275.66061: variable 'ansible_distribution' from source: facts 15330 1726882275.66064: variable '__network_rh_distros' from source: role '' defaults 15330 1726882275.66067: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.66078: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15330 1726882275.66190: variable 'ansible_distribution' from source: 
facts 15330 1726882275.66195: variable '__network_rh_distros' from source: role '' defaults 15330 1726882275.66198: variable 'ansible_distribution_major_version' from source: facts 15330 1726882275.66223: variable 'network_provider' from source: set_fact 15330 1726882275.66241: variable 'omit' from source: magic vars 15330 1726882275.66263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882275.66285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882275.66303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882275.66316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882275.66324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882275.66348: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882275.66351: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.66353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.66423: Set connection var ansible_pipelining to False 15330 1726882275.66433: Set connection var ansible_timeout to 10 15330 1726882275.66435: Set connection var ansible_connection to ssh 15330 1726882275.66438: Set connection var ansible_shell_type to sh 15330 1726882275.66442: Set connection var ansible_shell_executable to /bin/sh 15330 1726882275.66451: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882275.66466: variable 'ansible_shell_executable' from source: unknown 15330 1726882275.66469: variable 'ansible_connection' from source: unknown 15330 1726882275.66472: variable 'ansible_module_compression' from source: unknown 15330 1726882275.66474: 
variable 'ansible_shell_type' from source: unknown 15330 1726882275.66476: variable 'ansible_shell_executable' from source: unknown 15330 1726882275.66478: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882275.66489: variable 'ansible_pipelining' from source: unknown 15330 1726882275.66492: variable 'ansible_timeout' from source: unknown 15330 1726882275.66498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882275.66562: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882275.66568: variable 'omit' from source: magic vars 15330 1726882275.66574: starting attempt loop 15330 1726882275.66577: running the handler 15330 1726882275.66632: variable 'ansible_facts' from source: unknown 15330 1726882275.67089: _low_level_execute_command(): starting 15330 1726882275.67095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882275.67638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882275.67641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882275.67644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882275.67646: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882275.67648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882275.67698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882275.67701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882275.67704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882275.67767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882275.69476: stdout chunk (state=3): >>>/root <<< 15330 1726882275.69598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882275.69622: stderr chunk (state=3): >>><<< 15330 1726882275.69626: stdout chunk (state=3): >>><<< 15330 1726882275.69645: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882275.69656: _low_level_execute_command(): starting 15330 1726882275.69666: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650 `" && echo ansible-tmp-1726882275.6964555-16452-112078211620650="` echo /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650 `" ) && sleep 0' 15330 1726882275.70616: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882275.72275: stdout 
chunk (state=3): >>>ansible-tmp-1726882275.6964555-16452-112078211620650=/root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650 <<< 15330 1726882275.72383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882275.72406: stderr chunk (state=3): >>><<< 15330 1726882275.72409: stdout chunk (state=3): >>><<< 15330 1726882275.72424: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882275.6964555-16452-112078211620650=/root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882275.72452: variable 'ansible_module_compression' from source: unknown 15330 1726882275.72494: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15330 1726882275.72545: 
variable 'ansible_facts' from source: unknown 15330 1726882275.72681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py 15330 1726882275.72779: Sending initial data 15330 1726882275.72782: Sent initial data (156 bytes) 15330 1726882275.73401: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882275.73421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882275.73432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882275.73709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882275.75026: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: 
Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882275.75081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882275.75137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp92x455jx /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py <<< 15330 1726882275.75141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py" <<< 15330 1726882275.75217: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp92x455jx" to remote "/root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py" <<< 15330 1726882275.76802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882275.76806: stderr chunk (state=3): >>><<< 15330 1726882275.76808: stdout chunk (state=3): >>><<< 15330 1726882275.76810: done transferring module to remote 15330 1726882275.76812: _low_level_execute_command(): starting 15330 1726882275.76815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/ /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py && sleep 0' 15330 1726882275.77401: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882275.77417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882275.77510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882275.77530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882275.77544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882275.77565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882275.77824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882275.79570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882275.79573: stdout chunk (state=3): >>><<< 15330 1726882275.79580: stderr chunk (state=3): >>><<< 15330 1726882275.79603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882275.79607: _low_level_execute_command(): starting 15330 1726882275.79611: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/AnsiballZ_systemd.py && sleep 0' 15330 1726882275.80220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882275.80235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882275.80299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882275.80302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882275.80305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882275.80307: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882275.80317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15330 1726882275.80329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882275.80341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882275.80363: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882275.80372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882275.80381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882275.80405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882275.80533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882275.80537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882275.80539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882275.80623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882276.09318: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", 
"ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10493952", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308384256", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1203121000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", 
"IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 15330 1726882276.09333: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", 
"LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", 
"FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15330 1726882276.11153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882276.11179: stderr chunk (state=3): >>><<< 15330 1726882276.11182: stdout chunk (state=3): >>><<< 15330 1726882276.11201: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10493952", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308384256", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1203121000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
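For reference, the `invocation.module_args` echoed in the module result above correspond to a task shaped roughly like the following. This is a hedged reconstruction from the logged arguments only, not the role's literal source; the task name and `'_ansible_no_log': True` (which is why the final result is censored) are taken from the surrounding log records:

```yaml
# Reconstructed from the logged module_args -- not the literal task in
# fedora.linux_system_roles.network.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:   # invoked in the log as ansible.legacy.systemd
    name: NetworkManager
    state: started
    enabled: true
  no_log: true               # matches "_ansible_no_log": True in the log
```

The remaining logged arguments (`daemon_reload: false`, `scope: system`, `no_block: false`, `force: null`, `masked: null`) are the module's defaults, so they are omitted here.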
15330 1726882276.11318: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882276.11334: _low_level_execute_command(): starting 15330 1726882276.11339: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882275.6964555-16452-112078211620650/ > /dev/null 2>&1 && sleep 0' 15330 1726882276.11765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882276.11768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882276.11803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882276.11806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882276.11808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15330 1726882276.11810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882276.11862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882276.11869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882276.11871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882276.11915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882276.13774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882276.14103: stderr chunk (state=3): >>><<< 15330 1726882276.14111: stdout chunk (state=3): >>><<< 15330 1726882276.14114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882276.14116: handler run complete 15330 1726882276.14118: attempt loop complete, returning result 15330 1726882276.14120: _execute() done 15330 1726882276.14122: dumping result to json 15330 1726882276.14124: done dumping result, returning 15330 1726882276.14126: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-e4fe-1358-000000000046] 15330 1726882276.14127: sending task result for task 12673a56-9f93-e4fe-1358-000000000046 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882276.14626: no more pending results, returning what we have 15330 1726882276.14629: results queue empty 15330 1726882276.14631: checking for any_errors_fatal 15330 1726882276.14640: done checking for any_errors_fatal 15330 1726882276.14643: checking for max_fail_percentage 15330 1726882276.14645: done checking for max_fail_percentage 15330 1726882276.14646: checking to see if all hosts have failed and the running result is not ok 15330 1726882276.14647: done checking to see if all hosts have failed 15330 1726882276.14647: getting the remaining hosts for this loop 15330 1726882276.14649: done getting the remaining hosts for this loop 15330 1726882276.14658: getting the next task for host managed_node3 15330 1726882276.14664: done getting next task for host managed_node3 15330 1726882276.14668: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15330 1726882276.14672: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882276.14682: getting variables 15330 1726882276.14683: in VariableManager get_vars() 15330 1726882276.14720: Calling all_inventory to load vars for managed_node3 15330 1726882276.14723: Calling groups_inventory to load vars for managed_node3 15330 1726882276.14725: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882276.14734: Calling all_plugins_play to load vars for managed_node3 15330 1726882276.14736: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882276.14739: Calling groups_plugins_play to load vars for managed_node3 15330 1726882276.15609: done sending task result for task 12673a56-9f93-e4fe-1358-000000000046 15330 1726882276.15613: WORKER PROCESS EXITING 15330 1726882276.15625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882276.16824: done with get_vars() 15330 1726882276.16849: done getting variables 15330 1726882276.16916: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:16 -0400 (0:00:00.582) 0:00:25.375 ****** 15330 1726882276.16946: entering _queue_task() for managed_node3/service 15330 1726882276.17461: worker is 1 (out of 1 available) 15330 1726882276.17474: exiting _queue_task() for managed_node3/service 15330 1726882276.17496: done queuing things up, now waiting for results queue to drain 15330 
1726882276.17498: waiting for pending results... 15330 1726882276.17911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15330 1726882276.17929: in run() - task 12673a56-9f93-e4fe-1358-000000000047 15330 1726882276.17961: variable 'ansible_search_path' from source: unknown 15330 1726882276.17973: variable 'ansible_search_path' from source: unknown 15330 1726882276.18029: calling self._execute() 15330 1726882276.18143: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882276.18155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882276.18168: variable 'omit' from source: magic vars 15330 1726882276.18611: variable 'ansible_distribution_major_version' from source: facts 15330 1726882276.18632: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882276.18800: variable 'network_provider' from source: set_fact 15330 1726882276.18804: Evaluated conditional (network_provider == "nm"): True 15330 1726882276.18998: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882276.19025: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882276.19227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882276.22373: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882276.22450: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882276.22494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882276.22536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882276.22569: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882276.22661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882276.22702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882276.22743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882276.22790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882276.22815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882276.22898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882276.22902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882276.22923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882276.22970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882276.22989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882276.23035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882276.23098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882276.23102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882276.23142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882276.23166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882276.23700: variable 'network_connections' from source: play vars 15330 1726882276.23704: variable 'profile' from source: play vars 15330 1726882276.23707: variable 'profile' from source: play vars 15330 1726882276.23709: variable 'interface' from source: set_fact 15330 1726882276.23711: variable 'interface' from source: set_fact 15330 1726882276.23862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882276.24305: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882276.24490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882276.24531: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882276.24616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882276.24682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882276.24729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882276.24770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882276.24830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882276.24882: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882276.25141: variable 'network_connections' from source: play vars 15330 1726882276.25150: variable 'profile' from source: play vars 15330 1726882276.25212: variable 'profile' from source: play vars 15330 1726882276.25226: variable 'interface' from source: set_fact 15330 1726882276.25284: variable 'interface' from source: set_fact 15330 1726882276.25321: Evaluated conditional (__network_wpa_supplicant_required): False 15330 1726882276.25333: when evaluation is False, skipping this task 15330 1726882276.25342: _execute() done 15330 1726882276.25357: dumping result 
to json 15330 1726882276.25367: done dumping result, returning 15330 1726882276.25379: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-e4fe-1358-000000000047] 15330 1726882276.25391: sending task result for task 12673a56-9f93-e4fe-1358-000000000047 15330 1726882276.25599: done sending task result for task 12673a56-9f93-e4fe-1358-000000000047 15330 1726882276.25602: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15330 1726882276.25649: no more pending results, returning what we have 15330 1726882276.25652: results queue empty 15330 1726882276.25653: checking for any_errors_fatal 15330 1726882276.25672: done checking for any_errors_fatal 15330 1726882276.25673: checking for max_fail_percentage 15330 1726882276.25674: done checking for max_fail_percentage 15330 1726882276.25675: checking to see if all hosts have failed and the running result is not ok 15330 1726882276.25676: done checking to see if all hosts have failed 15330 1726882276.25677: getting the remaining hosts for this loop 15330 1726882276.25678: done getting the remaining hosts for this loop 15330 1726882276.25681: getting the next task for host managed_node3 15330 1726882276.25689: done getting next task for host managed_node3 15330 1726882276.25692: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15330 1726882276.25696: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882276.25710: getting variables 15330 1726882276.25713: in VariableManager get_vars() 15330 1726882276.25751: Calling all_inventory to load vars for managed_node3 15330 1726882276.25753: Calling groups_inventory to load vars for managed_node3 15330 1726882276.25756: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882276.25766: Calling all_plugins_play to load vars for managed_node3 15330 1726882276.25769: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882276.25771: Calling groups_plugins_play to load vars for managed_node3 15330 1726882276.27988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882276.29523: done with get_vars() 15330 1726882276.29544: done getting variables 15330 1726882276.29607: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:31:16 -0400 (0:00:00.126) 0:00:25.502 ****** 15330 1726882276.29637: entering _queue_task() for managed_node3/service 15330 1726882276.29962: worker is 1 (out of 1 available) 15330 1726882276.29973: exiting _queue_task() for managed_node3/service 15330 1726882276.29984: done queuing things up, now waiting for results queue to drain 15330 1726882276.29985: waiting for pending results... 
15330 1726882276.30265: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 15330 1726882276.30419: in run() - task 12673a56-9f93-e4fe-1358-000000000048 15330 1726882276.30424: variable 'ansible_search_path' from source: unknown 15330 1726882276.30426: variable 'ansible_search_path' from source: unknown 15330 1726882276.30599: calling self._execute() 15330 1726882276.30604: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882276.30607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882276.30609: variable 'omit' from source: magic vars 15330 1726882276.30965: variable 'ansible_distribution_major_version' from source: facts 15330 1726882276.30981: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882276.31108: variable 'network_provider' from source: set_fact 15330 1726882276.31119: Evaluated conditional (network_provider == "initscripts"): False 15330 1726882276.31127: when evaluation is False, skipping this task 15330 1726882276.31134: _execute() done 15330 1726882276.31140: dumping result to json 15330 1726882276.31147: done dumping result, returning 15330 1726882276.31161: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-e4fe-1358-000000000048] 15330 1726882276.31171: sending task result for task 12673a56-9f93-e4fe-1358-000000000048 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882276.31312: no more pending results, returning what we have 15330 1726882276.31316: results queue empty 15330 1726882276.31317: checking for any_errors_fatal 15330 1726882276.31327: done checking for any_errors_fatal 15330 1726882276.31327: checking for max_fail_percentage 15330 1726882276.31329: done checking for max_fail_percentage 15330 
1726882276.31330: checking to see if all hosts have failed and the running result is not ok 15330 1726882276.31331: done checking to see if all hosts have failed 15330 1726882276.31332: getting the remaining hosts for this loop 15330 1726882276.31333: done getting the remaining hosts for this loop 15330 1726882276.31337: getting the next task for host managed_node3 15330 1726882276.31343: done getting next task for host managed_node3 15330 1726882276.31347: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15330 1726882276.31350: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882276.31364: getting variables 15330 1726882276.31366: in VariableManager get_vars() 15330 1726882276.31407: Calling all_inventory to load vars for managed_node3 15330 1726882276.31409: Calling groups_inventory to load vars for managed_node3 15330 1726882276.31411: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882276.31423: Calling all_plugins_play to load vars for managed_node3 15330 1726882276.31426: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882276.31429: Calling groups_plugins_play to load vars for managed_node3 15330 1726882276.32207: done sending task result for task 12673a56-9f93-e4fe-1358-000000000048 15330 1726882276.32210: WORKER PROCESS EXITING 15330 1726882276.33025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882276.34751: done with get_vars() 15330 1726882276.34772: done getting variables 15330 1726882276.34835: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:16 -0400 (0:00:00.052) 0:00:25.554 ****** 15330 1726882276.34866: entering _queue_task() for managed_node3/copy 15330 1726882276.35164: worker is 1 (out of 1 available) 15330 1726882276.35176: exiting _queue_task() for managed_node3/copy 15330 1726882276.35190: done queuing things up, now waiting for results queue to drain 15330 1726882276.35192: waiting for pending results... 15330 1726882276.35462: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15330 1726882276.35580: in run() - task 12673a56-9f93-e4fe-1358-000000000049 15330 1726882276.35607: variable 'ansible_search_path' from source: unknown 15330 1726882276.35620: variable 'ansible_search_path' from source: unknown 15330 1726882276.35657: calling self._execute() 15330 1726882276.35752: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882276.35765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882276.35779: variable 'omit' from source: magic vars 15330 1726882276.36168: variable 'ansible_distribution_major_version' from source: facts 15330 1726882276.36184: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882276.36310: variable 'network_provider' from source: set_fact 15330 1726882276.36321: Evaluated conditional (network_provider == "initscripts"): False 15330 1726882276.36328: when evaluation is False, skipping this task 15330 1726882276.36335: _execute() done 15330 1726882276.36342: dumping result to json 
15330 1726882276.36348: done dumping result, returning 15330 1726882276.36360: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-e4fe-1358-000000000049] 15330 1726882276.36370: sending task result for task 12673a56-9f93-e4fe-1358-000000000049 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15330 1726882276.36643: no more pending results, returning what we have 15330 1726882276.36647: results queue empty 15330 1726882276.36648: checking for any_errors_fatal 15330 1726882276.36652: done checking for any_errors_fatal 15330 1726882276.36653: checking for max_fail_percentage 15330 1726882276.36655: done checking for max_fail_percentage 15330 1726882276.36656: checking to see if all hosts have failed and the running result is not ok 15330 1726882276.36656: done checking to see if all hosts have failed 15330 1726882276.36657: getting the remaining hosts for this loop 15330 1726882276.36658: done getting the remaining hosts for this loop 15330 1726882276.36662: getting the next task for host managed_node3 15330 1726882276.36668: done getting next task for host managed_node3 15330 1726882276.36671: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15330 1726882276.36674: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882276.36690: getting variables 15330 1726882276.36692: in VariableManager get_vars() 15330 1726882276.36732: Calling all_inventory to load vars for managed_node3 15330 1726882276.36734: Calling groups_inventory to load vars for managed_node3 15330 1726882276.36737: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882276.36749: Calling all_plugins_play to load vars for managed_node3 15330 1726882276.36751: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882276.36757: Calling groups_plugins_play to load vars for managed_node3 15330 1726882276.37306: done sending task result for task 12673a56-9f93-e4fe-1358-000000000049 15330 1726882276.37309: WORKER PROCESS EXITING 15330 1726882276.38229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882276.39850: done with get_vars() 15330 1726882276.39871: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:16 -0400 (0:00:00.050) 0:00:25.605 ****** 15330 1726882276.39951: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15330 1726882276.40425: worker is 1 (out of 1 available) 15330 1726882276.40433: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15330 1726882276.40443: done queuing things up, now waiting for results queue to drain 15330 1726882276.40444: waiting for pending results... 
15330 1726882276.40528: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15330 1726882276.40627: in run() - task 12673a56-9f93-e4fe-1358-00000000004a 15330 1726882276.40646: variable 'ansible_search_path' from source: unknown 15330 1726882276.40654: variable 'ansible_search_path' from source: unknown 15330 1726882276.40706: calling self._execute() 15330 1726882276.40817: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882276.40832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882276.40846: variable 'omit' from source: magic vars 15330 1726882276.41238: variable 'ansible_distribution_major_version' from source: facts 15330 1726882276.41255: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882276.41267: variable 'omit' from source: magic vars 15330 1726882276.41324: variable 'omit' from source: magic vars 15330 1726882276.41495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882276.43694: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882276.43817: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882276.43830: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882276.43870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882276.43906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882276.44001: variable 'network_provider' from source: set_fact 15330 1726882276.44198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882276.44233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882276.44270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882276.44371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882276.44375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882276.44433: variable 'omit' from source: magic vars 15330 1726882276.44551: variable 'omit' from source: magic vars 15330 1726882276.44951: variable 'network_connections' from source: play vars 15330 1726882276.44955: variable 'profile' from source: play vars 15330 1726882276.44958: variable 'profile' from source: play vars 15330 1726882276.44961: variable 'interface' from source: set_fact 15330 1726882276.45101: variable 'interface' from source: set_fact 15330 1726882276.45427: variable 'omit' from source: magic vars 15330 1726882276.45505: variable '__lsr_ansible_managed' from source: task vars 15330 1726882276.45570: variable '__lsr_ansible_managed' from source: task vars 15330 1726882276.46004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15330 1726882276.47355: Loaded config def from plugin (lookup/template) 15330 1726882276.47360: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15330 1726882276.47363: File lookup term: get_ansible_managed.j2 15330 1726882276.47368: variable 'ansible_search_path' from source: unknown 15330 1726882276.47380: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15330 1726882276.47434: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15330 1726882276.47492: variable 'ansible_search_path' from source: unknown 15330 1726882276.57701: variable 'ansible_managed' from source: unknown 15330 1726882276.57850: variable 'omit' from source: magic vars 15330 1726882276.57964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882276.57997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882276.58024: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882276.58260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882276.58263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882276.58266: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882276.58268: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882276.58269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882276.58421: Set connection var ansible_pipelining to False 15330 1726882276.58440: Set connection var ansible_timeout to 10 15330 1726882276.58447: Set connection var ansible_connection to ssh 15330 1726882276.58453: Set connection var ansible_shell_type to sh 15330 1726882276.58462: Set connection var ansible_shell_executable to /bin/sh 15330 1726882276.58473: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882276.58503: variable 'ansible_shell_executable' from source: unknown 15330 1726882276.58799: variable 'ansible_connection' from source: unknown 15330 1726882276.58804: variable 'ansible_module_compression' from source: unknown 15330 1726882276.58807: variable 'ansible_shell_type' from source: unknown 15330 1726882276.58809: variable 'ansible_shell_executable' from source: unknown 15330 1726882276.58811: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882276.58813: variable 'ansible_pipelining' from source: unknown 15330 1726882276.58815: variable 'ansible_timeout' from source: unknown 15330 1726882276.58817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882276.58869: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882276.58895: variable 'omit' from source: magic vars 15330 1726882276.58942: starting attempt loop 15330 1726882276.58949: running the handler 15330 1726882276.58966: _low_level_execute_command(): starting 15330 1726882276.59260: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882276.60459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882276.60476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882276.60658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882276.61007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882276.61085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882276.62749: stdout chunk (state=3): >>>/root <<< 15330 1726882276.62850: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 15330 1726882276.62957: stderr chunk (state=3): >>><<< 15330 1726882276.62966: stdout chunk (state=3): >>><<< 15330 1726882276.62989: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882276.63008: _low_level_execute_command(): starting 15330 1726882276.63019: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895 `" && echo ansible-tmp-1726882276.6299677-16490-164219576240895="` echo /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895 `" ) && sleep 0' 15330 1726882276.64220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882276.64261: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882276.64273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882276.64480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882276.64499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882276.64518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882276.64590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882276.66457: stdout chunk (state=3): >>>ansible-tmp-1726882276.6299677-16490-164219576240895=/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895 <<< 15330 1726882276.66564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882276.66651: stderr chunk (state=3): >>><<< 15330 1726882276.66661: stdout chunk (state=3): >>><<< 15330 1726882276.66685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882276.6299677-16490-164219576240895=/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882276.66736: variable 'ansible_module_compression' from source: unknown 15330 1726882276.66970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15330 1726882276.66990: variable 'ansible_facts' from source: unknown 15330 1726882276.67227: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py 15330 1726882276.67516: Sending initial data 15330 1726882276.67519: Sent initial data (168 bytes) 15330 1726882276.68873: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882276.69033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882276.69075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882276.70645: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882276.70760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882276.70810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmprvr6iu8j /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py <<< 15330 1726882276.70851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py" <<< 15330 1726882276.70867: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmprvr6iu8j" to remote "/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py" <<< 15330 1726882276.73163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882276.73300: stderr chunk (state=3): >>><<< 15330 1726882276.73303: stdout chunk (state=3): >>><<< 15330 1726882276.73378: done transferring module to remote 15330 1726882276.73395: _low_level_execute_command(): starting 15330 1726882276.73405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/ /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py && sleep 0' 15330 1726882276.74649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882276.74664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882276.74678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882276.74910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882276.74923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882276.75003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882276.76726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882276.76959: stderr chunk (state=3): >>><<< 15330 1726882276.76962: stdout chunk (state=3): >>><<< 15330 1726882276.76965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882276.76967: _low_level_execute_command(): starting 15330 1726882276.76970: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/AnsiballZ_network_connections.py && sleep 0' 15330 1726882276.78255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882276.78369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882276.78400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882276.78610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882276.78624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882276.78641: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882276.78883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882277.08820: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15330 1726882277.10923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882277.10932: stderr chunk (state=3): >>><<< 15330 1726882277.10935: stdout chunk (state=3): >>><<< 15330 1726882277.10949: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882277.10987: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882277.11003: _low_level_execute_command(): starting 15330 1726882277.11008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882276.6299677-16490-164219576240895/ > /dev/null 2>&1 && sleep 0' 15330 1726882277.12138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882277.12405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882277.12413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882277.12427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882277.12698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882277.14345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882277.14349: stdout chunk (state=3): >>><<< 15330 1726882277.14355: stderr chunk (state=3): >>><<< 15330 1726882277.14372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882277.14379: handler run complete 15330 1726882277.14412: attempt loop complete, returning result 15330 1726882277.14416: _execute() done 15330 1726882277.14419: dumping result to json 15330 1726882277.14423: done dumping result, returning 15330 1726882277.14433: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-e4fe-1358-00000000004a] 15330 1726882277.14437: sending task result for task 12673a56-9f93-e4fe-1358-00000000004a 15330 1726882277.14546: done sending task result for task 12673a56-9f93-e4fe-1358-00000000004a 15330 1726882277.14549: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15330 
1726882277.14670: no more pending results, returning what we have 15330 1726882277.14674: results queue empty 15330 1726882277.14675: checking for any_errors_fatal 15330 1726882277.14681: done checking for any_errors_fatal 15330 1726882277.14682: checking for max_fail_percentage 15330 1726882277.14684: done checking for max_fail_percentage 15330 1726882277.14685: checking to see if all hosts have failed and the running result is not ok 15330 1726882277.14685: done checking to see if all hosts have failed 15330 1726882277.14686: getting the remaining hosts for this loop 15330 1726882277.14687: done getting the remaining hosts for this loop 15330 1726882277.14692: getting the next task for host managed_node3 15330 1726882277.14699: done getting next task for host managed_node3 15330 1726882277.14704: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15330 1726882277.14710: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882277.14720: getting variables 15330 1726882277.14722: in VariableManager get_vars() 15330 1726882277.14760: Calling all_inventory to load vars for managed_node3 15330 1726882277.14762: Calling groups_inventory to load vars for managed_node3 15330 1726882277.14765: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882277.14775: Calling all_plugins_play to load vars for managed_node3 15330 1726882277.14778: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882277.14781: Calling groups_plugins_play to load vars for managed_node3 15330 1726882277.18136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882277.22158: done with get_vars() 15330 1726882277.22184: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:31:17 -0400 (0:00:00.826) 0:00:26.431 ****** 15330 1726882277.22570: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15330 1726882277.23311: worker is 1 (out of 1 available) 15330 1726882277.23321: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15330 1726882277.23331: done queuing things up, now waiting for results queue to drain 15330 1726882277.23333: waiting for pending results... 
15330 1726882277.23684: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 15330 1726882277.23881: in run() - task 12673a56-9f93-e4fe-1358-00000000004b 15330 1726882277.24100: variable 'ansible_search_path' from source: unknown 15330 1726882277.24104: variable 'ansible_search_path' from source: unknown 15330 1726882277.24107: calling self._execute() 15330 1726882277.24301: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882277.24305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882277.24307: variable 'omit' from source: magic vars 15330 1726882277.25024: variable 'ansible_distribution_major_version' from source: facts 15330 1726882277.25075: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882277.25267: variable 'network_state' from source: role '' defaults 15330 1726882277.25404: Evaluated conditional (network_state != {}): False 15330 1726882277.25411: when evaluation is False, skipping this task 15330 1726882277.25417: _execute() done 15330 1726882277.25498: dumping result to json 15330 1726882277.25502: done dumping result, returning 15330 1726882277.25504: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-e4fe-1358-00000000004b] 15330 1726882277.25507: sending task result for task 12673a56-9f93-e4fe-1358-00000000004b 15330 1726882277.25577: done sending task result for task 12673a56-9f93-e4fe-1358-00000000004b 15330 1726882277.25581: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882277.25643: no more pending results, returning what we have 15330 1726882277.25647: results queue empty 15330 1726882277.25649: checking for any_errors_fatal 15330 1726882277.25664: done checking for any_errors_fatal 
15330 1726882277.25664: checking for max_fail_percentage 15330 1726882277.25667: done checking for max_fail_percentage 15330 1726882277.25668: checking to see if all hosts have failed and the running result is not ok 15330 1726882277.25669: done checking to see if all hosts have failed 15330 1726882277.25670: getting the remaining hosts for this loop 15330 1726882277.25671: done getting the remaining hosts for this loop 15330 1726882277.25675: getting the next task for host managed_node3 15330 1726882277.25682: done getting next task for host managed_node3 15330 1726882277.25690: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15330 1726882277.25694: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882277.25712: getting variables 15330 1726882277.25714: in VariableManager get_vars() 15330 1726882277.25752: Calling all_inventory to load vars for managed_node3 15330 1726882277.25755: Calling groups_inventory to load vars for managed_node3 15330 1726882277.25757: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882277.25768: Calling all_plugins_play to load vars for managed_node3 15330 1726882277.25771: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882277.25774: Calling groups_plugins_play to load vars for managed_node3 15330 1726882277.28755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882277.31952: done with get_vars() 15330 1726882277.31981: done getting variables 15330 1726882277.32044: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:17 -0400 (0:00:00.095) 0:00:26.526 ****** 15330 1726882277.32074: entering _queue_task() for managed_node3/debug 15330 1726882277.32839: worker is 1 (out of 1 available) 15330 1726882277.32851: exiting _queue_task() for managed_node3/debug 15330 1726882277.32863: done queuing things up, now waiting for results queue to drain 15330 1726882277.32864: waiting for pending results... 
15330 1726882277.33813: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
15330 1726882277.34099: in run() - task 12673a56-9f93-e4fe-1358-00000000004c
15330 1726882277.34298: variable 'ansible_search_path' from source: unknown
15330 1726882277.34301: variable 'ansible_search_path' from source: unknown
15330 1726882277.34304: calling self._execute()
15330 1726882277.34306: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.34310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.34312: variable 'omit' from source: magic vars
15330 1726882277.35225: variable 'ansible_distribution_major_version' from source: facts
15330 1726882277.35697: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882277.35700: variable 'omit' from source: magic vars
15330 1726882277.35703: variable 'omit' from source: magic vars
15330 1726882277.35705: variable 'omit' from source: magic vars
15330 1726882277.35935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15330 1726882277.35972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15330 1726882277.36000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15330 1726882277.36024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882277.36041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882277.36078: variable 'inventory_hostname' from source: host vars for 'managed_node3'
15330 1726882277.36498: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.36501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.36505: Set connection var ansible_pipelining to False
15330 1726882277.36507: Set connection var ansible_timeout to 10
15330 1726882277.36509: Set connection var ansible_connection to ssh
15330 1726882277.36511: Set connection var ansible_shell_type to sh
15330 1726882277.36517: Set connection var ansible_shell_executable to /bin/sh
15330 1726882277.36520: Set connection var ansible_module_compression to ZIP_DEFLATED
15330 1726882277.36522: variable 'ansible_shell_executable' from source: unknown
15330 1726882277.36524: variable 'ansible_connection' from source: unknown
15330 1726882277.36526: variable 'ansible_module_compression' from source: unknown
15330 1726882277.36528: variable 'ansible_shell_type' from source: unknown
15330 1726882277.36530: variable 'ansible_shell_executable' from source: unknown
15330 1726882277.36531: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.36533: variable 'ansible_pipelining' from source: unknown
15330 1726882277.36535: variable 'ansible_timeout' from source: unknown
15330 1726882277.36537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.36818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15330 1726882277.36835: variable 'omit' from source: magic vars
15330 1726882277.36845: starting attempt loop
15330 1726882277.36853: running the handler
15330 1726882277.37165: variable '__network_connections_result' from source: set_fact
15330 1726882277.37222: handler run complete
15330 1726882277.37418: attempt loop complete, returning result
15330 1726882277.37699: _execute() done
15330 1726882277.37703: dumping result to json
15330 1726882277.37706: done dumping result, returning
15330 1726882277.37709: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-e4fe-1358-00000000004c]
15330 1726882277.37711: sending task result for task 12673a56-9f93-e4fe-1358-00000000004c
15330 1726882277.37779: done sending task result for task 12673a56-9f93-e4fe-1358-00000000004c
15330 1726882277.37782: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
15330 1726882277.37885: no more pending results, returning what we have
15330 1726882277.37891: results queue empty
15330 1726882277.37895: checking for any_errors_fatal
15330 1726882277.37901: done checking for any_errors_fatal
15330 1726882277.37901: checking for max_fail_percentage
15330 1726882277.37903: done checking for max_fail_percentage
15330 1726882277.37904: checking to see if all hosts have failed and the running result is not ok
15330 1726882277.37905: done checking to see if all hosts have failed
15330 1726882277.37906: getting the remaining hosts for this loop
15330 1726882277.37907: done getting the remaining hosts for this loop
15330 1726882277.37912: getting the next task for host managed_node3
15330 1726882277.37917: done getting next task for host managed_node3
15330 1726882277.37921: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
15330 1726882277.37923: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882277.37933: getting variables
15330 1726882277.37935: in VariableManager get_vars()
15330 1726882277.37974: Calling all_inventory to load vars for managed_node3
15330 1726882277.37977: Calling groups_inventory to load vars for managed_node3
15330 1726882277.37979: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882277.38295: Calling all_plugins_play to load vars for managed_node3
15330 1726882277.38301: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882277.38306: Calling groups_plugins_play to load vars for managed_node3
15330 1726882277.41827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882277.45029: done with get_vars()
15330 1726882277.45056: done getting variables
15330 1726882277.45425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:31:17 -0400 (0:00:00.133) 0:00:26.660 ******
15330 1726882277.45457: entering _queue_task() for managed_node3/debug
15330 1726882277.46227: worker is 1 (out of 1 available)
15330 1726882277.46237: exiting _queue_task() for managed_node3/debug
15330 1726882277.46248: done queuing things up, now waiting for results queue to drain
15330 1726882277.46249: waiting for pending results...
15330 1726882277.46519: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
15330 1726882277.46900: in run() - task 12673a56-9f93-e4fe-1358-00000000004d
15330 1726882277.46906: variable 'ansible_search_path' from source: unknown
15330 1726882277.46910: variable 'ansible_search_path' from source: unknown
15330 1726882277.46940: calling self._execute()
15330 1726882277.47070: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.47146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.47162: variable 'omit' from source: magic vars
15330 1726882277.48079: variable 'ansible_distribution_major_version' from source: facts
15330 1726882277.48132: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882277.48144: variable 'omit' from source: magic vars
15330 1726882277.48299: variable 'omit' from source: magic vars
15330 1726882277.48354: variable 'omit' from source: magic vars
15330 1726882277.48450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15330 1726882277.48491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15330 1726882277.48576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15330 1726882277.48772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882277.48775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882277.48778: variable 'inventory_hostname' from source: host vars for 'managed_node3'
15330 1726882277.48780: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.48782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.48955: Set connection var ansible_pipelining to False
15330 1726882277.49009: Set connection var ansible_timeout to 10
15330 1726882277.49199: Set connection var ansible_connection to ssh
15330 1726882277.49204: Set connection var ansible_shell_type to sh
15330 1726882277.49206: Set connection var ansible_shell_executable to /bin/sh
15330 1726882277.49208: Set connection var ansible_module_compression to ZIP_DEFLATED
15330 1726882277.49210: variable 'ansible_shell_executable' from source: unknown
15330 1726882277.49212: variable 'ansible_connection' from source: unknown
15330 1726882277.49216: variable 'ansible_module_compression' from source: unknown
15330 1726882277.49218: variable 'ansible_shell_type' from source: unknown
15330 1726882277.49220: variable 'ansible_shell_executable' from source: unknown
15330 1726882277.49222: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.49224: variable 'ansible_pipelining' from source: unknown
15330 1726882277.49226: variable 'ansible_timeout' from source: unknown
15330 1726882277.49228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.49506: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15330 1726882277.49598: variable 'omit' from source: magic vars
15330 1726882277.49601: starting attempt loop
15330 1726882277.49603: running the handler
15330 1726882277.49781: variable '__network_connections_result' from source: set_fact
15330 1726882277.49785: variable '__network_connections_result' from source: set_fact
15330 1726882277.50297: handler run complete
15330 1726882277.50300: attempt loop complete, returning result
15330 1726882277.50303: _execute() done
15330 1726882277.50305: dumping result to json
15330 1726882277.50307: done dumping result, returning
15330 1726882277.50310: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-e4fe-1358-00000000004d]
15330 1726882277.50312: sending task result for task 12673a56-9f93-e4fe-1358-00000000004d
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "LSR-TST-br31",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
15330 1726882277.50528: no more pending results, returning what we have
15330 1726882277.50532: results queue empty
15330 1726882277.50533: checking for any_errors_fatal
15330 1726882277.50539: done checking for any_errors_fatal
15330 1726882277.50540: checking for max_fail_percentage
15330 1726882277.50542: done checking for max_fail_percentage
15330 1726882277.50543: checking to see if all hosts have failed and the running result is not ok
15330 1726882277.50546: done checking to see if all hosts have failed
15330 1726882277.50546: getting the remaining hosts for this loop
15330 1726882277.50548: done getting the remaining hosts for this loop
15330 1726882277.50552: getting the next task for host managed_node3
15330 1726882277.50558: done getting next task for host managed_node3
15330 1726882277.50562: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
15330 1726882277.50564: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882277.50575: getting variables
15330 1726882277.50577: in VariableManager get_vars()
15330 1726882277.50836: Calling all_inventory to load vars for managed_node3
15330 1726882277.50839: Calling groups_inventory to load vars for managed_node3
15330 1726882277.50841: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882277.50852: Calling all_plugins_play to load vars for managed_node3
15330 1726882277.50856: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882277.50859: Calling groups_plugins_play to load vars for managed_node3
15330 1726882277.51485: done sending task result for task 12673a56-9f93-e4fe-1358-00000000004d
15330 1726882277.51492: WORKER PROCESS EXITING
15330 1726882277.53612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882277.56037: done with get_vars()
15330 1726882277.56068: done getting variables
15330 1726882277.56140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:31:17 -0400 (0:00:00.107) 0:00:26.767 ******
15330 1726882277.56180: entering _queue_task() for managed_node3/debug
15330 1726882277.56846: worker is 1 (out of 1 available)
15330 1726882277.56856: exiting _queue_task() for managed_node3/debug
15330 1726882277.56866: done queuing things up, now waiting for results queue to drain
15330 1726882277.56867: waiting for pending results...
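The `module_args` logged in the `__network_connections_result` above suggest the role was invoked with a single connection being taken down. A minimal sketch of a playbook that would produce an equivalent invocation, assuming variable values reconstructed from the logged arguments (the actual test playbook is not shown in this log):

```yaml
# Hypothetical reconstruction from the logged module_args; not the
# actual test playbook, which this log does not include.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # matches "provider": "nm" in the log
        network_connections:
          - name: LSR-TST-br31        # matches the logged connection name
            state: down               # matches "state": "down"
```

The `"changed": true` in the result indicates the provider actually modified the connection state, while the empty `stderr_lines` explains the earlier "Show stderr messages" task printing only an empty string.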
15330 1726882277.57062: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
15330 1726882277.57184: in run() - task 12673a56-9f93-e4fe-1358-00000000004e
15330 1726882277.57215: variable 'ansible_search_path' from source: unknown
15330 1726882277.57224: variable 'ansible_search_path' from source: unknown
15330 1726882277.57272: calling self._execute()
15330 1726882277.57375: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.57424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.57439: variable 'omit' from source: magic vars
15330 1726882277.57940: variable 'ansible_distribution_major_version' from source: facts
15330 1726882277.57958: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882277.58157: variable 'network_state' from source: role '' defaults
15330 1726882277.58174: Evaluated conditional (network_state != {}): False
15330 1726882277.58182: when evaluation is False, skipping this task
15330 1726882277.58198: _execute() done
15330 1726882277.58207: dumping result to json
15330 1726882277.58215: done dumping result, returning
15330 1726882277.58239: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-e4fe-1358-00000000004e]
15330 1726882277.58242: sending task result for task 12673a56-9f93-e4fe-1358-00000000004e
15330 1726882277.58418: done sending task result for task 12673a56-9f93-e4fe-1358-00000000004e
15330 1726882277.58421: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
15330 1726882277.58518: no more pending results, returning what we have
15330 1726882277.58522: results queue empty
15330 1726882277.58523: checking for any_errors_fatal
15330 1726882277.58533: done checking for any_errors_fatal
15330 1726882277.58534: checking for max_fail_percentage
15330 1726882277.58536: done checking for max_fail_percentage
15330 1726882277.58537: checking to see if all hosts have failed and the running result is not ok
15330 1726882277.58538: done checking to see if all hosts have failed
15330 1726882277.58538: getting the remaining hosts for this loop
15330 1726882277.58540: done getting the remaining hosts for this loop
15330 1726882277.58543: getting the next task for host managed_node3
15330 1726882277.58549: done getting next task for host managed_node3
15330 1726882277.58553: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
15330 1726882277.58555: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15330 1726882277.58707: getting variables
15330 1726882277.58709: in VariableManager get_vars()
15330 1726882277.58741: Calling all_inventory to load vars for managed_node3
15330 1726882277.58744: Calling groups_inventory to load vars for managed_node3
15330 1726882277.58746: Calling all_plugins_inventory to load vars for managed_node3
15330 1726882277.58754: Calling all_plugins_play to load vars for managed_node3
15330 1726882277.58757: Calling groups_plugins_inventory to load vars for managed_node3
15330 1726882277.58760: Calling groups_plugins_play to load vars for managed_node3
15330 1726882277.60472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15330 1726882277.65014: done with get_vars()
15330 1726882277.65047: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:31:17 -0400 (0:00:00.092) 0:00:26.859 ******
15330 1726882277.65399: entering _queue_task() for managed_node3/ping
15330 1726882277.66164: worker is 1 (out of 1 available)
15330 1726882277.66176: exiting _queue_task() for managed_node3/ping
15330 1726882277.66185: done queuing things up, now waiting for results queue to drain
15330 1726882277.66190: waiting for pending results...
15330 1726882277.66543: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
15330 1726882277.66856: in run() - task 12673a56-9f93-e4fe-1358-00000000004f
15330 1726882277.66916: variable 'ansible_search_path' from source: unknown
15330 1726882277.66920: variable 'ansible_search_path' from source: unknown
15330 1726882277.67073: calling self._execute()
15330 1726882277.67165: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.67292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.67303: variable 'omit' from source: magic vars
15330 1726882277.68352: variable 'ansible_distribution_major_version' from source: facts
15330 1726882277.68360: Evaluated conditional (ansible_distribution_major_version != '6'): True
15330 1726882277.68502: variable 'omit' from source: magic vars
15330 1726882277.68697: variable 'omit' from source: magic vars
15330 1726882277.68701: variable 'omit' from source: magic vars
15330 1726882277.68872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15330 1726882277.68945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15330 1726882277.69062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15330 1726882277.69084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882277.69097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15330 1726882277.69292: variable 'inventory_hostname' from source: host vars for 'managed_node3'
15330 1726882277.69297: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.69362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.69676: Set connection var ansible_pipelining to False
15330 1726882277.69679: Set connection var ansible_timeout to 10
15330 1726882277.69681: Set connection var ansible_connection to ssh
15330 1726882277.69685: Set connection var ansible_shell_type to sh
15330 1726882277.69689: Set connection var ansible_shell_executable to /bin/sh
15330 1726882277.69691: Set connection var ansible_module_compression to ZIP_DEFLATED
15330 1726882277.69816: variable 'ansible_shell_executable' from source: unknown
15330 1726882277.69897: variable 'ansible_connection' from source: unknown
15330 1726882277.69901: variable 'ansible_module_compression' from source: unknown
15330 1726882277.69904: variable 'ansible_shell_type' from source: unknown
15330 1726882277.69907: variable 'ansible_shell_executable' from source: unknown
15330 1726882277.69992: variable 'ansible_host' from source: host vars for 'managed_node3'
15330 1726882277.70003: variable 'ansible_pipelining' from source: unknown
15330 1726882277.70006: variable 'ansible_timeout' from source: unknown
15330 1726882277.70011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
15330 1726882277.70498: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
15330 1726882277.70509: variable 'omit' from source: magic vars
15330 1726882277.70547: starting attempt loop
15330 1726882277.70551: running the handler
15330 1726882277.70553: _low_level_execute_command(): starting
15330 1726882277.70652: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15330 1726882277.72069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<<
15330 1726882277.72328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882277.72331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
15330 1726882277.72334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<<
15330 1726882277.72336: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
15330 1726882277.72340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882277.72343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<<
15330 1726882277.72378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15330 1726882277.72384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15330 1726882277.74045: stdout chunk (state=3): >>>/root <<<
15330 1726882277.74192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15330 1726882277.74201: stdout chunk (state=3): >>><<<
15330 1726882277.74284: stderr chunk (state=3): >>><<<
15330 1726882277.74437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15330 1726882277.74462: _low_level_execute_command(): starting
15330 1726882277.74469: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871 `" && echo ansible-tmp-1726882277.7444184-16533-182850924613871="` echo /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871 `" ) && sleep 0'
15330 1726882277.75946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15330 1726882277.75962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15330 1726882277.75979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15330 1726882277.76003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15330 1726882277.76160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882277.76300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<<
15330 1726882277.76315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15330 1726882277.76459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15330 1726882277.78371: stdout chunk (state=3): >>>ansible-tmp-1726882277.7444184-16533-182850924613871=/root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871 <<<
15330 1726882277.78549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15330 1726882277.78589: stderr chunk (state=3): >>><<<
15330 1726882277.78759: stdout chunk (state=3): >>><<<
15330 1726882277.78762: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882277.7444184-16533-182850924613871=/root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15330 1726882277.78765: variable 'ansible_module_compression' from source: unknown
15330 1726882277.78882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
15330 1726882277.78928: variable 'ansible_facts' from source: unknown
15330 1726882277.79201: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py
15330 1726882277.79424: Sending initial data
15330 1726882277.79428: Sent initial data (153 bytes)
15330 1726882277.80612: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<<
15330 1726882277.80711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15330 1726882277.80861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15330 1726882277.82382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
15330 1726882277.82400: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
15330 1726882277.82481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py" <<<
15330 1726882277.82494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdfd6415i /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py <<<
15330 1726882277.82517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdfd6415i" to remote "/root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py" <<<
15330 1726882277.82532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py" <<<
15330 1726882277.83932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15330 1726882277.84080: stderr chunk (state=3): >>><<<
15330 1726882277.84083: stdout chunk (state=3): >>><<<
15330 1726882277.84086: done transferring module to remote
15330 1726882277.84090: _low_level_execute_command(): starting
15330 1726882277.84096: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/ /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py && sleep 0'
15330 1726882277.85237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
15330 1726882277.85241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15330 1726882277.85243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882277.85245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15330 1726882277.85247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882277.85412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<<
15330 1726882277.85429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15330 1726882277.85613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15330 1726882277.87189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15330 1726882277.87221: stderr chunk (state=3): >>><<<
15330 1726882277.87236: stdout chunk (state=3): >>><<<
15330 1726882277.87434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15330 1726882277.87437: _low_level_execute_command(): starting
15330 1726882277.87444: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/AnsiballZ_ping.py && sleep 0'
15330 1726882277.88425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15330 1726882277.88437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15330 1726882277.88508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<<
15330 1726882277.88533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15330 1726882277.88709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41'
debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882277.88713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882277.88798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882278.03762: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15330 1726882278.04982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882278.05015: stderr chunk (state=3): >>><<< 15330 1726882278.05018: stdout chunk (state=3): >>><<< 15330 1726882278.05038: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
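Once `AnsiballZ_ping.py` runs, the module's entire result is the single JSON object on stdout (`{"ping": "pong", …}`), which the controller must separate from surrounding SSH noise; in Ansible this happens roughly in `ActionBase._parse_returned_data`. A much-simplified sketch of the idea (a hypothetical helper, not the real parser):

```python
import json

def extract_module_result(stdout: str) -> dict:
    """Pull the first JSON object out of a module's raw stdout.

    Hypothetical helper; Ansible's real parsing is more involved and also
    handles warnings, deprecations, and junk before/after the JSON.
    """
    for line in stdout.splitlines():
        line = line.strip()
        if line.startswith("{"):
            return json.loads(line)
    raise ValueError("no JSON result found in module stdout")

raw = '\n{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}\n'
result = extract_module_result(raw)
print(result["ping"])  # pong
```

This is why the log shows the stdout chunk verbatim first and only afterwards reports `done with _execute_module`: the JSON has to survive the transport untouched.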
15330 1726882278.05055: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882278.05063: _low_level_execute_command(): starting 15330 1726882278.05068: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882277.7444184-16533-182850924613871/ > /dev/null 2>&1 && sleep 0' 15330 1726882278.05478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882278.05511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882278.05515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.05517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882278.05519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.05559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882278.05573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882278.05639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882278.07557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882278.07561: stdout chunk (state=3): >>><<< 15330 1726882278.07564: stderr chunk (state=3): >>><<< 15330 1726882278.07571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15330 1726882278.07573: handler run complete 15330 1726882278.07579: attempt loop complete, returning result 15330 1726882278.07581: _execute() done 15330 1726882278.07584: dumping result to json 15330 1726882278.07591: done dumping result, returning 15330 1726882278.07607: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-e4fe-1358-00000000004f] 15330 1726882278.07610: sending task result for task 12673a56-9f93-e4fe-1358-00000000004f 15330 1726882278.07738: done sending task result for task 12673a56-9f93-e4fe-1358-00000000004f 15330 1726882278.07741: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 15330 1726882278.07808: no more pending results, returning what we have 15330 1726882278.07811: results queue empty 15330 1726882278.07812: checking for any_errors_fatal 15330 1726882278.07819: done checking for any_errors_fatal 15330 1726882278.07819: checking for max_fail_percentage 15330 1726882278.07821: done checking for max_fail_percentage 15330 1726882278.07822: checking to see if all hosts have failed and the running result is not ok 15330 1726882278.07823: done checking to see if all hosts have failed 15330 1726882278.07823: getting the remaining hosts for this loop 15330 1726882278.07825: done getting the remaining hosts for this loop 15330 1726882278.07828: getting the next task for host managed_node3 15330 1726882278.07835: done getting next task for host managed_node3 15330 1726882278.07838: ^ task is: TASK: meta (role_complete) 15330 1726882278.07840: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882278.07848: getting variables 15330 1726882278.07850: in VariableManager get_vars() 15330 1726882278.07895: Calling all_inventory to load vars for managed_node3 15330 1726882278.07898: Calling groups_inventory to load vars for managed_node3 15330 1726882278.07900: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882278.07918: Calling all_plugins_play to load vars for managed_node3 15330 1726882278.07921: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882278.07924: Calling groups_plugins_play to load vars for managed_node3 15330 1726882278.08745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882278.09858: done with get_vars() 15330 1726882278.09879: done getting variables 15330 1726882278.09960: done queuing things up, now waiting for results queue to drain 15330 1726882278.09964: results queue empty 15330 1726882278.09965: checking for any_errors_fatal 15330 1726882278.09967: done checking for any_errors_fatal 15330 1726882278.09968: checking for max_fail_percentage 15330 1726882278.09969: done checking for max_fail_percentage 15330 1726882278.09970: checking to see if all hosts have failed and the running result is not ok 15330 1726882278.09970: done checking to see if all hosts have failed 15330 1726882278.09971: getting the remaining hosts for this loop 15330 1726882278.09972: done getting the remaining hosts for this loop 15330 1726882278.09975: getting the next task for host managed_node3 15330 1726882278.09978: done getting next task for host managed_node3 15330 1726882278.09979: ^ task is: TASK: meta (flush_handlers) 15330 1726882278.09981: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15330 1726882278.09983: getting variables 15330 1726882278.09984: in VariableManager get_vars() 15330 1726882278.10006: Calling all_inventory to load vars for managed_node3 15330 1726882278.10008: Calling groups_inventory to load vars for managed_node3 15330 1726882278.10010: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882278.10016: Calling all_plugins_play to load vars for managed_node3 15330 1726882278.10018: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882278.10021: Calling groups_plugins_play to load vars for managed_node3 15330 1726882278.11090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882278.13073: done with get_vars() 15330 1726882278.13296: done getting variables 15330 1726882278.13349: in VariableManager get_vars() 15330 1726882278.13362: Calling all_inventory to load vars for managed_node3 15330 1726882278.13365: Calling groups_inventory to load vars for managed_node3 15330 1726882278.13367: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882278.13372: Calling all_plugins_play to load vars for managed_node3 15330 1726882278.13374: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882278.13377: Calling groups_plugins_play to load vars for managed_node3 15330 1726882278.14619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882278.16759: done with get_vars() 15330 1726882278.16789: done queuing things up, now waiting for results queue to drain 15330 1726882278.16791: results queue empty 15330 1726882278.16792: checking for any_errors_fatal 15330 1726882278.16795: done checking for any_errors_fatal 15330 1726882278.16796: checking for max_fail_percentage 15330 1726882278.16797: done checking for max_fail_percentage 15330 1726882278.16797: checking to see if all hosts have failed and 
the running result is not ok 15330 1726882278.16798: done checking to see if all hosts have failed 15330 1726882278.16799: getting the remaining hosts for this loop 15330 1726882278.16800: done getting the remaining hosts for this loop 15330 1726882278.16803: getting the next task for host managed_node3 15330 1726882278.16811: done getting next task for host managed_node3 15330 1726882278.16814: ^ task is: TASK: meta (flush_handlers) 15330 1726882278.16815: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882278.16818: getting variables 15330 1726882278.16819: in VariableManager get_vars() 15330 1726882278.16837: Calling all_inventory to load vars for managed_node3 15330 1726882278.16839: Calling groups_inventory to load vars for managed_node3 15330 1726882278.16848: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882278.16854: Calling all_plugins_play to load vars for managed_node3 15330 1726882278.16857: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882278.16862: Calling groups_plugins_play to load vars for managed_node3 15330 1726882278.17988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882278.19535: done with get_vars() 15330 1726882278.19562: done getting variables 15330 1726882278.19623: in VariableManager get_vars() 15330 1726882278.19643: Calling all_inventory to load vars for managed_node3 15330 1726882278.19646: Calling groups_inventory to load vars for managed_node3 15330 1726882278.19648: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882278.19653: Calling all_plugins_play to load vars for managed_node3 15330 1726882278.19656: Calling 
groups_plugins_inventory to load vars for managed_node3 15330 1726882278.19660: Calling groups_plugins_play to load vars for managed_node3 15330 1726882278.20998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882278.22883: done with get_vars() 15330 1726882278.22916: done queuing things up, now waiting for results queue to drain 15330 1726882278.22923: results queue empty 15330 1726882278.22925: checking for any_errors_fatal 15330 1726882278.22926: done checking for any_errors_fatal 15330 1726882278.22927: checking for max_fail_percentage 15330 1726882278.22928: done checking for max_fail_percentage 15330 1726882278.22929: checking to see if all hosts have failed and the running result is not ok 15330 1726882278.22930: done checking to see if all hosts have failed 15330 1726882278.22930: getting the remaining hosts for this loop 15330 1726882278.22931: done getting the remaining hosts for this loop 15330 1726882278.22934: getting the next task for host managed_node3 15330 1726882278.22938: done getting next task for host managed_node3 15330 1726882278.22939: ^ task is: None 15330 1726882278.22940: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882278.22941: done queuing things up, now waiting for results queue to drain 15330 1726882278.22942: results queue empty 15330 1726882278.22944: checking for any_errors_fatal 15330 1726882278.22945: done checking for any_errors_fatal 15330 1726882278.22946: checking for max_fail_percentage 15330 1726882278.22947: done checking for max_fail_percentage 15330 1726882278.22948: checking to see if all hosts have failed and the running result is not ok 15330 1726882278.22949: done checking to see if all hosts have failed 15330 1726882278.22950: getting the next task for host managed_node3 15330 1726882278.22953: done getting next task for host managed_node3 15330 1726882278.22953: ^ task is: None 15330 1726882278.22955: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882278.23211: in VariableManager get_vars() 15330 1726882278.23229: done with get_vars() 15330 1726882278.23236: in VariableManager get_vars() 15330 1726882278.23246: done with get_vars() 15330 1726882278.23251: variable 'omit' from source: magic vars 15330 1726882278.23285: in VariableManager get_vars() 15330 1726882278.23298: done with get_vars() 15330 1726882278.23320: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15330 1726882278.23720: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882278.23855: getting the remaining hosts for this loop 15330 1726882278.23857: done getting the remaining hosts for this loop 15330 1726882278.23860: getting the next task for host managed_node3 15330 1726882278.23862: done getting next task for host managed_node3 15330 1726882278.23864: ^ task is: TASK: Gathering Facts 15330 1726882278.23865: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882278.23868: getting variables 15330 1726882278.23868: in VariableManager get_vars() 15330 1726882278.23880: Calling all_inventory to load vars for managed_node3 15330 1726882278.23884: Calling groups_inventory to load vars for managed_node3 15330 1726882278.23887: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882278.23895: Calling all_plugins_play to load vars for managed_node3 15330 1726882278.23898: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882278.23902: Calling groups_plugins_play to load vars for managed_node3 15330 1726882278.25734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882278.34094: done with get_vars() 15330 1726882278.34118: done getting variables 15330 1726882278.34148: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:31:18 -0400 (0:00:00.687) 0:00:27.547 ****** 15330 1726882278.34164: entering _queue_task() for managed_node3/gather_facts 15330 1726882278.34453: worker is 1 (out of 1 available) 15330 1726882278.34465: exiting _queue_task() for managed_node3/gather_facts 15330 1726882278.34475: done queuing things up, now waiting for results queue to drain 15330 1726882278.34476: waiting for pending results... 
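The task banner above ends with two durations, `(0:00:00.687) 0:00:27.547` — the previous task's elapsed time and the cumulative run time, in the format produced by a timing callback (likely `profile_tasks`). A hypothetical parser for that suffix:

```python
import re

# Hypothetical parser for the "(H:MM:SS.mmm)  H:MM:SS.mmm" suffix that a
# timing callback appends to each task banner; not part of Ansible itself.
TIMING_RE = re.compile(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)")

def parse_task_timing(line: str) -> tuple[float, float]:
    """Return (task_elapsed_seconds, cumulative_seconds) from a banner line."""
    h1, m1, s1, h2, m2, s2 = TIMING_RE.search(line).groups()
    to_s = lambda h, m, s: int(h) * 3600 + int(m) * 60 + float(s)
    return to_s(h1, m1, s1), to_s(h2, m2, s2)

banner = "Friday 20 September 2024 21:31:18 -0400 (0:00:00.687) 0:00:27.547"
print(parse_task_timing(banner))  # (0.687, 27.547)
```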
15330 1726882278.34678: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882278.34774: in run() - task 12673a56-9f93-e4fe-1358-000000000382 15330 1726882278.34785: variable 'ansible_search_path' from source: unknown 15330 1726882278.34829: calling self._execute() 15330 1726882278.34933: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882278.34937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882278.34947: variable 'omit' from source: magic vars 15330 1726882278.35329: variable 'ansible_distribution_major_version' from source: facts 15330 1726882278.35332: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882278.35336: variable 'omit' from source: magic vars 15330 1726882278.35399: variable 'omit' from source: magic vars 15330 1726882278.35412: variable 'omit' from source: magic vars 15330 1726882278.35452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882278.35477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882278.35506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882278.35534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882278.35537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882278.35589: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882278.35607: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882278.35611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882278.35716: Set connection var ansible_pipelining to False 15330 1726882278.35747: Set 
connection var ansible_timeout to 10 15330 1726882278.35751: Set connection var ansible_connection to ssh 15330 1726882278.35754: Set connection var ansible_shell_type to sh 15330 1726882278.35756: Set connection var ansible_shell_executable to /bin/sh 15330 1726882278.35760: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882278.35982: variable 'ansible_shell_executable' from source: unknown 15330 1726882278.35985: variable 'ansible_connection' from source: unknown 15330 1726882278.35991: variable 'ansible_module_compression' from source: unknown 15330 1726882278.36017: variable 'ansible_shell_type' from source: unknown 15330 1726882278.36020: variable 'ansible_shell_executable' from source: unknown 15330 1726882278.36022: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882278.36025: variable 'ansible_pipelining' from source: unknown 15330 1726882278.36027: variable 'ansible_timeout' from source: unknown 15330 1726882278.36029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882278.36031: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882278.36034: variable 'omit' from source: magic vars 15330 1726882278.36247: starting attempt loop 15330 1726882278.36250: running the handler 15330 1726882278.36253: variable 'ansible_facts' from source: unknown 15330 1726882278.36255: _low_level_execute_command(): starting 15330 1726882278.36257: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882278.37029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
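The first `_low_level_execute_command()` for this task sends the trivial probe `echo ~ && sleep 0` to discover the remote user's home directory (the log later shows `/root` coming back on stdout). The same probe can be exercised locally against `/bin/sh` to see what the controller expects back; this sketch runs it in-process rather than over SSH:

```python
import subprocess

# Mimic Ansible's remote-home probe locally: the shell expands ~ to the
# current user's home directory, and `sleep 0` is harmless padding.
out = subprocess.run(["/bin/sh", "-c", "echo ~ && sleep 0"],
                     capture_output=True, text=True, check=True)
home = out.stdout.strip()
print(home)
```

Over SSH the only differences are the connection wrapper and the ControlMaster multiplexing visible in the surrounding stderr chunks.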
15330 1726882278.37049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882278.37064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.37100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882278.37142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882278.37201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882278.38850: stdout chunk (state=3): >>>/root <<< 15330 1726882278.38966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882278.38973: stdout chunk (state=3): >>><<< 15330 1726882278.38979: stderr chunk (state=3): >>><<< 15330 1726882278.39000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882278.39011: _low_level_execute_command(): starting 15330 1726882278.39016: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985 `" && echo ansible-tmp-1726882278.3899896-16559-97256600334985="` echo /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985 `" ) && sleep 0' 15330 1726882278.39457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882278.39460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882278.39463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.39466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882278.39476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.39530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882278.39533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882278.39574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882278.41415: stdout chunk (state=3): >>>ansible-tmp-1726882278.3899896-16559-97256600334985=/root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985 <<< 15330 1726882278.41532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882278.41543: stderr chunk (state=3): >>><<< 15330 1726882278.41552: stdout chunk (state=3): >>><<< 15330 1726882278.41567: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882278.3899896-16559-97256600334985=/root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882278.41595: variable 'ansible_module_compression' from source: unknown 15330 1726882278.41650: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882278.41712: variable 'ansible_facts' from source: unknown 15330 1726882278.41845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py 15330 1726882278.41978: Sending initial data 15330 1726882278.41982: Sent initial data (153 bytes) 15330 1726882278.42458: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882278.42461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882278.42465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.42468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.42513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882278.42520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882278.42577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882278.44205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882278.44216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882278.44272: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmphzse_zxv /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py <<< 15330 1726882278.44278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py" <<< 15330 1726882278.44345: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmphzse_zxv" to remote "/root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py" <<< 15330 1726882278.45629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882278.45696: stderr chunk (state=3): >>><<< 15330 1726882278.45700: stdout chunk (state=3): >>><<< 15330 1726882278.45702: done transferring module to remote 15330 1726882278.45705: _low_level_execute_command(): starting 15330 1726882278.45708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/ /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py && sleep 0' 15330 1726882278.46159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882278.46163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.46165: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882278.46168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882278.46170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.46217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882278.46226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882278.46272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882278.47969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882278.48036: stderr chunk (state=3): >>><<< 15330 1726882278.48040: stdout chunk (state=3): >>><<< 15330 1726882278.48147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882278.48150: _low_level_execute_command(): starting 15330 1726882278.48153: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/AnsiballZ_setup.py && sleep 0' 15330 1726882278.48903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882278.48958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882278.49025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882278.49068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 
1726882278.49127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.12337: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 1.01904296875, "5m": 0.5087890625, "15m": 0.236328125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_dns": 
{"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "18", "epoch": "1726882278", "epoch_int": "1726882278", "date": "2024-09-20", "time": "21:31:18", "iso8601_micro": "2024-09-21T01:31:18.765234Z", "iso8601": "2024-09-21T01:31:18Z", "iso8601_basic": "20240920T213118765234", "iso8601_basic_short": "20240920T213118", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": 
"vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3274, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": 
"Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_<<< 15330 1726882279.12353: stdout chunk (state=3): >>>uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 586, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805199360, "block_size": 4096, "block_total": 65519099, "block_available": 
63917285, "block_used": 1601814, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882279.14473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882279.14478: stdout chunk (state=3): >>><<< 15330 1726882279.14483: stderr chunk (state=3): >>><<< 15330 1726882279.14523: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 1.01904296875, "5m": 0.5087890625, "15m": 0.236328125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": 
"x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "18", "epoch": "1726882278", "epoch_int": "1726882278", "date": "2024-09-20", "time": "21:31:18", "iso8601_micro": "2024-09-21T01:31:18.765234Z", "iso8601": "2024-09-21T01:31:18Z", 
"iso8601_basic": "20240920T213118765234", "iso8601_basic_short": "20240920T213118", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", 
"loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", 
"127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3274, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 586, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805199360, "block_size": 4096, "block_total": 65519099, "block_available": 63917285, "block_used": 1601814, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882279.14950: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882279.14953: _low_level_execute_command(): starting 15330 1726882279.14957: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882278.3899896-16559-97256600334985/ > /dev/null 2>&1 && sleep 0' 15330 1726882279.16320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882279.16324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882279.16326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882279.16329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882279.16334: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882279.16336: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882279.16339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882279.16341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882279.16343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882279.16345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882279.16348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882279.16350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882279.16352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882279.16592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882279.16597: stderr chunk (state=3): >>>debug2: match found <<< 15330 1726882279.16603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882279.16607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882279.16610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882279.16710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.16900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.18719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882279.18723: stdout chunk (state=3): >>><<< 15330 1726882279.18725: stderr chunk (state=3): >>><<< 15330 1726882279.18753: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882279.18759: handler run complete 15330 1726882279.18889: variable 'ansible_facts' from source: unknown 15330 1726882279.19059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.19608: variable 'ansible_facts' from source: unknown 15330 1726882279.19611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.19615: attempt loop complete, returning result 15330 1726882279.19617: _execute() done 15330 1726882279.19619: dumping result to json 15330 1726882279.19622: done dumping result, returning 15330 1726882279.19624: done running TaskExecutor() for managed_node3/TASK: Gathering Facts 
[12673a56-9f93-e4fe-1358-000000000382] 15330 1726882279.19626: sending task result for task 12673a56-9f93-e4fe-1358-000000000382 ok: [managed_node3] 15330 1726882279.20635: no more pending results, returning what we have 15330 1726882279.20639: results queue empty 15330 1726882279.20640: checking for any_errors_fatal 15330 1726882279.20641: done checking for any_errors_fatal 15330 1726882279.20642: checking for max_fail_percentage 15330 1726882279.20643: done checking for max_fail_percentage 15330 1726882279.20644: checking to see if all hosts have failed and the running result is not ok 15330 1726882279.20645: done checking to see if all hosts have failed 15330 1726882279.20646: getting the remaining hosts for this loop 15330 1726882279.20647: done getting the remaining hosts for this loop 15330 1726882279.20742: getting the next task for host managed_node3 15330 1726882279.20748: done getting next task for host managed_node3 15330 1726882279.20750: ^ task is: TASK: meta (flush_handlers) 15330 1726882279.20752: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882279.20757: getting variables 15330 1726882279.20758: in VariableManager get_vars() 15330 1726882279.20782: Calling all_inventory to load vars for managed_node3 15330 1726882279.20784: Calling groups_inventory to load vars for managed_node3 15330 1726882279.20787: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.20971: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.20975: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.20978: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.21701: done sending task result for task 12673a56-9f93-e4fe-1358-000000000382 15330 1726882279.21705: WORKER PROCESS EXITING 15330 1726882279.23522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.25102: done with get_vars() 15330 1726882279.25190: done getting variables 15330 1726882279.25273: in VariableManager get_vars() 15330 1726882279.25289: Calling all_inventory to load vars for managed_node3 15330 1726882279.25292: Calling groups_inventory to load vars for managed_node3 15330 1726882279.25296: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.25301: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.25308: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.25312: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.27703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.29235: done with get_vars() 15330 1726882279.29259: done queuing things up, now waiting for results queue to drain 15330 1726882279.29261: results queue empty 15330 1726882279.29261: checking for any_errors_fatal 15330 1726882279.29264: done checking for any_errors_fatal 15330 1726882279.29265: checking for max_fail_percentage 15330 
1726882279.29265: done checking for max_fail_percentage 15330 1726882279.29266: checking to see if all hosts have failed and the running result is not ok 15330 1726882279.29266: done checking to see if all hosts have failed 15330 1726882279.29267: getting the remaining hosts for this loop 15330 1726882279.29267: done getting the remaining hosts for this loop 15330 1726882279.29269: getting the next task for host managed_node3 15330 1726882279.29273: done getting next task for host managed_node3 15330 1726882279.29274: ^ task is: TASK: Include the task 'delete_interface.yml' 15330 1726882279.29275: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882279.29277: getting variables 15330 1726882279.29278: in VariableManager get_vars() 15330 1726882279.29287: Calling all_inventory to load vars for managed_node3 15330 1726882279.29289: Calling groups_inventory to load vars for managed_node3 15330 1726882279.29290: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.29296: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.29298: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.29299: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.30647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.35030: done with get_vars() 15330 1726882279.35052: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:31:19 -0400 (0:00:01.010) 0:00:28.558 
****** 15330 1726882279.35257: entering _queue_task() for managed_node3/include_tasks 15330 1726882279.36072: worker is 1 (out of 1 available) 15330 1726882279.36084: exiting _queue_task() for managed_node3/include_tasks 15330 1726882279.36098: done queuing things up, now waiting for results queue to drain 15330 1726882279.36100: waiting for pending results... 15330 1726882279.37157: running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' 15330 1726882279.37509: in run() - task 12673a56-9f93-e4fe-1358-000000000052 15330 1726882279.37522: variable 'ansible_search_path' from source: unknown 15330 1726882279.37561: calling self._execute() 15330 1726882279.38050: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882279.38057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882279.38068: variable 'omit' from source: magic vars 15330 1726882279.39063: variable 'ansible_distribution_major_version' from source: facts 15330 1726882279.39067: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882279.39070: _execute() done 15330 1726882279.39073: dumping result to json 15330 1726882279.39075: done dumping result, returning 15330 1726882279.39078: done running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' [12673a56-9f93-e4fe-1358-000000000052] 15330 1726882279.39282: sending task result for task 12673a56-9f93-e4fe-1358-000000000052 15330 1726882279.39672: done sending task result for task 12673a56-9f93-e4fe-1358-000000000052 15330 1726882279.39676: WORKER PROCESS EXITING 15330 1726882279.39706: no more pending results, returning what we have 15330 1726882279.39711: in VariableManager get_vars() 15330 1726882279.39745: Calling all_inventory to load vars for managed_node3 15330 1726882279.39748: Calling groups_inventory to load vars for managed_node3 15330 1726882279.39751: Calling all_plugins_inventory to load vars for 
managed_node3 15330 1726882279.39762: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.39765: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.39767: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.43744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.48929: done with get_vars() 15330 1726882279.48959: variable 'ansible_search_path' from source: unknown 15330 1726882279.49041: we have included files to process 15330 1726882279.49068: generating all_blocks data 15330 1726882279.49070: done generating all_blocks data 15330 1726882279.49071: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15330 1726882279.49072: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15330 1726882279.49076: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15330 1726882279.49504: done processing included file 15330 1726882279.49507: iterating over new_blocks loaded from include file 15330 1726882279.49508: in VariableManager get_vars() 15330 1726882279.49520: done with get_vars() 15330 1726882279.49522: filtering new block on tags 15330 1726882279.49535: done filtering new block on tags 15330 1726882279.49538: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 15330 1726882279.49543: extending task lists for all hosts with included blocks 15330 1726882279.49582: done extending task lists 15330 1726882279.49583: done processing included files 15330 1726882279.49584: results queue empty 15330 1726882279.49585: checking 
for any_errors_fatal 15330 1726882279.49586: done checking for any_errors_fatal 15330 1726882279.49587: checking for max_fail_percentage 15330 1726882279.49588: done checking for max_fail_percentage 15330 1726882279.49589: checking to see if all hosts have failed and the running result is not ok 15330 1726882279.49590: done checking to see if all hosts have failed 15330 1726882279.49591: getting the remaining hosts for this loop 15330 1726882279.49592: done getting the remaining hosts for this loop 15330 1726882279.49596: getting the next task for host managed_node3 15330 1726882279.49600: done getting next task for host managed_node3 15330 1726882279.49603: ^ task is: TASK: Remove test interface if necessary 15330 1726882279.49605: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882279.49607: getting variables 15330 1726882279.49608: in VariableManager get_vars() 15330 1726882279.49618: Calling all_inventory to load vars for managed_node3 15330 1726882279.49620: Calling groups_inventory to load vars for managed_node3 15330 1726882279.49623: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.49628: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.49631: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.49633: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.50958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.53598: done with get_vars() 15330 1726882279.53686: done getting variables 15330 1726882279.53738: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:31:19 -0400 (0:00:00.185) 0:00:28.743 ****** 15330 1726882279.53770: entering _queue_task() for managed_node3/command 15330 1726882279.54721: worker is 1 (out of 1 available) 15330 1726882279.54731: exiting _queue_task() for managed_node3/command 15330 1726882279.54741: done queuing things up, now waiting for results queue to drain 15330 1726882279.54742: waiting for pending results... 
15330 1726882279.54986: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 15330 1726882279.55000: in run() - task 12673a56-9f93-e4fe-1358-000000000393 15330 1726882279.55022: variable 'ansible_search_path' from source: unknown 15330 1726882279.55031: variable 'ansible_search_path' from source: unknown 15330 1726882279.55072: calling self._execute() 15330 1726882279.55172: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882279.55190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882279.55299: variable 'omit' from source: magic vars 15330 1726882279.55650: variable 'ansible_distribution_major_version' from source: facts 15330 1726882279.55670: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882279.55681: variable 'omit' from source: magic vars 15330 1726882279.55729: variable 'omit' from source: magic vars 15330 1726882279.55832: variable 'interface' from source: set_fact 15330 1726882279.55863: variable 'omit' from source: magic vars 15330 1726882279.56000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882279.56066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882279.56257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882279.56288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882279.56383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882279.56386: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882279.56388: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882279.56391: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882279.56480: Set connection var ansible_pipelining to False 15330 1726882279.56508: Set connection var ansible_timeout to 10 15330 1726882279.56580: Set connection var ansible_connection to ssh 15330 1726882279.56599: Set connection var ansible_shell_type to sh 15330 1726882279.56614: Set connection var ansible_shell_executable to /bin/sh 15330 1726882279.56708: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882279.56711: variable 'ansible_shell_executable' from source: unknown 15330 1726882279.56713: variable 'ansible_connection' from source: unknown 15330 1726882279.56716: variable 'ansible_module_compression' from source: unknown 15330 1726882279.56719: variable 'ansible_shell_type' from source: unknown 15330 1726882279.56721: variable 'ansible_shell_executable' from source: unknown 15330 1726882279.56723: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882279.56726: variable 'ansible_pipelining' from source: unknown 15330 1726882279.56728: variable 'ansible_timeout' from source: unknown 15330 1726882279.56730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882279.56826: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882279.56846: variable 'omit' from source: magic vars 15330 1726882279.56856: starting attempt loop 15330 1726882279.56924: running the handler 15330 1726882279.56927: _low_level_execute_command(): starting 15330 1726882279.56930: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882279.57891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882279.57984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882279.58108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.58160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.59821: stdout chunk (state=3): >>>/root <<< 15330 1726882279.59933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882279.60053: stderr chunk (state=3): >>><<< 15330 1726882279.60057: stdout chunk (state=3): >>><<< 15330 1726882279.60208: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882279.60214: _low_level_execute_command(): starting 15330 1726882279.60218: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216 `" && echo ansible-tmp-1726882279.6009805-16601-54249709226216="` echo /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216 `" ) && sleep 0' 15330 1726882279.60948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882279.60976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.61099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.62899: stdout chunk (state=3): >>>ansible-tmp-1726882279.6009805-16601-54249709226216=/root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216 <<< 15330 1726882279.62974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882279.63026: stderr chunk (state=3): >>><<< 15330 1726882279.63040: stdout chunk (state=3): >>><<< 15330 1726882279.63102: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882279.6009805-16601-54249709226216=/root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882279.63141: variable 'ansible_module_compression' from source: unknown 15330 1726882279.63264: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15330 1726882279.63279: variable 'ansible_facts' from source: unknown 15330 1726882279.63512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py 15330 1726882279.63955: Sending initial data 15330 1726882279.63958: Sent initial data (155 bytes) 15330 1726882279.64670: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882279.64683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882279.64699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882279.64712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15330 1726882279.64754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882279.64776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.64840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.66344: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882279.66381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882279.66435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpn_m0vlsc /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py <<< 15330 1726882279.66439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py" <<< 15330 1726882279.66477: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpn_m0vlsc" to remote "/root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py" <<< 15330 1726882279.67733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882279.67737: stdout chunk (state=3): >>><<< 15330 1726882279.67740: stderr chunk (state=3): >>><<< 15330 1726882279.67742: done transferring module to remote 15330 1726882279.67744: _low_level_execute_command(): starting 15330 1726882279.67751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/ /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py && sleep 0' 15330 1726882279.68679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882279.68705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882279.68766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882279.68807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882279.68810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.68875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.70617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882279.70631: stderr chunk (state=3): >>><<< 15330 1726882279.70634: stdout chunk (state=3): >>><<< 15330 1726882279.70653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882279.70661: _low_level_execute_command(): starting 15330 1726882279.70664: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/AnsiballZ_command.py && sleep 0' 15330 1726882279.71287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882279.71338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882279.71341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882279.71343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882279.71346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882279.71358: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882279.71450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting 
O_NONBLOCK <<< 15330 1726882279.71471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.71564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.87092: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 21:31:19.861877", "end": "2024-09-20 21:31:19.869168", "delta": "0:00:00.007291", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15330 1726882279.88444: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. <<< 15330 1726882279.88473: stderr chunk (state=3): >>><<< 15330 1726882279.88476: stdout chunk (state=3): >>><<< 15330 1726882279.88497: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 21:31:19.861877", "end": "2024-09-20 21:31:19.869168", "delta": "0:00:00.007291", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. 15330 1726882279.88528: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882279.88535: _low_level_execute_command(): starting 15330 1726882279.88540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882279.6009805-16601-54249709226216/ > 
/dev/null 2>&1 && sleep 0' 15330 1726882279.89500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882279.89507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882279.89510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882279.91278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882279.91329: stderr chunk (state=3): >>><<< 15330 1726882279.91332: stdout chunk (state=3): >>><<< 15330 1726882279.91348: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882279.91355: handler run complete 15330 1726882279.91395: Evaluated conditional (False): False 15330 1726882279.91414: attempt loop complete, returning result 15330 1726882279.91418: _execute() done 15330 1726882279.91420: dumping result to json 15330 1726882279.91422: done dumping result, returning 15330 1726882279.91622: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [12673a56-9f93-e4fe-1358-000000000393] 15330 1726882279.91625: sending task result for task 12673a56-9f93-e4fe-1358-000000000393 15330 1726882279.91690: done sending task result for task 12673a56-9f93-e4fe-1358-000000000393 15330 1726882279.91695: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007291", "end": "2024-09-20 21:31:19.869168", "rc": 1, "start": "2024-09-20 21:31:19.861877" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 15330 1726882279.91757: no more pending results, returning what we have 15330 1726882279.91760: results queue empty 15330 1726882279.91760: checking for any_errors_fatal 15330 1726882279.91762: done checking for any_errors_fatal 15330 1726882279.91763: checking for max_fail_percentage 15330 1726882279.91764: done checking for max_fail_percentage 15330 1726882279.91765: checking to see if all hosts have failed and the running result is not ok 15330 1726882279.91765: done checking to see if all hosts have failed 15330 1726882279.91766: getting the remaining hosts for this loop 15330 1726882279.91767: done getting the remaining hosts for this loop 15330 1726882279.91770: getting the next task for host managed_node3 15330 1726882279.91776: done getting next task for host managed_node3 15330 1726882279.91778: ^ task is: TASK: meta (flush_handlers) 15330 1726882279.91779: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882279.91783: getting variables 15330 1726882279.91784: in VariableManager get_vars() 15330 1726882279.91811: Calling all_inventory to load vars for managed_node3 15330 1726882279.91813: Calling groups_inventory to load vars for managed_node3 15330 1726882279.91816: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.91825: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.91828: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.91831: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.94115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.95383: done with get_vars() 15330 1726882279.95407: done getting variables 15330 1726882279.95458: in VariableManager get_vars() 15330 1726882279.95465: Calling all_inventory to load vars for managed_node3 15330 1726882279.95467: Calling groups_inventory to load vars for managed_node3 15330 1726882279.95472: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.95481: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.95483: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.95489: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.96830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882279.98310: done with get_vars() 15330 1726882279.98332: done queuing things up, now waiting for results queue to drain 15330 1726882279.98334: results queue empty 15330 1726882279.98334: checking for any_errors_fatal 15330 1726882279.98337: done checking for any_errors_fatal 15330 1726882279.98338: checking for max_fail_percentage 15330 1726882279.98339: done checking for max_fail_percentage 15330 1726882279.98339: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882279.98340: done checking to see if all hosts have failed 15330 1726882279.98340: getting the remaining hosts for this loop 15330 1726882279.98341: done getting the remaining hosts for this loop 15330 1726882279.98343: getting the next task for host managed_node3 15330 1726882279.98346: done getting next task for host managed_node3 15330 1726882279.98347: ^ task is: TASK: meta (flush_handlers) 15330 1726882279.98348: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882279.98350: getting variables 15330 1726882279.98351: in VariableManager get_vars() 15330 1726882279.98357: Calling all_inventory to load vars for managed_node3 15330 1726882279.98358: Calling groups_inventory to load vars for managed_node3 15330 1726882279.98360: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882279.98363: Calling all_plugins_play to load vars for managed_node3 15330 1726882279.98365: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882279.98366: Calling groups_plugins_play to load vars for managed_node3 15330 1726882279.99224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882280.00430: done with get_vars() 15330 1726882280.00452: done getting variables 15330 1726882280.00507: in VariableManager get_vars() 15330 1726882280.00517: Calling all_inventory to load vars for managed_node3 15330 1726882280.00519: Calling groups_inventory to load vars for managed_node3 15330 1726882280.00522: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882280.00526: Calling all_plugins_play to load vars for managed_node3 15330 1726882280.00529: Calling groups_plugins_inventory to load vars for 
managed_node3 15330 1726882280.00532: Calling groups_plugins_play to load vars for managed_node3 15330 1726882280.01753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882280.03340: done with get_vars() 15330 1726882280.03369: done queuing things up, now waiting for results queue to drain 15330 1726882280.03372: results queue empty 15330 1726882280.03373: checking for any_errors_fatal 15330 1726882280.03374: done checking for any_errors_fatal 15330 1726882280.03374: checking for max_fail_percentage 15330 1726882280.03376: done checking for max_fail_percentage 15330 1726882280.03376: checking to see if all hosts have failed and the running result is not ok 15330 1726882280.03377: done checking to see if all hosts have failed 15330 1726882280.03378: getting the remaining hosts for this loop 15330 1726882280.03378: done getting the remaining hosts for this loop 15330 1726882280.03381: getting the next task for host managed_node3 15330 1726882280.03385: done getting next task for host managed_node3 15330 1726882280.03387: ^ task is: None 15330 1726882280.03388: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882280.03389: done queuing things up, now waiting for results queue to drain 15330 1726882280.03390: results queue empty 15330 1726882280.03391: checking for any_errors_fatal 15330 1726882280.03391: done checking for any_errors_fatal 15330 1726882280.03392: checking for max_fail_percentage 15330 1726882280.03394: done checking for max_fail_percentage 15330 1726882280.03395: checking to see if all hosts have failed and the running result is not ok 15330 1726882280.03400: done checking to see if all hosts have failed 15330 1726882280.03402: getting the next task for host managed_node3 15330 1726882280.03404: done getting next task for host managed_node3 15330 1726882280.03405: ^ task is: None 15330 1726882280.03406: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882280.03462: in VariableManager get_vars() 15330 1726882280.03485: done with get_vars() 15330 1726882280.03491: in VariableManager get_vars() 15330 1726882280.03507: done with get_vars() 15330 1726882280.03512: variable 'omit' from source: magic vars 15330 1726882280.03626: variable 'profile' from source: play vars 15330 1726882280.03733: in VariableManager get_vars() 15330 1726882280.03746: done with get_vars() 15330 1726882280.03766: variable 'omit' from source: magic vars 15330 1726882280.03837: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15330 1726882280.04624: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882280.04651: getting the remaining hosts for this loop 15330 1726882280.04652: done getting the remaining hosts for this loop 15330 1726882280.04655: getting the next task for host managed_node3 15330 1726882280.04657: done getting next task for host managed_node3 15330 1726882280.04659: ^ task is: TASK: Gathering Facts 15330 1726882280.04660: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882280.04662: getting variables 15330 1726882280.04663: in VariableManager get_vars() 15330 1726882280.04675: Calling all_inventory to load vars for managed_node3 15330 1726882280.04677: Calling groups_inventory to load vars for managed_node3 15330 1726882280.04679: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882280.04684: Calling all_plugins_play to load vars for managed_node3 15330 1726882280.04689: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882280.04694: Calling groups_plugins_play to load vars for managed_node3 15330 1726882280.05907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882280.07561: done with get_vars() 15330 1726882280.07580: done getting variables 15330 1726882280.07627: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:31:20 -0400 (0:00:00.538) 0:00:29.282 ****** 15330 1726882280.07652: entering _queue_task() for managed_node3/gather_facts 15330 1726882280.08012: worker is 1 (out of 1 available) 15330 1726882280.08024: exiting _queue_task() for managed_node3/gather_facts 15330 1726882280.08037: done queuing things up, now waiting for results queue to drain 15330 1726882280.08038: waiting for pending results... 
15330 1726882280.08306: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882280.08408: in run() - task 12673a56-9f93-e4fe-1358-0000000003a1 15330 1726882280.08499: variable 'ansible_search_path' from source: unknown 15330 1726882280.08505: calling self._execute() 15330 1726882280.08576: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882280.08592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882280.08610: variable 'omit' from source: magic vars 15330 1726882280.09000: variable 'ansible_distribution_major_version' from source: facts 15330 1726882280.09016: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882280.09028: variable 'omit' from source: magic vars 15330 1726882280.09067: variable 'omit' from source: magic vars 15330 1726882280.09164: variable 'omit' from source: magic vars 15330 1726882280.09168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882280.09203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882280.09230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882280.09254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882280.09276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882280.09316: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882280.09325: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882280.09334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882280.09445: Set connection var ansible_pipelining to False 15330 1726882280.09494: Set 
connection var ansible_timeout to 10 15330 1726882280.09498: Set connection var ansible_connection to ssh 15330 1726882280.09500: Set connection var ansible_shell_type to sh 15330 1726882280.09503: Set connection var ansible_shell_executable to /bin/sh 15330 1726882280.09505: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882280.09525: variable 'ansible_shell_executable' from source: unknown 15330 1726882280.09698: variable 'ansible_connection' from source: unknown 15330 1726882280.09701: variable 'ansible_module_compression' from source: unknown 15330 1726882280.09704: variable 'ansible_shell_type' from source: unknown 15330 1726882280.09706: variable 'ansible_shell_executable' from source: unknown 15330 1726882280.09708: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882280.09710: variable 'ansible_pipelining' from source: unknown 15330 1726882280.09712: variable 'ansible_timeout' from source: unknown 15330 1726882280.09714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882280.09757: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882280.09773: variable 'omit' from source: magic vars 15330 1726882280.09783: starting attempt loop 15330 1726882280.09795: running the handler 15330 1726882280.09816: variable 'ansible_facts' from source: unknown 15330 1726882280.09845: _low_level_execute_command(): starting 15330 1726882280.09857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882280.10714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882280.10738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882280.10823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882280.12514: stdout chunk (state=3): >>>/root <<< 15330 1726882280.12654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882280.12667: stdout chunk (state=3): >>><<< 15330 1726882280.12681: stderr chunk (state=3): >>><<< 15330 1726882280.12719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882280.12738: _low_level_execute_command(): starting 15330 1726882280.12771: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417 `" && echo ansible-tmp-1726882280.1272542-16625-114810037168417="` echo /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417 `" ) && sleep 0' 15330 1726882280.13399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882280.13414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882280.13427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882280.13512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882280.13515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882280.13587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882280.13606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882280.13652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882280.13707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882280.15548: stdout chunk (state=3): >>>ansible-tmp-1726882280.1272542-16625-114810037168417=/root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417 <<< 15330 1726882280.15700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882280.15704: stdout chunk (state=3): >>><<< 15330 1726882280.15715: stderr chunk (state=3): >>><<< 15330 1726882280.15757: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882280.1272542-16625-114810037168417=/root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882280.15764: variable 'ansible_module_compression' from source: unknown 15330 1726882280.15810: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882280.15854: variable 'ansible_facts' from source: unknown 15330 1726882280.15983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py 15330 1726882280.16087: Sending initial data 15330 1726882280.16096: Sent initial data (154 bytes) 15330 1726882280.16526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882280.16530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882280.16542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882280.16601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882280.16604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882280.16652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882280.18191: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882280.18244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882280.18330: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpldho140g /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py <<< 15330 1726882280.18333: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py" <<< 15330 1726882280.18375: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpldho140g" to remote "/root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py" <<< 15330 1726882280.19898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882280.19901: stderr chunk (state=3): >>><<< 15330 1726882280.19905: stdout chunk (state=3): >>><<< 15330 1726882280.19907: done transferring module to remote 15330 1726882280.19909: _low_level_execute_command(): starting 15330 1726882280.19911: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/ /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py && sleep 0' 15330 1726882280.20289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882280.20295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882280.20297: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882280.20300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882280.20302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882280.20344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882280.20356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882280.20411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882280.22112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882280.22142: stderr chunk (state=3): >>><<< 15330 1726882280.22146: stdout chunk (state=3): >>><<< 15330 1726882280.22161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882280.22164: _low_level_execute_command(): starting 15330 1726882280.22167: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/AnsiballZ_setup.py && sleep 0' 15330 1726882280.22790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882280.22808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882280.22826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 
1726882280.22909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882280.85369: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": 
"/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3285, "used": 
246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 587, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 
261803102208, "block_size": 4096, "block_total": 65519099, "block_available": 63916773, "block_used": 1602326, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": 
"off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 1.01904296875, "5m": 0.5087890625, "15m": 0.236328125}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.<<< 15330 1726882280.85436: stdout chunk (state=3): >>>rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "20", "epoch": "1726882280", "epoch_int": "1726882280", "date": "2024-09-20", "time": "21:31:20", "iso8601_micro": "2024-09-21T01:31:20.849716Z", "iso8601": "2024-09-21T01:31:20Z", "iso8601_basic": "20240920T213120849716", "iso8601_basic_short": "20240920T213120", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882280.87458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882280.87462: stdout chunk (state=3): >>><<< 15330 1726882280.87464: stderr chunk (state=3): >>><<< 15330 1726882280.87474: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", 
"uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 587, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803102208, "block_size": 4096, "block_total": 65519099, "block_available": 63916773, "block_used": 1602326, "inode_total": 131070960, "inode_available": 131029133, "inode_used": 41827, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": 
"off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 1.01904296875, "5m": 0.5087890625, "15m": 0.236328125}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", 
"weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "20", "epoch": "1726882280", "epoch_int": "1726882280", "date": "2024-09-20", "time": "21:31:20", "iso8601_micro": "2024-09-21T01:31:20.849716Z", "iso8601": "2024-09-21T01:31:20Z", "iso8601_basic": "20240920T213120849716", "iso8601_basic_short": "20240920T213120", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882280.88319: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882280.88463: _low_level_execute_command(): starting 15330 1726882280.88466: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882280.1272542-16625-114810037168417/ > /dev/null 2>&1 && sleep 0' 15330 1726882280.89596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882280.89600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882280.89831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882280.89834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882280.89836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882280.90414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882280.91730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882280.91756: stderr chunk (state=3): >>><<< 15330 1726882280.91760: stdout chunk (state=3): >>><<< 15330 1726882280.91799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882280.91803: handler run complete 15330 1726882280.91933: variable 'ansible_facts' from source: unknown 15330 1726882280.92222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882280.92898: variable 'ansible_facts' from source: unknown 15330 1726882280.92985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882280.93318: attempt loop complete, returning result 15330 1726882280.93328: _execute() done 15330 1726882280.93358: dumping result to json 15330 1726882280.93395: done dumping result, returning 15330 1726882280.93473: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-0000000003a1] 15330 1726882280.93484: sending task result for task 12673a56-9f93-e4fe-1358-0000000003a1 15330 1726882280.94525: done sending task result for task 12673a56-9f93-e4fe-1358-0000000003a1 15330 1726882280.94529: WORKER PROCESS EXITING ok: [managed_node3] 15330 1726882280.94951: no more pending results, returning what we have 15330 1726882280.94954: results queue empty 15330 1726882280.94955: checking for any_errors_fatal 15330 1726882280.94957: done checking for any_errors_fatal 15330 1726882280.94957: checking for max_fail_percentage 15330 1726882280.94959: done checking for max_fail_percentage 15330 1726882280.94960: checking to see if all hosts have failed and the running result is not ok 15330 1726882280.94960: done checking to see if all hosts have failed 15330 1726882280.94961: getting the remaining hosts for this loop 15330 1726882280.95194: done getting the remaining hosts for this loop 15330 1726882280.95199: getting the next task for host managed_node3 15330 1726882280.95204: done getting next task for host managed_node3 15330 1726882280.95206: ^ 
task is: TASK: meta (flush_handlers) 15330 1726882280.95207: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882280.95211: getting variables 15330 1726882280.95212: in VariableManager get_vars() 15330 1726882280.95239: Calling all_inventory to load vars for managed_node3 15330 1726882280.95241: Calling groups_inventory to load vars for managed_node3 15330 1726882280.95243: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882280.95252: Calling all_plugins_play to load vars for managed_node3 15330 1726882280.95254: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882280.95257: Calling groups_plugins_play to load vars for managed_node3 15330 1726882280.97798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882280.99517: done with get_vars() 15330 1726882280.99542: done getting variables 15330 1726882280.99624: in VariableManager get_vars() 15330 1726882280.99637: Calling all_inventory to load vars for managed_node3 15330 1726882280.99640: Calling groups_inventory to load vars for managed_node3 15330 1726882280.99642: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882280.99647: Calling all_plugins_play to load vars for managed_node3 15330 1726882280.99650: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882280.99653: Calling groups_plugins_play to load vars for managed_node3 15330 1726882281.01014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.04352: done with get_vars() 15330 1726882281.04388: done queuing things up, now waiting for results queue to drain 15330 
1726882281.04391: results queue empty 15330 1726882281.04391: checking for any_errors_fatal 15330 1726882281.04397: done checking for any_errors_fatal 15330 1726882281.04398: checking for max_fail_percentage 15330 1726882281.04399: done checking for max_fail_percentage 15330 1726882281.04400: checking to see if all hosts have failed and the running result is not ok 15330 1726882281.04406: done checking to see if all hosts have failed 15330 1726882281.04406: getting the remaining hosts for this loop 15330 1726882281.04408: done getting the remaining hosts for this loop 15330 1726882281.04498: getting the next task for host managed_node3 15330 1726882281.04503: done getting next task for host managed_node3 15330 1726882281.04507: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15330 1726882281.04509: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882281.04526: getting variables 15330 1726882281.04527: in VariableManager get_vars() 15330 1726882281.04545: Calling all_inventory to load vars for managed_node3 15330 1726882281.04547: Calling groups_inventory to load vars for managed_node3 15330 1726882281.04549: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882281.04555: Calling all_plugins_play to load vars for managed_node3 15330 1726882281.04557: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882281.04560: Calling groups_plugins_play to load vars for managed_node3 15330 1726882281.06990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.10460: done with get_vars() 15330 1726882281.10484: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:31:21 -0400 (0:00:01.030) 0:00:30.312 ****** 15330 1726882281.10677: entering _queue_task() for managed_node3/include_tasks 15330 1726882281.11565: worker is 1 (out of 1 available) 15330 1726882281.11673: exiting _queue_task() for managed_node3/include_tasks 15330 1726882281.11712: done queuing things up, now waiting for results queue to drain 15330 1726882281.11714: waiting for pending results... 
15330 1726882281.12081: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15330 1726882281.12575: in run() - task 12673a56-9f93-e4fe-1358-00000000005a 15330 1726882281.12592: variable 'ansible_search_path' from source: unknown 15330 1726882281.12597: variable 'ansible_search_path' from source: unknown 15330 1726882281.12933: calling self._execute() 15330 1726882281.13034: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.13038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882281.13049: variable 'omit' from source: magic vars 15330 1726882281.13844: variable 'ansible_distribution_major_version' from source: facts 15330 1726882281.13848: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882281.13851: _execute() done 15330 1726882281.13853: dumping result to json 15330 1726882281.13855: done dumping result, returning 15330 1726882281.13858: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-e4fe-1358-00000000005a] 15330 1726882281.13868: sending task result for task 12673a56-9f93-e4fe-1358-00000000005a 15330 1726882281.13943: done sending task result for task 12673a56-9f93-e4fe-1358-00000000005a 15330 1726882281.13949: WORKER PROCESS EXITING 15330 1726882281.14123: no more pending results, returning what we have 15330 1726882281.14128: in VariableManager get_vars() 15330 1726882281.14168: Calling all_inventory to load vars for managed_node3 15330 1726882281.14171: Calling groups_inventory to load vars for managed_node3 15330 1726882281.14173: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882281.14185: Calling all_plugins_play to load vars for managed_node3 15330 1726882281.14398: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882281.14404: Calling 
groups_plugins_play to load vars for managed_node3 15330 1726882281.17245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.21329: done with get_vars() 15330 1726882281.21358: variable 'ansible_search_path' from source: unknown 15330 1726882281.21360: variable 'ansible_search_path' from source: unknown 15330 1726882281.21391: we have included files to process 15330 1726882281.21394: generating all_blocks data 15330 1726882281.21396: done generating all_blocks data 15330 1726882281.21397: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882281.21398: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882281.21401: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15330 1726882281.22575: done processing included file 15330 1726882281.22576: iterating over new_blocks loaded from include file 15330 1726882281.22578: in VariableManager get_vars() 15330 1726882281.22712: done with get_vars() 15330 1726882281.22714: filtering new block on tags 15330 1726882281.22729: done filtering new block on tags 15330 1726882281.22732: in VariableManager get_vars() 15330 1726882281.22749: done with get_vars() 15330 1726882281.22750: filtering new block on tags 15330 1726882281.22766: done filtering new block on tags 15330 1726882281.22768: in VariableManager get_vars() 15330 1726882281.22785: done with get_vars() 15330 1726882281.22787: filtering new block on tags 15330 1726882281.22904: done filtering new block on tags 15330 1726882281.22907: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 15330 1726882281.22912: extending task lists for 
all hosts with included blocks 15330 1726882281.23728: done extending task lists 15330 1726882281.23729: done processing included files 15330 1726882281.23730: results queue empty 15330 1726882281.23731: checking for any_errors_fatal 15330 1726882281.23732: done checking for any_errors_fatal 15330 1726882281.23733: checking for max_fail_percentage 15330 1726882281.23734: done checking for max_fail_percentage 15330 1726882281.23735: checking to see if all hosts have failed and the running result is not ok 15330 1726882281.23735: done checking to see if all hosts have failed 15330 1726882281.23736: getting the remaining hosts for this loop 15330 1726882281.23737: done getting the remaining hosts for this loop 15330 1726882281.23740: getting the next task for host managed_node3 15330 1726882281.23744: done getting next task for host managed_node3 15330 1726882281.23747: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15330 1726882281.23749: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882281.23758: getting variables 15330 1726882281.23759: in VariableManager get_vars() 15330 1726882281.23774: Calling all_inventory to load vars for managed_node3 15330 1726882281.23776: Calling groups_inventory to load vars for managed_node3 15330 1726882281.23778: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882281.23783: Calling all_plugins_play to load vars for managed_node3 15330 1726882281.23898: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882281.23905: Calling groups_plugins_play to load vars for managed_node3 15330 1726882281.26511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.28244: done with get_vars() 15330 1726882281.28281: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:31:21 -0400 (0:00:00.176) 0:00:30.489 ****** 15330 1726882281.28364: entering _queue_task() for managed_node3/setup 15330 1726882281.28771: worker is 1 (out of 1 available) 15330 1726882281.28783: exiting _queue_task() for managed_node3/setup 15330 1726882281.29097: done queuing things up, now waiting for results queue to drain 15330 1726882281.29099: waiting for pending results... 
15330 1726882281.29509: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15330 1726882281.29551: in run() - task 12673a56-9f93-e4fe-1358-0000000003e2 15330 1726882281.29797: variable 'ansible_search_path' from source: unknown 15330 1726882281.29801: variable 'ansible_search_path' from source: unknown 15330 1726882281.29803: calling self._execute() 15330 1726882281.29936: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.30000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882281.30009: variable 'omit' from source: magic vars 15330 1726882281.30700: variable 'ansible_distribution_major_version' from source: facts 15330 1726882281.30718: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882281.31200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882281.35661: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882281.35986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882281.35989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882281.35992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882281.36009: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882281.36299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882281.36305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882281.36308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882281.36342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882281.36363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882281.36426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882281.36456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882281.36481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882281.36524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882281.36555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882281.36763: variable '__network_required_facts' from source: role 
'' defaults 15330 1726882281.36776: variable 'ansible_facts' from source: unknown 15330 1726882281.37567: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15330 1726882281.37575: when evaluation is False, skipping this task 15330 1726882281.37582: _execute() done 15330 1726882281.37589: dumping result to json 15330 1726882281.37599: done dumping result, returning 15330 1726882281.37610: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-e4fe-1358-0000000003e2] 15330 1726882281.37629: sending task result for task 12673a56-9f93-e4fe-1358-0000000003e2 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882281.37896: no more pending results, returning what we have 15330 1726882281.37901: results queue empty 15330 1726882281.37902: checking for any_errors_fatal 15330 1726882281.37904: done checking for any_errors_fatal 15330 1726882281.37905: checking for max_fail_percentage 15330 1726882281.37907: done checking for max_fail_percentage 15330 1726882281.37908: checking to see if all hosts have failed and the running result is not ok 15330 1726882281.37909: done checking to see if all hosts have failed 15330 1726882281.37909: getting the remaining hosts for this loop 15330 1726882281.37911: done getting the remaining hosts for this loop 15330 1726882281.37915: getting the next task for host managed_node3 15330 1726882281.37923: done getting next task for host managed_node3 15330 1726882281.37927: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15330 1726882281.37930: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882281.37943: getting variables 15330 1726882281.37945: in VariableManager get_vars() 15330 1726882281.38065: Calling all_inventory to load vars for managed_node3 15330 1726882281.38068: Calling groups_inventory to load vars for managed_node3 15330 1726882281.38071: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882281.38077: done sending task result for task 12673a56-9f93-e4fe-1358-0000000003e2 15330 1726882281.38080: WORKER PROCESS EXITING 15330 1726882281.38090: Calling all_plugins_play to load vars for managed_node3 15330 1726882281.38095: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882281.38098: Calling groups_plugins_play to load vars for managed_node3 15330 1726882281.40515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.45765: done with get_vars() 15330 1726882281.45957: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:31:21 -0400 (0:00:00.176) 0:00:30.666 ****** 15330 1726882281.46283: entering _queue_task() for managed_node3/stat 15330 1726882281.47207: worker is 1 (out of 1 available) 15330 1726882281.47222: exiting _queue_task() for managed_node3/stat 15330 1726882281.47235: done queuing things up, now waiting for results queue to drain 15330 1726882281.47236: waiting for pending results... 
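The task skipped just above ("Ensure ansible_facts used by role are present") is a guard that would re-gather facts only when some fact the role needs is missing; the log shows its conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluating to False, and its result censored by `no_log`. A plausible sketch of such a guard task, reconstructed from what the log reports (the task body itself is an assumption, not the role's verbatim source):

```yaml
# Sketch reconstructed from the log: the task name, the 'when'
# expression, and 'no_log: true' all appear in the output above.
# The module and its arguments are assumptions.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  # Gather again only if any fact the role requires is still missing.
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true  # the log shows the skipped result hidden due to no_log
```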
15330 1726882281.47816: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 15330 1726882281.47982: in run() - task 12673a56-9f93-e4fe-1358-0000000003e4 15330 1726882281.48201: variable 'ansible_search_path' from source: unknown 15330 1726882281.48204: variable 'ansible_search_path' from source: unknown 15330 1726882281.48207: calling self._execute() 15330 1726882281.48319: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.48323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882281.48347: variable 'omit' from source: magic vars 15330 1726882281.49364: variable 'ansible_distribution_major_version' from source: facts 15330 1726882281.49367: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882281.49801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882281.51272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882281.51627: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882281.51663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882281.52078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882281.52174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882281.52206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882281.52234: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882281.52259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882281.52763: variable '__network_is_ostree' from source: set_fact 15330 1726882281.52799: Evaluated conditional (not __network_is_ostree is defined): False 15330 1726882281.52809: when evaluation is False, skipping this task 15330 1726882281.52812: _execute() done 15330 1726882281.52816: dumping result to json 15330 1726882281.52818: done dumping result, returning 15330 1726882281.52821: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-e4fe-1358-0000000003e4] 15330 1726882281.52823: sending task result for task 12673a56-9f93-e4fe-1358-0000000003e4 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15330 1726882281.53183: no more pending results, returning what we have 15330 1726882281.53187: results queue empty 15330 1726882281.53188: checking for any_errors_fatal 15330 1726882281.53199: done checking for any_errors_fatal 15330 1726882281.53200: checking for max_fail_percentage 15330 1726882281.53202: done checking for max_fail_percentage 15330 1726882281.53203: checking to see if all hosts have failed and the running result is not ok 15330 1726882281.53204: done checking to see if all hosts have failed 15330 1726882281.53204: getting the remaining hosts for this loop 15330 1726882281.53206: done getting the remaining hosts for this loop 15330 1726882281.53210: getting the next task for host managed_node3 15330 1726882281.53218: done getting next task for host managed_node3 15330 
1726882281.53222: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15330 1726882281.53225: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882281.53237: getting variables 15330 1726882281.53240: in VariableManager get_vars() 15330 1726882281.53279: Calling all_inventory to load vars for managed_node3 15330 1726882281.53282: Calling groups_inventory to load vars for managed_node3 15330 1726882281.53284: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882281.53500: Calling all_plugins_play to load vars for managed_node3 15330 1726882281.53505: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882281.53510: Calling groups_plugins_play to load vars for managed_node3 15330 1726882281.54302: done sending task result for task 12673a56-9f93-e4fe-1358-0000000003e4 15330 1726882281.54306: WORKER PROCESS EXITING 15330 1726882281.58272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.61485: done with get_vars() 15330 1726882281.61520: done getting variables 15330 1726882281.61580: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:31:21 -0400 (0:00:00.155) 0:00:30.822 ****** 15330 1726882281.61621: entering _queue_task() for managed_node3/set_fact 15330 1726882281.61977: worker is 1 (out of 1 available) 15330 1726882281.61989: exiting _queue_task() for managed_node3/set_fact 15330 1726882281.62148: done queuing things up, now waiting for results queue to drain 15330 1726882281.62150: waiting for pending results... 15330 1726882281.62297: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15330 1726882281.62446: in run() - task 12673a56-9f93-e4fe-1358-0000000003e5 15330 1726882281.62471: variable 'ansible_search_path' from source: unknown 15330 1726882281.62484: variable 'ansible_search_path' from source: unknown 15330 1726882281.62526: calling self._execute() 15330 1726882281.62628: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.62640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882281.62654: variable 'omit' from source: magic vars 15330 1726882281.63074: variable 'ansible_distribution_major_version' from source: facts 15330 1726882281.63399: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882281.63530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882281.64264: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882281.64354: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882281.64554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 
1726882281.64558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882281.64650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882281.64800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882281.64830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882281.64858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882281.64988: variable '__network_is_ostree' from source: set_fact 15330 1726882281.65108: Evaluated conditional (not __network_is_ostree is defined): False 15330 1726882281.65116: when evaluation is False, skipping this task 15330 1726882281.65124: _execute() done 15330 1726882281.65131: dumping result to json 15330 1726882281.65139: done dumping result, returning 15330 1726882281.65151: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-e4fe-1358-0000000003e5] 15330 1726882281.65162: sending task result for task 12673a56-9f93-e4fe-1358-0000000003e5 15330 1726882281.65500: done sending task result for task 12673a56-9f93-e4fe-1358-0000000003e5 15330 1726882281.65503: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15330 1726882281.65556: no more pending results, returning what we 
have 15330 1726882281.65561: results queue empty 15330 1726882281.65562: checking for any_errors_fatal 15330 1726882281.65570: done checking for any_errors_fatal 15330 1726882281.65570: checking for max_fail_percentage 15330 1726882281.65572: done checking for max_fail_percentage 15330 1726882281.65574: checking to see if all hosts have failed and the running result is not ok 15330 1726882281.65575: done checking to see if all hosts have failed 15330 1726882281.65575: getting the remaining hosts for this loop 15330 1726882281.65577: done getting the remaining hosts for this loop 15330 1726882281.65581: getting the next task for host managed_node3 15330 1726882281.65591: done getting next task for host managed_node3 15330 1726882281.65597: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15330 1726882281.65600: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882281.65615: getting variables 15330 1726882281.65617: in VariableManager get_vars() 15330 1726882281.65659: Calling all_inventory to load vars for managed_node3 15330 1726882281.65662: Calling groups_inventory to load vars for managed_node3 15330 1726882281.65665: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882281.65677: Calling all_plugins_play to load vars for managed_node3 15330 1726882281.65681: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882281.65684: Calling groups_plugins_play to load vars for managed_node3 15330 1726882281.69158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882281.71846: done with get_vars() 15330 1726882281.71878: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:31:21 -0400 (0:00:00.103) 0:00:30.925 ****** 15330 1726882281.71974: entering _queue_task() for managed_node3/service_facts 15330 1726882281.72703: worker is 1 (out of 1 available) 15330 1726882281.72715: exiting _queue_task() for managed_node3/service_facts 15330 1726882281.72726: done queuing things up, now waiting for results queue to drain 15330 1726882281.73113: waiting for pending results... 
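The three tasks traced in this stretch of the log all come from `set_facts.yml` in the `fedora.linux_system_roles.network` role (task paths `set_facts.yml:12`, `:17`, and `:21`). A plausible reconstruction of that sequence, using the task names, module types (`stat`, `set_fact`, `service_facts`), and conditions visible in the log (not the role's verbatim source):

```yaml
# Reconstructed from the log output; module arguments such as the
# stat path are assumptions the log does not show.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted  # assumed path for detecting ostree systems
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

- name: Check which services are running
  ansible.builtin.service_facts:
```

In this run the first two tasks skip because `__network_is_ostree` was already set earlier (the log shows `variable '__network_is_ostree' from source: set_fact`, so `not __network_is_ostree is defined` is False), while `service_facts` proceeds to execute on the remote host over SSH, as the `_low_level_execute_command()` and module-transfer chunks that follow show.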
15330 1726882281.73512: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 15330 1726882281.73800: in run() - task 12673a56-9f93-e4fe-1358-0000000003e7 15330 1726882281.73804: variable 'ansible_search_path' from source: unknown 15330 1726882281.73807: variable 'ansible_search_path' from source: unknown 15330 1726882281.73860: calling self._execute() 15330 1726882281.74076: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.74089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882281.74283: variable 'omit' from source: magic vars 15330 1726882281.74936: variable 'ansible_distribution_major_version' from source: facts 15330 1726882281.74955: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882281.74967: variable 'omit' from source: magic vars 15330 1726882281.75080: variable 'omit' from source: magic vars 15330 1726882281.75123: variable 'omit' from source: magic vars 15330 1726882281.75240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882281.75371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882281.75398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882281.75420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882281.75494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882281.75527: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882281.75534: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.75697: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 15330 1726882281.75700: Set connection var ansible_pipelining to False 15330 1726882281.75808: Set connection var ansible_timeout to 10 15330 1726882281.75815: Set connection var ansible_connection to ssh 15330 1726882281.75820: Set connection var ansible_shell_type to sh 15330 1726882281.75829: Set connection var ansible_shell_executable to /bin/sh 15330 1726882281.75836: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882281.75863: variable 'ansible_shell_executable' from source: unknown 15330 1726882281.75904: variable 'ansible_connection' from source: unknown 15330 1726882281.76021: variable 'ansible_module_compression' from source: unknown 15330 1726882281.76024: variable 'ansible_shell_type' from source: unknown 15330 1726882281.76026: variable 'ansible_shell_executable' from source: unknown 15330 1726882281.76028: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882281.76031: variable 'ansible_pipelining' from source: unknown 15330 1726882281.76032: variable 'ansible_timeout' from source: unknown 15330 1726882281.76034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882281.76566: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882281.76571: variable 'omit' from source: magic vars 15330 1726882281.76573: starting attempt loop 15330 1726882281.76576: running the handler 15330 1726882281.76578: _low_level_execute_command(): starting 15330 1726882281.76580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882281.78039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882281.78058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15330 1726882281.78109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882281.78255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882281.78424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882281.78464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882281.78516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882281.80184: stdout chunk (state=3): >>>/root <<< 15330 1726882281.80316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882281.80333: stdout chunk (state=3): >>><<< 15330 1726882281.80346: stderr chunk (state=3): >>><<< 15330 1726882281.80374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882281.80589: _low_level_execute_command(): starting 15330 1726882281.80596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570 `" && echo ansible-tmp-1726882281.805015-16698-26901509354570="` echo /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570 `" ) && sleep 0' 15330 1726882281.81809: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882281.81949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882281.81966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882281.81988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882281.82126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882281.84197: stdout chunk (state=3): >>>ansible-tmp-1726882281.805015-16698-26901509354570=/root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570 <<< 15330 1726882281.84259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882281.84262: stdout chunk (state=3): >>><<< 15330 1726882281.84265: stderr chunk (state=3): >>><<< 15330 1726882281.84281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882281.805015-16698-26901509354570=/root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882281.84334: variable 'ansible_module_compression' from source: unknown 15330 1726882281.84446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15330 1726882281.84799: variable 'ansible_facts' from source: unknown 15330 1726882281.84803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py 15330 1726882281.85095: Sending initial data 15330 1726882281.85107: Sent initial data (160 bytes) 15330 1726882281.85861: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882281.85876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882281.85895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882281.85920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882281.85938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882281.85950: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882281.85965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882281.85984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882281.86009: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882281.86098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882281.86113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882281.86304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882281.87818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882281.87862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882281.87916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdxj48o4u /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py <<< 15330 1726882281.87927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py" <<< 15330 1726882281.87964: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdxj48o4u" to remote "/root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py" <<< 15330 1726882281.89552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882281.89561: stdout chunk (state=3): >>><<< 15330 1726882281.89574: stderr chunk (state=3): >>><<< 15330 1726882281.89719: done transferring module to remote 15330 1726882281.89800: _low_level_execute_command(): starting 15330 1726882281.89804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/ /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py && sleep 0' 15330 1726882281.90595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882281.90612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882281.90708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882281.90720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882281.90743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882281.90763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882281.90821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882281.92625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882281.92665: stdout chunk (state=3): >>><<< 15330 1726882281.92790: stderr chunk (state=3): >>><<< 15330 1726882281.92795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882281.92798: _low_level_execute_command(): starting 15330 1726882281.92801: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/AnsiballZ_service_facts.py && sleep 0' 15330 1726882281.94095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882281.94181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882281.94202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882281.94273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882281.94307: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882281.94333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882281.94427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882283.43703: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 15330 1726882283.43738: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15330 1726882283.45473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882283.45477: stdout chunk (state=3): >>><<< 15330 1726882283.45479: stderr chunk (state=3): >>><<< 15330 1726882283.45700: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": 
"dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": 
{"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
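The assembled stdout above is the JSON return of the `service_facts` module: a `services` mapping keyed by unit name, each entry carrying `name`, `state`, `status`, and `source`. As a minimal sketch of how that payload can be filtered once captured (the `sample` excerpt below is a hypothetical three-entry subset of the real payload, which contains a few hundred units):

```python
import json

# Hypothetical excerpt mirroring the service_facts structure in the log above.
sample = json.loads("""
{"ansible_facts": {"services": {
  "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"},
  "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}
}}}
""")

def running_services(facts):
    """Return the sorted names of services whose state is 'running'."""
    services = facts["ansible_facts"]["services"]
    return sorted(name for name, svc in services.items() if svc["state"] == "running")

print(running_services(sample))  # ['auditd.service']
```

In a playbook the same filtering is typically done with a Jinja2 `selectattr` over `ansible_facts.services`; the Python form is shown here only to make the payload shape concrete.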
15330 1726882283.57211: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882283.57228: _low_level_execute_command(): starting 15330 1726882283.57238: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882281.805015-16698-26901509354570/ > /dev/null 2>&1 && sleep 0' 15330 1726882283.58466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882283.58804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882283.58825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882283.58892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882283.60804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882283.60820: stdout chunk (state=3): >>><<< 15330 1726882283.60833: stderr chunk (state=3): >>><<< 15330 1726882283.60852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882283.60863: handler run complete 15330 1726882283.61197: variable 'ansible_facts' from source: unknown 15330 1726882283.61538: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882283.62517: variable 'ansible_facts' from source: unknown 15330 1726882283.63098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882283.63598: attempt loop complete, returning result 15330 1726882283.63602: _execute() done 15330 1726882283.63604: dumping result to json 15330 1726882283.63606: done dumping result, returning 15330 1726882283.63608: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-e4fe-1358-0000000003e7] 15330 1726882283.63610: sending task result for task 12673a56-9f93-e4fe-1358-0000000003e7 15330 1726882283.74103: done sending task result for task 12673a56-9f93-e4fe-1358-0000000003e7 15330 1726882283.74108: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882283.74150: no more pending results, returning what we have 15330 1726882283.74153: results queue empty 15330 1726882283.74153: checking for any_errors_fatal 15330 1726882283.74156: done checking for any_errors_fatal 15330 1726882283.74157: checking for max_fail_percentage 15330 1726882283.74158: done checking for max_fail_percentage 15330 1726882283.74159: checking to see if all hosts have failed and the running result is not ok 15330 1726882283.74160: done checking to see if all hosts have failed 15330 1726882283.74160: getting the remaining hosts for this loop 15330 1726882283.74162: done getting the remaining hosts for this loop 15330 1726882283.74164: getting the next task for host managed_node3 15330 1726882283.74168: done getting next task for host managed_node3 15330 1726882283.74171: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15330 
1726882283.74173: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882283.74181: getting variables 15330 1726882283.74182: in VariableManager get_vars() 15330 1726882283.74210: Calling all_inventory to load vars for managed_node3 15330 1726882283.74213: Calling groups_inventory to load vars for managed_node3 15330 1726882283.74215: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882283.74224: Calling all_plugins_play to load vars for managed_node3 15330 1726882283.74226: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882283.74229: Calling groups_plugins_play to load vars for managed_node3 15330 1726882283.78079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882283.83097: done with get_vars() 15330 1726882283.83129: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:31:23 -0400 (0:00:02.113) 0:00:33.039 ****** 15330 1726882283.83332: entering _queue_task() for managed_node3/package_facts 15330 1726882283.84001: worker is 1 (out of 1 available) 15330 1726882283.84016: exiting _queue_task() for managed_node3/package_facts 15330 1726882283.84029: done queuing things up, now waiting for results queue to drain 15330 
1726882283.84030: waiting for pending results... 15330 1726882283.84810: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 15330 1726882283.84906: in run() - task 12673a56-9f93-e4fe-1358-0000000003e8 15330 1726882283.85099: variable 'ansible_search_path' from source: unknown 15330 1726882283.85103: variable 'ansible_search_path' from source: unknown 15330 1726882283.85106: calling self._execute() 15330 1726882283.85228: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882283.85232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882283.85242: variable 'omit' from source: magic vars 15330 1726882283.86101: variable 'ansible_distribution_major_version' from source: facts 15330 1726882283.86112: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882283.86119: variable 'omit' from source: magic vars 15330 1726882283.86239: variable 'omit' from source: magic vars 15330 1726882283.86598: variable 'omit' from source: magic vars 15330 1726882283.86602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882283.86606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882283.86609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882283.86624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882283.86637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882283.86666: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882283.86670: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882283.86673: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882283.86887: Set connection var ansible_pipelining to False 15330 1726882283.86904: Set connection var ansible_timeout to 10 15330 1726882283.86908: Set connection var ansible_connection to ssh 15330 1726882283.86910: Set connection var ansible_shell_type to sh 15330 1726882283.87005: Set connection var ansible_shell_executable to /bin/sh 15330 1726882283.87011: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882283.87039: variable 'ansible_shell_executable' from source: unknown 15330 1726882283.87043: variable 'ansible_connection' from source: unknown 15330 1726882283.87046: variable 'ansible_module_compression' from source: unknown 15330 1726882283.87048: variable 'ansible_shell_type' from source: unknown 15330 1726882283.87050: variable 'ansible_shell_executable' from source: unknown 15330 1726882283.87053: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882283.87059: variable 'ansible_pipelining' from source: unknown 15330 1726882283.87061: variable 'ansible_timeout' from source: unknown 15330 1726882283.87065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882283.87698: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882283.87703: variable 'omit' from source: magic vars 15330 1726882283.87705: starting attempt loop 15330 1726882283.87708: running the handler 15330 1726882283.87710: _low_level_execute_command(): starting 15330 1726882283.87712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882283.89072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882283.89084: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882283.89101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882283.89118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882283.89401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882283.89515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882283.89587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882283.91236: stdout chunk (state=3): >>>/root <<< 15330 1726882283.91365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882283.91369: stdout chunk (state=3): >>><<< 15330 1726882283.91499: stderr chunk (state=3): >>><<< 15330 1726882283.91503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882283.91506: _low_level_execute_command(): starting 15330 1726882283.91508: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651 `" && echo ansible-tmp-1726882283.9140997-16809-13295510263651="` echo /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651 `" ) && sleep 0' 15330 1726882283.92748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882283.92808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882283.92820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882283.92834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882283.92997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882283.93117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882283.93196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882283.95105: stdout chunk (state=3): >>>ansible-tmp-1726882283.9140997-16809-13295510263651=/root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651 <<< 15330 1726882283.95207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882283.95211: stdout chunk (state=3): >>><<< 15330 1726882283.95217: stderr chunk (state=3): >>><<< 15330 1726882283.95320: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882283.9140997-16809-13295510263651=/root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882283.95368: variable 'ansible_module_compression' from source: unknown 15330 1726882283.95698: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15330 1726882283.95702: variable 'ansible_facts' from source: unknown 15330 1726882283.95996: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py 15330 1726882283.96418: Sending initial data 15330 1726882283.96422: Sent initial data (161 bytes) 15330 1726882283.97681: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882283.97800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882283.97804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882283.97806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882283.97809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882283.97861: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882283.97935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882283.98062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882283.98101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882283.99806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882283.99837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882283.99885: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp7u_wpm02 /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py <<< 15330 1726882283.99889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py" <<< 15330 1726882284.00017: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp7u_wpm02" to remote "/root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py" <<< 15330 1726882284.03063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882284.03097: stderr chunk (state=3): >>><<< 15330 1726882284.03101: stdout chunk (state=3): >>><<< 15330 1726882284.03122: done transferring module to remote 15330 1726882284.03133: _low_level_execute_command(): starting 15330 1726882284.03136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/ /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py && sleep 0' 15330 1726882284.04369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882284.04391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882284.04408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882284.04426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882284.04442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 <<< 15330 1726882284.04450: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882284.04629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882284.04633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882284.04635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882284.04639: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882284.04641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882284.04654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882284.04687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882284.06591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882284.06919: stdout chunk (state=3): >>><<< 15330 1726882284.06923: stderr chunk (state=3): >>><<< 15330 1726882284.06926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882284.06929: _low_level_execute_command(): starting 15330 1726882284.06932: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/AnsiballZ_package_facts.py && sleep 0' 15330 1726882284.08514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882284.08560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882284.08638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882284.08764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882284.08820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882284.08930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882284.08986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882284.52194: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": 
[{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 15330 1726882284.52377: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": 
"2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": 
"4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", 
"version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", 
"version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 15330 1726882284.52479: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": 
"9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", 
"version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": 
"perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": 
"perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": 
"4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": 
"0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": 
[{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15330 1726882284.54153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
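The module result above is the `ansible_facts.packages` mapping produced by Ansible's `package_facts` module: a dict keyed by package name, where each value is a *list* of entries (multiple installed versions, e.g. kernels, are possible), each with `name`, `version`, `release`, `epoch`, `arch`, and `source`. A minimal sketch of how a consumer might read such a mapping — the sample values are copied from the log, and the `evr` helper is illustrative, not part of Ansible:

```python
import json

# Sample in the same shape as ansible_facts.packages from the log above:
# {pkg_name: [{"name", "version", "release", "epoch", "arch", "source"}, ...]}
sample = json.loads("""
{"openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
              "epoch": 1, "arch": "x86_64", "source": "rpm"}],
 "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10",
                "epoch": null, "arch": "noarch", "source": "rpm"}]}
""")

def evr(pkg):
    """Render epoch:version-release rpm-style; epoch is omitted when null/0."""
    base = f'{pkg["version"]}-{pkg["release"]}'
    epoch = pkg.get("epoch")
    return f"{epoch}:{base}" if epoch else base

# Take the first entry per name (sufficient when only one version is installed).
installed = {name: evr(entries[0]) for name, entries in sample.items()}
print(installed["openssl"])    # -> 1:3.2.2-12.el10
print(installed["firewalld"])  # -> 2.2.1-1.el10
```

In a playbook the same lookup would be a Jinja2 expression such as `ansible_facts.packages['openssl'][0].version` after a `package_facts:` task.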
<<< 15330 1726882284.54157: stdout chunk (state=3): >>><<< 15330 1726882284.54160: stderr chunk (state=3): >>><<< 15330 1726882284.54412: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
15330 1726882284.57608: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882284.57639: _low_level_execute_command(): starting 15330 1726882284.57642: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882283.9140997-16809-13295510263651/ > /dev/null 2>&1 && sleep 0' 15330 1726882284.58224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882284.58240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882284.58243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882284.58299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882284.58304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882284.58307: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882284.58310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882284.58312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882284.58315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address <<< 15330 1726882284.58317: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882284.58319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882284.58327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882284.58337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882284.58402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882284.58406: stderr chunk (state=3): >>>debug2: match found <<< 15330 1726882284.58408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882284.58436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882284.58448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882284.58462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882284.58555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882284.60389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882284.60419: stderr chunk (state=3): >>><<< 15330 1726882284.60429: stdout chunk (state=3): >>><<< 15330 1726882284.60499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882284.60503: handler run complete 15330 1726882284.61325: variable 'ansible_facts' from source: unknown 15330 1726882284.61826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882284.65506: variable 'ansible_facts' from source: unknown 15330 1726882284.66282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882284.66905: attempt loop complete, returning result 15330 1726882284.66924: _execute() done 15330 1726882284.66931: dumping result to json 15330 1726882284.67138: done dumping result, returning 15330 1726882284.67152: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-e4fe-1358-0000000003e8] 15330 1726882284.67162: sending task result for task 12673a56-9f93-e4fe-1358-0000000003e8 15330 1726882284.70650: done sending task result for task 12673a56-9f93-e4fe-1358-0000000003e8 15330 1726882284.70654: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882284.70761: no more pending results, returning what we 
have 15330 1726882284.70770: results queue empty 15330 1726882284.70771: checking for any_errors_fatal 15330 1726882284.70776: done checking for any_errors_fatal 15330 1726882284.70777: checking for max_fail_percentage 15330 1726882284.70778: done checking for max_fail_percentage 15330 1726882284.70779: checking to see if all hosts have failed and the running result is not ok 15330 1726882284.70780: done checking to see if all hosts have failed 15330 1726882284.70781: getting the remaining hosts for this loop 15330 1726882284.70782: done getting the remaining hosts for this loop 15330 1726882284.70788: getting the next task for host managed_node3 15330 1726882284.70796: done getting next task for host managed_node3 15330 1726882284.70800: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15330 1726882284.70802: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882284.70812: getting variables 15330 1726882284.70814: in VariableManager get_vars() 15330 1726882284.70847: Calling all_inventory to load vars for managed_node3 15330 1726882284.70850: Calling groups_inventory to load vars for managed_node3 15330 1726882284.70852: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882284.70861: Calling all_plugins_play to load vars for managed_node3 15330 1726882284.70864: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882284.70867: Calling groups_plugins_play to load vars for managed_node3 15330 1726882284.72555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882284.74144: done with get_vars() 15330 1726882284.74167: done getting variables 15330 1726882284.74226: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:24 -0400 (0:00:00.909) 0:00:33.948 ****** 15330 1726882284.74268: entering _queue_task() for managed_node3/debug 15330 1726882284.74769: worker is 1 (out of 1 available) 15330 1726882284.74780: exiting _queue_task() for managed_node3/debug 15330 1726882284.74792: done queuing things up, now waiting for results queue to drain 15330 1726882284.74996: waiting for pending results... 
15330 1726882284.75067: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 15330 1726882284.75178: in run() - task 12673a56-9f93-e4fe-1358-00000000005b 15330 1726882284.75199: variable 'ansible_search_path' from source: unknown 15330 1726882284.75206: variable 'ansible_search_path' from source: unknown 15330 1726882284.75250: calling self._execute() 15330 1726882284.75356: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882284.75367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882284.75381: variable 'omit' from source: magic vars 15330 1726882284.75850: variable 'ansible_distribution_major_version' from source: facts 15330 1726882284.75870: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882284.75882: variable 'omit' from source: magic vars 15330 1726882284.75922: variable 'omit' from source: magic vars 15330 1726882284.76088: variable 'network_provider' from source: set_fact 15330 1726882284.76091: variable 'omit' from source: magic vars 15330 1726882284.76096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882284.76133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882284.76163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882284.76183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882284.76204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882284.76235: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882284.76244: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 
1726882284.76251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882284.76352: Set connection var ansible_pipelining to False 15330 1726882284.76369: Set connection var ansible_timeout to 10 15330 1726882284.76376: Set connection var ansible_connection to ssh 15330 1726882284.76382: Set connection var ansible_shell_type to sh 15330 1726882284.76391: Set connection var ansible_shell_executable to /bin/sh 15330 1726882284.76401: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882284.76431: variable 'ansible_shell_executable' from source: unknown 15330 1726882284.76498: variable 'ansible_connection' from source: unknown 15330 1726882284.76501: variable 'ansible_module_compression' from source: unknown 15330 1726882284.76503: variable 'ansible_shell_type' from source: unknown 15330 1726882284.76505: variable 'ansible_shell_executable' from source: unknown 15330 1726882284.76507: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882284.76509: variable 'ansible_pipelining' from source: unknown 15330 1726882284.76511: variable 'ansible_timeout' from source: unknown 15330 1726882284.76513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882284.76608: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882284.76628: variable 'omit' from source: magic vars 15330 1726882284.76643: starting attempt loop 15330 1726882284.76650: running the handler 15330 1726882284.76700: handler run complete 15330 1726882284.76718: attempt loop complete, returning result 15330 1726882284.76725: _execute() done 15330 1726882284.76731: dumping result to json 15330 1726882284.76742: done dumping result, returning 
15330 1726882284.76852: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-e4fe-1358-00000000005b] 15330 1726882284.76855: sending task result for task 12673a56-9f93-e4fe-1358-00000000005b 15330 1726882284.76920: done sending task result for task 12673a56-9f93-e4fe-1358-00000000005b 15330 1726882284.76923: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 15330 1726882284.77012: no more pending results, returning what we have 15330 1726882284.77015: results queue empty 15330 1726882284.77016: checking for any_errors_fatal 15330 1726882284.77027: done checking for any_errors_fatal 15330 1726882284.77028: checking for max_fail_percentage 15330 1726882284.77029: done checking for max_fail_percentage 15330 1726882284.77030: checking to see if all hosts have failed and the running result is not ok 15330 1726882284.77031: done checking to see if all hosts have failed 15330 1726882284.77032: getting the remaining hosts for this loop 15330 1726882284.77033: done getting the remaining hosts for this loop 15330 1726882284.77037: getting the next task for host managed_node3 15330 1726882284.77043: done getting next task for host managed_node3 15330 1726882284.77047: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15330 1726882284.77049: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882284.77058: getting variables 15330 1726882284.77060: in VariableManager get_vars() 15330 1726882284.77099: Calling all_inventory to load vars for managed_node3 15330 1726882284.77102: Calling groups_inventory to load vars for managed_node3 15330 1726882284.77104: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882284.77115: Calling all_plugins_play to load vars for managed_node3 15330 1726882284.77118: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882284.77120: Calling groups_plugins_play to load vars for managed_node3 15330 1726882284.78687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882284.80186: done with get_vars() 15330 1726882284.80213: done getting variables 15330 1726882284.80275: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:31:24 -0400 (0:00:00.060) 0:00:34.009 ****** 15330 1726882284.80311: entering _queue_task() for managed_node3/fail 15330 1726882284.80645: worker is 1 (out of 1 available) 15330 1726882284.80657: exiting _queue_task() for managed_node3/fail 15330 1726882284.80670: done queuing things up, now waiting for results queue to drain 15330 1726882284.80671: waiting for pending results... 
15330 1726882284.80957: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15330 1726882284.81101: in run() - task 12673a56-9f93-e4fe-1358-00000000005c 15330 1726882284.81105: variable 'ansible_search_path' from source: unknown 15330 1726882284.81108: variable 'ansible_search_path' from source: unknown 15330 1726882284.81132: calling self._execute() 15330 1726882284.81233: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882284.81244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882284.81256: variable 'omit' from source: magic vars 15330 1726882284.81633: variable 'ansible_distribution_major_version' from source: facts 15330 1726882284.81656: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882284.81871: variable 'network_state' from source: role '' defaults 15330 1726882284.81875: Evaluated conditional (network_state != {}): False 15330 1726882284.81878: when evaluation is False, skipping this task 15330 1726882284.81880: _execute() done 15330 1726882284.81883: dumping result to json 15330 1726882284.81885: done dumping result, returning 15330 1726882284.81887: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-e4fe-1358-00000000005c] 15330 1726882284.81890: sending task result for task 12673a56-9f93-e4fe-1358-00000000005c 15330 1726882284.81964: done sending task result for task 12673a56-9f93-e4fe-1358-00000000005c 15330 1726882284.81967: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882284.82017: no more pending results, 
returning what we have 15330 1726882284.82020: results queue empty 15330 1726882284.82021: checking for any_errors_fatal 15330 1726882284.82029: done checking for any_errors_fatal 15330 1726882284.82030: checking for max_fail_percentage 15330 1726882284.82032: done checking for max_fail_percentage 15330 1726882284.82033: checking to see if all hosts have failed and the running result is not ok 15330 1726882284.82034: done checking to see if all hosts have failed 15330 1726882284.82035: getting the remaining hosts for this loop 15330 1726882284.82036: done getting the remaining hosts for this loop 15330 1726882284.82040: getting the next task for host managed_node3 15330 1726882284.82047: done getting next task for host managed_node3 15330 1726882284.82050: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15330 1726882284.82053: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882284.82067: getting variables 15330 1726882284.82069: in VariableManager get_vars() 15330 1726882284.82109: Calling all_inventory to load vars for managed_node3 15330 1726882284.82112: Calling groups_inventory to load vars for managed_node3 15330 1726882284.82114: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882284.82127: Calling all_plugins_play to load vars for managed_node3 15330 1726882284.82130: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882284.82134: Calling groups_plugins_play to load vars for managed_node3 15330 1726882284.83642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882284.85183: done with get_vars() 15330 1726882284.85210: done getting variables 15330 1726882284.85268: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:31:24 -0400 (0:00:00.049) 0:00:34.059 ****** 15330 1726882284.85302: entering _queue_task() for managed_node3/fail 15330 1726882284.85730: worker is 1 (out of 1 available) 15330 1726882284.85741: exiting _queue_task() for managed_node3/fail 15330 1726882284.85752: done queuing things up, now waiting for results queue to drain 15330 1726882284.85754: waiting for pending results... 
15330 1726882284.86099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15330 1726882284.86104: in run() - task 12673a56-9f93-e4fe-1358-00000000005d 15330 1726882284.86108: variable 'ansible_search_path' from source: unknown 15330 1726882284.86111: variable 'ansible_search_path' from source: unknown 15330 1726882284.86122: calling self._execute() 15330 1726882284.86230: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882284.86242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882284.86256: variable 'omit' from source: magic vars 15330 1726882284.86630: variable 'ansible_distribution_major_version' from source: facts 15330 1726882284.86645: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882284.86769: variable 'network_state' from source: role '' defaults 15330 1726882284.86784: Evaluated conditional (network_state != {}): False 15330 1726882284.86791: when evaluation is False, skipping this task 15330 1726882284.86801: _execute() done 15330 1726882284.86809: dumping result to json 15330 1726882284.86818: done dumping result, returning 15330 1726882284.86830: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-e4fe-1358-00000000005d] 15330 1726882284.86873: sending task result for task 12673a56-9f93-e4fe-1358-00000000005d 15330 1726882284.86944: done sending task result for task 12673a56-9f93-e4fe-1358-00000000005d 15330 1726882284.86947: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882284.87025: no more pending results, returning what we have 15330 
1726882284.87029: results queue empty 15330 1726882284.87030: checking for any_errors_fatal 15330 1726882284.87038: done checking for any_errors_fatal 15330 1726882284.87039: checking for max_fail_percentage 15330 1726882284.87041: done checking for max_fail_percentage 15330 1726882284.87042: checking to see if all hosts have failed and the running result is not ok 15330 1726882284.87043: done checking to see if all hosts have failed 15330 1726882284.87043: getting the remaining hosts for this loop 15330 1726882284.87045: done getting the remaining hosts for this loop 15330 1726882284.87050: getting the next task for host managed_node3 15330 1726882284.87056: done getting next task for host managed_node3 15330 1726882284.87060: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15330 1726882284.87062: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882284.87077: getting variables 15330 1726882284.87078: in VariableManager get_vars() 15330 1726882284.87119: Calling all_inventory to load vars for managed_node3 15330 1726882284.87122: Calling groups_inventory to load vars for managed_node3 15330 1726882284.87124: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882284.87137: Calling all_plugins_play to load vars for managed_node3 15330 1726882284.87140: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882284.87142: Calling groups_plugins_play to load vars for managed_node3 15330 1726882284.88808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882284.90284: done with get_vars() 15330 1726882284.90314: done getting variables 15330 1726882284.90376: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:31:24 -0400 (0:00:00.051) 0:00:34.110 ****** 15330 1726882284.90412: entering _queue_task() for managed_node3/fail 15330 1726882284.90760: worker is 1 (out of 1 available) 15330 1726882284.90773: exiting _queue_task() for managed_node3/fail 15330 1726882284.90785: done queuing things up, now waiting for results queue to drain 15330 1726882284.90787: waiting for pending results... 
15330 1726882284.91214: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15330 1726882284.91220: in run() - task 12673a56-9f93-e4fe-1358-00000000005e 15330 1726882284.91226: variable 'ansible_search_path' from source: unknown 15330 1726882284.91234: variable 'ansible_search_path' from source: unknown 15330 1726882284.91277: calling self._execute() 15330 1726882284.91390: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882284.91406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882284.91427: variable 'omit' from source: magic vars 15330 1726882284.91819: variable 'ansible_distribution_major_version' from source: facts 15330 1726882284.91835: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882284.92016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882284.94468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882284.94581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882284.94615: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882284.94728: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882284.94829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882284.95063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882284.95159: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882284.95162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882284.95212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882284.95288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882284.95517: variable 'ansible_distribution_major_version' from source: facts 15330 1726882284.95538: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15330 1726882284.95816: variable 'ansible_distribution' from source: facts 15330 1726882284.95827: variable '__network_rh_distros' from source: role '' defaults 15330 1726882284.95842: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15330 1726882284.96221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882284.96261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882284.96290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 
1726882284.96336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882284.96365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882284.96423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882284.96450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882284.96488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882284.96549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882284.96577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882284.96625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882284.96683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15330 1726882284.96689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882284.96734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882284.96752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882284.97085: variable 'network_connections' from source: play vars 15330 1726882284.97119: variable 'profile' from source: play vars 15330 1726882284.97228: variable 'profile' from source: play vars 15330 1726882284.97231: variable 'interface' from source: set_fact 15330 1726882284.97264: variable 'interface' from source: set_fact 15330 1726882284.97280: variable 'network_state' from source: role '' defaults 15330 1726882284.97359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882284.97541: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882284.97591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882284.97632: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882284.97698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882284.97722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882284.97756: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882284.97881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882284.97885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882284.97888: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15330 1726882284.97890: when evaluation is False, skipping this task 15330 1726882284.97892: _execute() done 15330 1726882284.97896: dumping result to json 15330 1726882284.97898: done dumping result, returning 15330 1726882284.97900: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-e4fe-1358-00000000005e] 15330 1726882284.97903: sending task result for task 12673a56-9f93-e4fe-1358-00000000005e 15330 1726882284.98204: done sending task result for task 12673a56-9f93-e4fe-1358-00000000005e 15330 1726882284.98208: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15330 
1726882284.98256: no more pending results, returning what we have 15330 1726882284.98259: results queue empty 15330 1726882284.98260: checking for any_errors_fatal 15330 1726882284.98265: done checking for any_errors_fatal 15330 1726882284.98266: checking for max_fail_percentage 15330 1726882284.98267: done checking for max_fail_percentage 15330 1726882284.98268: checking to see if all hosts have failed and the running result is not ok 15330 1726882284.98269: done checking to see if all hosts have failed 15330 1726882284.98270: getting the remaining hosts for this loop 15330 1726882284.98271: done getting the remaining hosts for this loop 15330 1726882284.98275: getting the next task for host managed_node3 15330 1726882284.98281: done getting next task for host managed_node3 15330 1726882284.98285: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15330 1726882284.98287: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882284.98301: getting variables 15330 1726882284.98303: in VariableManager get_vars() 15330 1726882284.98347: Calling all_inventory to load vars for managed_node3 15330 1726882284.98350: Calling groups_inventory to load vars for managed_node3 15330 1726882284.98352: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882284.98362: Calling all_plugins_play to load vars for managed_node3 15330 1726882284.98365: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882284.98368: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.01475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.03625: done with get_vars() 15330 1726882285.03902: done getting variables 15330 1726882285.03960: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:31:25 -0400 (0:00:00.135) 0:00:34.245 ****** 15330 1726882285.04151: entering _queue_task() for managed_node3/dnf 15330 1726882285.04941: worker is 1 (out of 1 available) 15330 1726882285.04952: exiting _queue_task() for managed_node3/dnf 15330 1726882285.04965: done queuing things up, now waiting for results queue to drain 15330 1726882285.04966: waiting for pending results... 
15330 1726882285.05485: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15330 1726882285.05966: in run() - task 12673a56-9f93-e4fe-1358-00000000005f 15330 1726882285.05970: variable 'ansible_search_path' from source: unknown 15330 1726882285.05973: variable 'ansible_search_path' from source: unknown 15330 1726882285.05975: calling self._execute() 15330 1726882285.06184: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.06402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.06406: variable 'omit' from source: magic vars 15330 1726882285.07053: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.07069: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.07584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882285.10250: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882285.10383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882285.10633: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882285.10730: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882285.10778: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882285.10935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.11186: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.11190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.11194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.11403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.11535: variable 'ansible_distribution' from source: facts 15330 1726882285.11546: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.11566: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15330 1726882285.11802: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.12122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.12151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.12184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.12313: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.12332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.12376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.12698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.12703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.12706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.12709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.12823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.12826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 
1726882285.12829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.12998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.13001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.13280: variable 'network_connections' from source: play vars 15330 1726882285.13397: variable 'profile' from source: play vars 15330 1726882285.13451: variable 'profile' from source: play vars 15330 1726882285.13483: variable 'interface' from source: set_fact 15330 1726882285.13548: variable 'interface' from source: set_fact 15330 1726882285.13766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882285.14136: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882285.14282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882285.14285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882285.14295: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882285.14348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882285.14376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882285.14417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.14451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882285.14501: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882285.14755: variable 'network_connections' from source: play vars 15330 1726882285.14764: variable 'profile' from source: play vars 15330 1726882285.14833: variable 'profile' from source: play vars 15330 1726882285.14842: variable 'interface' from source: set_fact 15330 1726882285.14909: variable 'interface' from source: set_fact 15330 1726882285.14937: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882285.14944: when evaluation is False, skipping this task 15330 1726882285.14951: _execute() done 15330 1726882285.14958: dumping result to json 15330 1726882285.14965: done dumping result, returning 15330 1726882285.14977: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-00000000005f] 15330 1726882285.14994: sending task result for task 12673a56-9f93-e4fe-1358-00000000005f 15330 1726882285.15165: done sending task result for task 12673a56-9f93-e4fe-1358-00000000005f 15330 1726882285.15168: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15330 1726882285.15256: no more pending results, returning what we have 15330 1726882285.15259: results queue empty 15330 1726882285.15260: checking for any_errors_fatal 15330 1726882285.15268: done checking for any_errors_fatal 15330 1726882285.15269: checking for max_fail_percentage 15330 1726882285.15271: done checking for max_fail_percentage 15330 1726882285.15272: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.15273: done checking to see if all hosts have failed 15330 1726882285.15273: getting the remaining hosts for this loop 15330 1726882285.15275: done getting the remaining hosts for this loop 15330 1726882285.15279: getting the next task for host managed_node3 15330 1726882285.15286: done getting next task for host managed_node3 15330 1726882285.15291: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15330 1726882285.15294: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882285.15310: getting variables 15330 1726882285.15312: in VariableManager get_vars() 15330 1726882285.15358: Calling all_inventory to load vars for managed_node3 15330 1726882285.15362: Calling groups_inventory to load vars for managed_node3 15330 1726882285.15364: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.15375: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.15378: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.15381: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.17684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.19444: done with get_vars() 15330 1726882285.19471: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15330 1726882285.19545: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:25 -0400 (0:00:00.155) 0:00:34.401 ****** 15330 1726882285.19574: entering _queue_task() for managed_node3/yum 15330 1726882285.20302: worker is 1 (out of 1 available) 15330 1726882285.20314: exiting _queue_task() for managed_node3/yum 15330 1726882285.20325: done queuing things up, now waiting for results queue to drain 15330 1726882285.20327: waiting for pending results... 
15330 1726882285.21248: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15330 1726882285.21267: in run() - task 12673a56-9f93-e4fe-1358-000000000060 15330 1726882285.21280: variable 'ansible_search_path' from source: unknown 15330 1726882285.21283: variable 'ansible_search_path' from source: unknown 15330 1726882285.21726: calling self._execute() 15330 1726882285.21833: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.21837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.21848: variable 'omit' from source: magic vars 15330 1726882285.23030: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.23042: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.23630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882285.28634: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882285.28737: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882285.28780: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882285.28945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882285.29115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882285.29198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.29337: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.29363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.29443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.29458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.29723: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.29773: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15330 1726882285.29776: when evaluation is False, skipping this task 15330 1726882285.29778: _execute() done 15330 1726882285.29781: dumping result to json 15330 1726882285.29783: done dumping result, returning 15330 1726882285.29785: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000060] 15330 1726882285.29788: sending task result for task 12673a56-9f93-e4fe-1358-000000000060 15330 1726882285.29947: done sending task result for task 12673a56-9f93-e4fe-1358-000000000060 15330 1726882285.29950: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15330 1726882285.30033: no more pending results, returning 
what we have 15330 1726882285.30037: results queue empty 15330 1726882285.30038: checking for any_errors_fatal 15330 1726882285.30044: done checking for any_errors_fatal 15330 1726882285.30045: checking for max_fail_percentage 15330 1726882285.30046: done checking for max_fail_percentage 15330 1726882285.30047: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.30048: done checking to see if all hosts have failed 15330 1726882285.30049: getting the remaining hosts for this loop 15330 1726882285.30050: done getting the remaining hosts for this loop 15330 1726882285.30053: getting the next task for host managed_node3 15330 1726882285.30059: done getting next task for host managed_node3 15330 1726882285.30062: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15330 1726882285.30064: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882285.30075: getting variables 15330 1726882285.30076: in VariableManager get_vars() 15330 1726882285.30123: Calling all_inventory to load vars for managed_node3 15330 1726882285.30126: Calling groups_inventory to load vars for managed_node3 15330 1726882285.30129: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.30138: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.30141: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.30144: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.32684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.34682: done with get_vars() 15330 1726882285.34713: done getting variables 15330 1726882285.34851: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:25 -0400 (0:00:00.153) 0:00:34.554 ****** 15330 1726882285.34881: entering _queue_task() for managed_node3/fail 15330 1726882285.35848: worker is 1 (out of 1 available) 15330 1726882285.35860: exiting _queue_task() for managed_node3/fail 15330 1726882285.35873: done queuing things up, now waiting for results queue to drain 15330 1726882285.35874: waiting for pending results... 
15330 1726882285.36925: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15330 1726882285.36931: in run() - task 12673a56-9f93-e4fe-1358-000000000061 15330 1726882285.37022: variable 'ansible_search_path' from source: unknown 15330 1726882285.37056: variable 'ansible_search_path' from source: unknown 15330 1726882285.37163: calling self._execute() 15330 1726882285.37457: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.37590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.37596: variable 'omit' from source: magic vars 15330 1726882285.38017: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.38041: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.38136: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.38266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882285.40790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882285.41143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882285.41180: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882285.41236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882285.41264: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882285.41362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15330 1726882285.41496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.41503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.41543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.41557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.41620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.41644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.41673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.41721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.41731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.41768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.41797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.41821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.41866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.41870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.42192: variable 'network_connections' from source: play vars 15330 1726882285.42198: variable 'profile' from source: play vars 15330 1726882285.42200: variable 'profile' from source: play vars 15330 1726882285.42203: variable 'interface' from source: set_fact 15330 1726882285.42224: variable 'interface' from source: set_fact 15330 1726882285.42301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882285.42461: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882285.42567: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882285.42570: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882285.42572: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882285.42600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882285.42629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882285.42664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.42688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882285.42741: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882285.43044: variable 'network_connections' from source: play vars 15330 1726882285.43049: variable 'profile' from source: play vars 15330 1726882285.43133: variable 'profile' from source: play vars 15330 1726882285.43136: variable 'interface' from source: set_fact 15330 1726882285.43319: variable 'interface' from source: set_fact 15330 1726882285.43321: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882285.43323: when evaluation is False, skipping this task 15330 1726882285.43324: _execute() done 15330 1726882285.43326: dumping result to json 15330 1726882285.43328: done dumping result, returning 15330 1726882285.43329: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000061] 15330 1726882285.43337: sending task result for task 12673a56-9f93-e4fe-1358-000000000061 15330 1726882285.43399: done sending task result for task 12673a56-9f93-e4fe-1358-000000000061 15330 1726882285.43402: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15330 1726882285.43469: no more pending results, returning what we have 15330 1726882285.43472: results queue empty 15330 1726882285.43473: checking for any_errors_fatal 15330 1726882285.43480: done checking for any_errors_fatal 15330 1726882285.43480: checking for max_fail_percentage 15330 1726882285.43482: done checking for max_fail_percentage 15330 1726882285.43483: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.43483: done checking to see if all hosts have failed 15330 1726882285.43484: getting the remaining hosts for this loop 15330 1726882285.43485: done getting the remaining hosts for this loop 15330 1726882285.43489: getting the next task for host managed_node3 15330 1726882285.43522: done getting next task for host managed_node3 15330 1726882285.43536: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15330 1726882285.43538: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882285.43550: getting variables 15330 1726882285.43552: in VariableManager get_vars() 15330 1726882285.43586: Calling all_inventory to load vars for managed_node3 15330 1726882285.43589: Calling groups_inventory to load vars for managed_node3 15330 1726882285.43591: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.43708: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.43712: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.43716: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.44964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.46704: done with get_vars() 15330 1726882285.46729: done getting variables 15330 1726882285.46805: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:25 -0400 (0:00:00.119) 0:00:34.674 ****** 15330 1726882285.46837: entering _queue_task() for managed_node3/package 15330 1726882285.47156: worker is 1 (out of 1 available) 15330 1726882285.47169: exiting _queue_task() for managed_node3/package 15330 1726882285.47181: done queuing things up, now waiting for results queue to drain 15330 1726882285.47182: waiting for pending results... 
15330 1726882285.47455: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 15330 1726882285.47532: in run() - task 12673a56-9f93-e4fe-1358-000000000062 15330 1726882285.47542: variable 'ansible_search_path' from source: unknown 15330 1726882285.47546: variable 'ansible_search_path' from source: unknown 15330 1726882285.47575: calling self._execute() 15330 1726882285.47655: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.47658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.47667: variable 'omit' from source: magic vars 15330 1726882285.47958: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.47967: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.48100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882285.48291: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882285.48325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882285.48352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882285.48420: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882285.48499: variable 'network_packages' from source: role '' defaults 15330 1726882285.48591: variable '__network_provider_setup' from source: role '' defaults 15330 1726882285.48596: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882285.48659: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882285.48668: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882285.48753: variable 
'__network_packages_default_nm' from source: role '' defaults 15330 1726882285.48989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882285.51725: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882285.51784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882285.51802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882285.51826: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882285.51844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882285.51917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.51937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.51954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.52000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.52005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 
1726882285.52114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.52121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.52124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.52199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.52202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.52519: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15330 1726882285.52627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.52651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.52676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.52733: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.52757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.52875: variable 'ansible_python' from source: facts 15330 1726882285.52878: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15330 1726882285.52990: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882285.53064: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882285.53218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.53243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.53255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.53308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.53312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.53381: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.53452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.53455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.53501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.53553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.53702: variable 'network_connections' from source: play vars 15330 1726882285.53706: variable 'profile' from source: play vars 15330 1726882285.53857: variable 'profile' from source: play vars 15330 1726882285.53864: variable 'interface' from source: set_fact 15330 1726882285.53936: variable 'interface' from source: set_fact 15330 1726882285.53974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882285.53999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882285.54055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.54080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882285.54122: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.54317: variable 'network_connections' from source: play vars 15330 1726882285.54320: variable 'profile' from source: play vars 15330 1726882285.54410: variable 'profile' from source: play vars 15330 1726882285.54416: variable 'interface' from source: set_fact 15330 1726882285.54465: variable 'interface' from source: set_fact 15330 1726882285.54494: variable '__network_packages_default_wireless' from source: role '' defaults 15330 1726882285.54557: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.54788: variable 'network_connections' from source: play vars 15330 1726882285.54791: variable 'profile' from source: play vars 15330 1726882285.54853: variable 'profile' from source: play vars 15330 1726882285.54856: variable 'interface' from source: set_fact 15330 1726882285.54952: variable 'interface' from source: set_fact 15330 1726882285.54996: variable '__network_packages_default_team' from source: role '' defaults 15330 1726882285.55063: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882285.55330: variable 'network_connections' from source: play vars 15330 1726882285.55336: variable 'profile' from source: play vars 15330 1726882285.55395: variable 'profile' from source: play vars 15330 1726882285.55398: variable 'interface' from source: set_fact 15330 1726882285.55466: variable 'interface' from source: set_fact 15330 1726882285.55507: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882285.55550: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882285.55553: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882285.55607: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882285.55782: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15330 1726882285.56123: variable 'network_connections' from source: play vars 15330 1726882285.56126: variable 'profile' from source: play vars 15330 1726882285.56169: variable 'profile' from source: play vars 15330 1726882285.56172: variable 'interface' from source: set_fact 15330 1726882285.56221: variable 'interface' from source: set_fact 15330 1726882285.56228: variable 'ansible_distribution' from source: facts 15330 1726882285.56230: variable '__network_rh_distros' from source: role '' defaults 15330 1726882285.56233: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.56247: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15330 1726882285.56388: variable 'ansible_distribution' from source: facts 15330 1726882285.56392: variable '__network_rh_distros' from source: role '' defaults 15330 1726882285.56396: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.56406: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15330 1726882285.56656: variable 'ansible_distribution' from source: facts 15330 1726882285.56659: variable '__network_rh_distros' from source: role '' defaults 15330 1726882285.56664: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.56670: variable 'network_provider' from source: set_fact 15330 1726882285.56672: variable 'ansible_facts' from source: unknown 15330 1726882285.57439: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15330 
1726882285.57442: when evaluation is False, skipping this task 15330 1726882285.57444: _execute() done 15330 1726882285.57447: dumping result to json 15330 1726882285.57449: done dumping result, returning 15330 1726882285.57451: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-e4fe-1358-000000000062] 15330 1726882285.57453: sending task result for task 12673a56-9f93-e4fe-1358-000000000062 15330 1726882285.57527: done sending task result for task 12673a56-9f93-e4fe-1358-000000000062 15330 1726882285.57530: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15330 1726882285.57615: no more pending results, returning what we have 15330 1726882285.57618: results queue empty 15330 1726882285.57619: checking for any_errors_fatal 15330 1726882285.57628: done checking for any_errors_fatal 15330 1726882285.57629: checking for max_fail_percentage 15330 1726882285.57631: done checking for max_fail_percentage 15330 1726882285.57631: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.57632: done checking to see if all hosts have failed 15330 1726882285.57634: getting the remaining hosts for this loop 15330 1726882285.57636: done getting the remaining hosts for this loop 15330 1726882285.57640: getting the next task for host managed_node3 15330 1726882285.57647: done getting next task for host managed_node3 15330 1726882285.57651: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15330 1726882285.57652: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15330 1726882285.57669: getting variables 15330 1726882285.57671: in VariableManager get_vars() 15330 1726882285.57861: Calling all_inventory to load vars for managed_node3 15330 1726882285.57863: Calling groups_inventory to load vars for managed_node3 15330 1726882285.57864: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.57875: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.57876: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.57878: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.59175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.60083: done with get_vars() 15330 1726882285.60101: done getting variables 15330 1726882285.60144: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:31:25 -0400 (0:00:00.133) 0:00:34.807 ****** 15330 1726882285.60166: entering _queue_task() for managed_node3/package 15330 1726882285.60451: worker is 1 (out of 1 available) 15330 1726882285.60464: exiting _queue_task() for managed_node3/package 15330 1726882285.60477: done queuing things up, now waiting for results queue to drain 15330 1726882285.60481: waiting for pending results... 
15330 1726882285.60726: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15330 1726882285.60815: in run() - task 12673a56-9f93-e4fe-1358-000000000063 15330 1726882285.60826: variable 'ansible_search_path' from source: unknown 15330 1726882285.60830: variable 'ansible_search_path' from source: unknown 15330 1726882285.60859: calling self._execute() 15330 1726882285.60968: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.60974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.60984: variable 'omit' from source: magic vars 15330 1726882285.61346: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.61351: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.61494: variable 'network_state' from source: role '' defaults 15330 1726882285.61499: Evaluated conditional (network_state != {}): False 15330 1726882285.61502: when evaluation is False, skipping this task 15330 1726882285.61505: _execute() done 15330 1726882285.61511: dumping result to json 15330 1726882285.61514: done dumping result, returning 15330 1726882285.61530: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-e4fe-1358-000000000063] 15330 1726882285.61534: sending task result for task 12673a56-9f93-e4fe-1358-000000000063 15330 1726882285.61634: done sending task result for task 12673a56-9f93-e4fe-1358-000000000063 15330 1726882285.61637: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882285.61694: no more pending results, returning what we have 15330 1726882285.61698: results queue empty 15330 1726882285.61699: checking 
for any_errors_fatal 15330 1726882285.61706: done checking for any_errors_fatal 15330 1726882285.61707: checking for max_fail_percentage 15330 1726882285.61708: done checking for max_fail_percentage 15330 1726882285.61709: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.61710: done checking to see if all hosts have failed 15330 1726882285.61710: getting the remaining hosts for this loop 15330 1726882285.61712: done getting the remaining hosts for this loop 15330 1726882285.61718: getting the next task for host managed_node3 15330 1726882285.61727: done getting next task for host managed_node3 15330 1726882285.61731: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15330 1726882285.61733: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882285.61757: getting variables 15330 1726882285.61759: in VariableManager get_vars() 15330 1726882285.61794: Calling all_inventory to load vars for managed_node3 15330 1726882285.61797: Calling groups_inventory to load vars for managed_node3 15330 1726882285.61804: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.61814: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.61830: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.61838: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.63150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.64594: done with get_vars() 15330 1726882285.64615: done getting variables 15330 1726882285.64658: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:31:25 -0400 (0:00:00.045) 0:00:34.852 ****** 15330 1726882285.64682: entering _queue_task() for managed_node3/package 15330 1726882285.64963: worker is 1 (out of 1 available) 15330 1726882285.64977: exiting _queue_task() for managed_node3/package 15330 1726882285.64990: done queuing things up, now waiting for results queue to drain 15330 1726882285.64992: waiting for pending results... 
15330 1726882285.65206: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15330 1726882285.65371: in run() - task 12673a56-9f93-e4fe-1358-000000000064 15330 1726882285.65375: variable 'ansible_search_path' from source: unknown 15330 1726882285.65378: variable 'ansible_search_path' from source: unknown 15330 1726882285.65402: calling self._execute() 15330 1726882285.65481: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.65488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.65497: variable 'omit' from source: magic vars 15330 1726882285.65777: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.65788: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.65874: variable 'network_state' from source: role '' defaults 15330 1726882285.65883: Evaluated conditional (network_state != {}): False 15330 1726882285.65888: when evaluation is False, skipping this task 15330 1726882285.65892: _execute() done 15330 1726882285.65896: dumping result to json 15330 1726882285.65899: done dumping result, returning 15330 1726882285.65904: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-e4fe-1358-000000000064] 15330 1726882285.65906: sending task result for task 12673a56-9f93-e4fe-1358-000000000064 15330 1726882285.65996: done sending task result for task 12673a56-9f93-e4fe-1358-000000000064 15330 1726882285.65999: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882285.66073: no more pending results, returning what we have 15330 1726882285.66076: results queue empty 15330 1726882285.66077: checking for 
any_errors_fatal 15330 1726882285.66088: done checking for any_errors_fatal 15330 1726882285.66088: checking for max_fail_percentage 15330 1726882285.66090: done checking for max_fail_percentage 15330 1726882285.66091: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.66092: done checking to see if all hosts have failed 15330 1726882285.66092: getting the remaining hosts for this loop 15330 1726882285.66095: done getting the remaining hosts for this loop 15330 1726882285.66098: getting the next task for host managed_node3 15330 1726882285.66104: done getting next task for host managed_node3 15330 1726882285.66108: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15330 1726882285.66109: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882285.66122: getting variables 15330 1726882285.66123: in VariableManager get_vars() 15330 1726882285.66155: Calling all_inventory to load vars for managed_node3 15330 1726882285.66158: Calling groups_inventory to load vars for managed_node3 15330 1726882285.66160: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.66168: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.66170: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.66172: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.67155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.68080: done with get_vars() 15330 1726882285.68098: done getting variables 15330 1726882285.68137: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:31:25 -0400 (0:00:00.034) 0:00:34.887 ****** 15330 1726882285.68158: entering _queue_task() for managed_node3/service 15330 1726882285.68396: worker is 1 (out of 1 available) 15330 1726882285.68409: exiting _queue_task() for managed_node3/service 15330 1726882285.68421: done queuing things up, now waiting for results queue to drain 15330 1726882285.68423: waiting for pending results... 
15330 1726882285.68590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15330 1726882285.68658: in run() - task 12673a56-9f93-e4fe-1358-000000000065 15330 1726882285.68670: variable 'ansible_search_path' from source: unknown 15330 1726882285.68674: variable 'ansible_search_path' from source: unknown 15330 1726882285.68705: calling self._execute() 15330 1726882285.68794: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.68798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.68805: variable 'omit' from source: magic vars 15330 1726882285.69136: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.69146: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.69245: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.69402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882285.71072: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882285.71121: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882285.71160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882285.71197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882285.71231: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882285.71316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15330 1726882285.71363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.71379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.71412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.71422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.71482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.71514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.71617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.71620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.71623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.71659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.71683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.71728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.71757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.71761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.72154: variable 'network_connections' from source: play vars 15330 1726882285.72158: variable 'profile' from source: play vars 15330 1726882285.72231: variable 'profile' from source: play vars 15330 1726882285.72235: variable 'interface' from source: set_fact 15330 1726882285.72306: variable 'interface' from source: set_fact 15330 1726882285.72403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882285.72587: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882285.72624: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882285.72656: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882285.72687: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882285.72841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882285.72844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882285.72847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.72849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882285.72903: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882285.73125: variable 'network_connections' from source: play vars 15330 1726882285.73129: variable 'profile' from source: play vars 15330 1726882285.73195: variable 'profile' from source: play vars 15330 1726882285.73199: variable 'interface' from source: set_fact 15330 1726882285.73274: variable 'interface' from source: set_fact 15330 1726882285.73283: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15330 1726882285.73287: when evaluation is False, skipping this task 15330 1726882285.73290: _execute() done 15330 1726882285.73294: dumping result to json 15330 1726882285.73299: done dumping result, returning 15330 1726882285.73311: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-e4fe-1358-000000000065] 15330 1726882285.73321: sending task result for task 12673a56-9f93-e4fe-1358-000000000065 15330 1726882285.73458: done sending task result for task 12673a56-9f93-e4fe-1358-000000000065 15330 1726882285.73462: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15330 1726882285.73517: no more pending results, returning what we have 15330 1726882285.73520: results queue empty 15330 1726882285.73521: checking for any_errors_fatal 15330 1726882285.73527: done checking for any_errors_fatal 15330 1726882285.73527: checking for max_fail_percentage 15330 1726882285.73529: done checking for max_fail_percentage 15330 1726882285.73530: checking to see if all hosts have failed and the running result is not ok 15330 1726882285.73530: done checking to see if all hosts have failed 15330 1726882285.73531: getting the remaining hosts for this loop 15330 1726882285.73532: done getting the remaining hosts for this loop 15330 1726882285.73538: getting the next task for host managed_node3 15330 1726882285.73546: done getting next task for host managed_node3 15330 1726882285.73549: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15330 1726882285.73551: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882285.73565: getting variables 15330 1726882285.73566: in VariableManager get_vars() 15330 1726882285.73603: Calling all_inventory to load vars for managed_node3 15330 1726882285.73606: Calling groups_inventory to load vars for managed_node3 15330 1726882285.73608: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882285.73617: Calling all_plugins_play to load vars for managed_node3 15330 1726882285.73621: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882285.73624: Calling groups_plugins_play to load vars for managed_node3 15330 1726882285.75472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882285.78026: done with get_vars() 15330 1726882285.78063: done getting variables 15330 1726882285.78254: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:31:25 -0400 (0:00:00.101) 0:00:34.988 ****** 15330 1726882285.78292: entering _queue_task() for managed_node3/service 15330 1726882285.79297: worker is 1 (out of 1 available) 15330 1726882285.79310: exiting _queue_task() for managed_node3/service 15330 1726882285.79321: done queuing things up, now waiting for results queue to drain 15330 1726882285.79322: waiting for pending results... 
15330 1726882285.79525: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15330 1726882285.79577: in run() - task 12673a56-9f93-e4fe-1358-000000000066 15330 1726882285.79588: variable 'ansible_search_path' from source: unknown 15330 1726882285.79607: variable 'ansible_search_path' from source: unknown 15330 1726882285.79634: calling self._execute() 15330 1726882285.79741: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.79751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.79757: variable 'omit' from source: magic vars 15330 1726882285.80165: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.80174: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882285.80289: variable 'network_provider' from source: set_fact 15330 1726882285.80299: variable 'network_state' from source: role '' defaults 15330 1726882285.80307: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15330 1726882285.80312: variable 'omit' from source: magic vars 15330 1726882285.80346: variable 'omit' from source: magic vars 15330 1726882285.80378: variable 'network_service_name' from source: role '' defaults 15330 1726882285.80430: variable 'network_service_name' from source: role '' defaults 15330 1726882285.80509: variable '__network_provider_setup' from source: role '' defaults 15330 1726882285.80512: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882285.80560: variable '__network_service_name_default_nm' from source: role '' defaults 15330 1726882285.80568: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882285.80632: variable '__network_packages_default_nm' from source: role '' defaults 15330 1726882285.80786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15330 1726882285.82779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882285.82839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882285.82952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882285.82955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882285.82958: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882285.83099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.83104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.83106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.83135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.83155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.83205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15330 1726882285.83239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.83267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.83338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.83358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.83615: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15330 1726882285.83823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.83851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.83908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.83945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.83990: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.84098: variable 'ansible_python' from source: facts 15330 1726882285.84192: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15330 1726882285.84242: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882285.84340: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882285.84494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.84529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.84559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.84615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.84799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.84806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882285.84819: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882285.84822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.84824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882285.84826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882285.85059: variable 'network_connections' from source: play vars 15330 1726882285.85072: variable 'profile' from source: play vars 15330 1726882285.85220: variable 'profile' from source: play vars 15330 1726882285.85241: variable 'interface' from source: set_fact 15330 1726882285.85320: variable 'interface' from source: set_fact 15330 1726882285.85500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882285.85672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882285.85742: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882285.85790: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882285.85856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882285.85943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882285.85991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882285.86049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882285.86103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882285.86245: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.86451: variable 'network_connections' from source: play vars 15330 1726882285.86455: variable 'profile' from source: play vars 15330 1726882285.86517: variable 'profile' from source: play vars 15330 1726882285.86521: variable 'interface' from source: set_fact 15330 1726882285.86562: variable 'interface' from source: set_fact 15330 1726882285.86615: variable '__network_packages_default_wireless' from source: role '' defaults 15330 1726882285.86660: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882285.86950: variable 'network_connections' from source: play vars 15330 1726882285.86953: variable 'profile' from source: play vars 15330 1726882285.86996: variable 'profile' from source: play vars 15330 1726882285.87000: variable 'interface' from source: set_fact 15330 1726882285.87060: variable 'interface' from source: set_fact 15330 1726882285.87080: variable '__network_packages_default_team' from source: role '' defaults 15330 1726882285.87148: variable '__network_team_connections_defined' from source: role '' defaults 15330 1726882285.87389: variable 
'network_connections' from source: play vars 15330 1726882285.87392: variable 'profile' from source: play vars 15330 1726882285.87440: variable 'profile' from source: play vars 15330 1726882285.87444: variable 'interface' from source: set_fact 15330 1726882285.87501: variable 'interface' from source: set_fact 15330 1726882285.87538: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882285.87579: variable '__network_service_name_default_initscripts' from source: role '' defaults 15330 1726882285.87586: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882285.87648: variable '__network_packages_default_initscripts' from source: role '' defaults 15330 1726882285.87777: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15330 1726882285.88172: variable 'network_connections' from source: play vars 15330 1726882285.88176: variable 'profile' from source: play vars 15330 1726882285.88219: variable 'profile' from source: play vars 15330 1726882285.88222: variable 'interface' from source: set_fact 15330 1726882285.88290: variable 'interface' from source: set_fact 15330 1726882285.88301: variable 'ansible_distribution' from source: facts 15330 1726882285.88304: variable '__network_rh_distros' from source: role '' defaults 15330 1726882285.88309: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.88319: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15330 1726882285.88448: variable 'ansible_distribution' from source: facts 15330 1726882285.88451: variable '__network_rh_distros' from source: role '' defaults 15330 1726882285.88459: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.88472: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15330 1726882285.88611: variable 'ansible_distribution' from source: 
facts 15330 1726882285.88615: variable '__network_rh_distros' from source: role '' defaults 15330 1726882285.88618: variable 'ansible_distribution_major_version' from source: facts 15330 1726882285.88657: variable 'network_provider' from source: set_fact 15330 1726882285.88673: variable 'omit' from source: magic vars 15330 1726882285.88700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882285.88721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882285.88737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882285.88749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882285.88772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882285.88809: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882285.88812: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.88815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.88882: Set connection var ansible_pipelining to False 15330 1726882285.88896: Set connection var ansible_timeout to 10 15330 1726882285.88899: Set connection var ansible_connection to ssh 15330 1726882285.88901: Set connection var ansible_shell_type to sh 15330 1726882285.88908: Set connection var ansible_shell_executable to /bin/sh 15330 1726882285.88914: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882285.88932: variable 'ansible_shell_executable' from source: unknown 15330 1726882285.88935: variable 'ansible_connection' from source: unknown 15330 1726882285.88937: variable 'ansible_module_compression' from source: unknown 15330 1726882285.88939: 
variable 'ansible_shell_type' from source: unknown 15330 1726882285.88942: variable 'ansible_shell_executable' from source: unknown 15330 1726882285.88944: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882285.88951: variable 'ansible_pipelining' from source: unknown 15330 1726882285.88953: variable 'ansible_timeout' from source: unknown 15330 1726882285.88955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882285.89043: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882285.89057: variable 'omit' from source: magic vars 15330 1726882285.89060: starting attempt loop 15330 1726882285.89062: running the handler 15330 1726882285.89173: variable 'ansible_facts' from source: unknown 15330 1726882285.90109: _low_level_execute_command(): starting 15330 1726882285.90112: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882285.90567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882285.90598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882285.90601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882285.90604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 15330 1726882285.90607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882285.90674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882285.90678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882285.90708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882285.90762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882285.92420: stdout chunk (state=3): >>>/root <<< 15330 1726882285.92531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882285.92584: stderr chunk (state=3): >>><<< 15330 1726882285.92602: stdout chunk (state=3): >>><<< 15330 1726882285.92728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882285.92743: _low_level_execute_command(): starting 15330 1726882285.92747: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670 `" && echo ansible-tmp-1726882285.9263837-16897-41119362985670="` echo /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670 `" ) && sleep 0' 15330 1726882285.93309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882285.93315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882285.93320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882285.93323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882285.93325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882285.93327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882285.93417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882285.93421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882285.93470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882285.95801: stdout chunk (state=3): >>>ansible-tmp-1726882285.9263837-16897-41119362985670=/root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670 <<< 15330 1726882285.95805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882285.95807: stdout chunk (state=3): >>><<< 15330 1726882285.95809: stderr chunk (state=3): >>><<< 15330 1726882285.95811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882285.9263837-16897-41119362985670=/root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882285.95813: variable 'ansible_module_compression' from source: unknown 15330 1726882285.95815: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15330 1726882285.95923: variable 'ansible_facts' from source: unknown 15330 1726882285.96311: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py 15330 1726882285.96558: Sending initial data 15330 1726882285.96567: Sent initial data (155 bytes) 15330 1726882285.97126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882285.97150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882285.97226: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882285.98831: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882285.98834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882285.98879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp_f40r58d /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py <<< 15330 1726882285.98883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py" <<< 15330 1726882285.98971: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp_f40r58d" to remote "/root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py" <<< 15330 1726882286.02250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.02260: stdout chunk (state=3): >>><<< 15330 1726882286.02263: stderr chunk (state=3): 
>>><<< 15330 1726882286.02365: done transferring module to remote 15330 1726882286.02409: _low_level_execute_command(): starting 15330 1726882286.02414: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/ /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py && sleep 0' 15330 1726882286.03709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882286.03913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882286.03998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.04003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.05752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.05756: stdout chunk (state=3): >>><<< 15330 1726882286.05759: stderr chunk (state=3): >>><<< 15330 1726882286.05799: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882286.05803: _low_level_execute_command(): starting 15330 1726882286.05805: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/AnsiballZ_systemd.py && sleep 0' 15330 1726882286.06527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882286.06532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882286.06539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.06615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.35018: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", 
"ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10448896", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306835968", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1222297000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 15330 1726882286.35054: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": 
"0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system<<< 15330 1726882286.35064: stdout chunk (state=3): >>>.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", 
"InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15330 1726882286.36680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882286.36734: stderr chunk (state=3): >>><<< 15330 1726882286.36740: stdout chunk (state=3): >>><<< 15330 1726882286.36784: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10448896", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306835968", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1222297000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
15330 1726882286.36979: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882286.36984: _low_level_execute_command(): starting 15330 1726882286.36986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882285.9263837-16897-41119362985670/ > /dev/null 2>&1 && sleep 0' 15330 1726882286.37631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882286.37637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882286.37639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.37642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882286.37644: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882286.37647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.37660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.37711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.39488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.39491: stderr chunk (state=3): >>><<< 15330 1726882286.39496: stdout chunk (state=3): >>><<< 15330 1726882286.39509: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882286.39515: handler run complete 15330 1726882286.39555: attempt loop complete, returning result 15330 1726882286.39558: _execute() done 15330 1726882286.39560: dumping result to json 15330 1726882286.39572: done dumping result, returning 15330 1726882286.39582: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-e4fe-1358-000000000066] 15330 1726882286.39586: sending task result for task 12673a56-9f93-e4fe-1358-000000000066 15330 1726882286.39784: done sending task result for task 12673a56-9f93-e4fe-1358-000000000066 15330 1726882286.39790: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882286.39841: no more pending results, returning what we have 15330 1726882286.39845: results queue empty 15330 1726882286.39846: checking for any_errors_fatal 15330 1726882286.39857: done checking for any_errors_fatal 15330 1726882286.39858: checking for max_fail_percentage 15330 1726882286.39859: done checking for max_fail_percentage 15330 1726882286.39860: checking to see if all hosts have failed and the running result is not ok 15330 1726882286.39861: done checking to see if all hosts have failed 15330 1726882286.39861: getting the remaining hosts for this loop 15330 1726882286.39862: done getting the remaining hosts for this loop 15330 1726882286.39866: getting the next task for host managed_node3 15330 1726882286.39872: done getting next task for host managed_node3 15330 1726882286.39876: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15330 1726882286.39878: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882286.39902: getting variables 15330 1726882286.39904: in VariableManager get_vars() 15330 1726882286.39973: Calling all_inventory to load vars for managed_node3 15330 1726882286.39976: Calling groups_inventory to load vars for managed_node3 15330 1726882286.39978: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882286.39990: Calling all_plugins_play to load vars for managed_node3 15330 1726882286.39996: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882286.39999: Calling groups_plugins_play to load vars for managed_node3 15330 1726882286.40783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882286.41649: done with get_vars() 15330 1726882286.41665: done getting variables 15330 1726882286.41709: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:26 -0400 (0:00:00.634) 0:00:35.623 ****** 15330 1726882286.41732: entering _queue_task() for managed_node3/service 15330 1726882286.41962: worker is 1 (out of 1 available) 15330 1726882286.41976: exiting _queue_task() for managed_node3/service 15330 1726882286.41987: done queuing things up, now waiting for results queue to drain 15330 1726882286.41988: waiting for pending results... 
15330 1726882286.42190: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15330 1726882286.42255: in run() - task 12673a56-9f93-e4fe-1358-000000000067 15330 1726882286.42267: variable 'ansible_search_path' from source: unknown 15330 1726882286.42270: variable 'ansible_search_path' from source: unknown 15330 1726882286.42300: calling self._execute() 15330 1726882286.42375: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882286.42379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882286.42391: variable 'omit' from source: magic vars 15330 1726882286.42657: variable 'ansible_distribution_major_version' from source: facts 15330 1726882286.42668: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882286.42749: variable 'network_provider' from source: set_fact 15330 1726882286.42753: Evaluated conditional (network_provider == "nm"): True 15330 1726882286.42819: variable '__network_wpa_supplicant_required' from source: role '' defaults 15330 1726882286.42881: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15330 1726882286.43001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882286.44889: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882286.44966: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882286.45013: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882286.45058: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882286.45072: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882286.45157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882286.45195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882286.45236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882286.45267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882286.45278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882286.45379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882286.45382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882286.45385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882286.45448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882286.45459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882286.45498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882286.45515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882286.45540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882286.45620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882286.45624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882286.45746: variable 'network_connections' from source: play vars 15330 1726882286.45768: variable 'profile' from source: play vars 15330 1726882286.45828: variable 'profile' from source: play vars 15330 1726882286.45831: variable 'interface' from source: set_fact 15330 1726882286.45948: variable 'interface' from source: set_fact 15330 1726882286.45982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15330 1726882286.46155: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15330 1726882286.46221: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15330 1726882286.46295: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15330 1726882286.46298: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15330 1726882286.46318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15330 1726882286.46336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15330 1726882286.46354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882286.46374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15330 1726882286.46461: variable '__network_wireless_connections_defined' from source: role '' defaults 15330 1726882286.46710: variable 'network_connections' from source: play vars 15330 1726882286.46714: variable 'profile' from source: play vars 15330 1726882286.46758: variable 'profile' from source: play vars 15330 1726882286.46854: variable 'interface' from source: set_fact 15330 1726882286.46857: variable 'interface' from source: set_fact 15330 1726882286.46872: Evaluated conditional (__network_wpa_supplicant_required): False 15330 1726882286.46875: when evaluation is False, skipping this task 15330 1726882286.46877: _execute() done 15330 1726882286.46886: dumping result 
to json 15330 1726882286.46888: done dumping result, returning 15330 1726882286.46899: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-e4fe-1358-000000000067] 15330 1726882286.46901: sending task result for task 12673a56-9f93-e4fe-1358-000000000067 15330 1726882286.46977: done sending task result for task 12673a56-9f93-e4fe-1358-000000000067 15330 1726882286.46979: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15330 1726882286.47068: no more pending results, returning what we have 15330 1726882286.47071: results queue empty 15330 1726882286.47072: checking for any_errors_fatal 15330 1726882286.47094: done checking for any_errors_fatal 15330 1726882286.47095: checking for max_fail_percentage 15330 1726882286.47097: done checking for max_fail_percentage 15330 1726882286.47097: checking to see if all hosts have failed and the running result is not ok 15330 1726882286.47098: done checking to see if all hosts have failed 15330 1726882286.47099: getting the remaining hosts for this loop 15330 1726882286.47100: done getting the remaining hosts for this loop 15330 1726882286.47105: getting the next task for host managed_node3 15330 1726882286.47114: done getting next task for host managed_node3 15330 1726882286.47118: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15330 1726882286.47120: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882286.47136: getting variables 15330 1726882286.47137: in VariableManager get_vars() 15330 1726882286.47172: Calling all_inventory to load vars for managed_node3 15330 1726882286.47174: Calling groups_inventory to load vars for managed_node3 15330 1726882286.47176: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882286.47189: Calling all_plugins_play to load vars for managed_node3 15330 1726882286.47191: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882286.47232: Calling groups_plugins_play to load vars for managed_node3 15330 1726882286.48416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882286.49528: done with get_vars() 15330 1726882286.49543: done getting variables 15330 1726882286.49586: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:31:26 -0400 (0:00:00.078) 0:00:35.702 ****** 15330 1726882286.49609: entering _queue_task() for managed_node3/service 15330 1726882286.49844: worker is 1 (out of 1 available) 15330 1726882286.49858: exiting _queue_task() for managed_node3/service 15330 1726882286.49869: done queuing things up, now waiting for results queue to drain 15330 1726882286.49871: waiting for pending results... 
15330 1726882286.50043: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 15330 1726882286.50125: in run() - task 12673a56-9f93-e4fe-1358-000000000068 15330 1726882286.50137: variable 'ansible_search_path' from source: unknown 15330 1726882286.50140: variable 'ansible_search_path' from source: unknown 15330 1726882286.50168: calling self._execute() 15330 1726882286.50245: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882286.50249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882286.50260: variable 'omit' from source: magic vars 15330 1726882286.50551: variable 'ansible_distribution_major_version' from source: facts 15330 1726882286.50554: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882286.50633: variable 'network_provider' from source: set_fact 15330 1726882286.50637: Evaluated conditional (network_provider == "initscripts"): False 15330 1726882286.50642: when evaluation is False, skipping this task 15330 1726882286.50645: _execute() done 15330 1726882286.50649: dumping result to json 15330 1726882286.50652: done dumping result, returning 15330 1726882286.50655: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-e4fe-1358-000000000068] 15330 1726882286.50667: sending task result for task 12673a56-9f93-e4fe-1358-000000000068 15330 1726882286.50739: done sending task result for task 12673a56-9f93-e4fe-1358-000000000068 15330 1726882286.50742: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15330 1726882286.50813: no more pending results, returning what we have 15330 1726882286.50816: results queue empty 15330 1726882286.50817: checking for any_errors_fatal 15330 1726882286.50824: done checking for 
any_errors_fatal 15330 1726882286.50825: checking for max_fail_percentage 15330 1726882286.50827: done checking for max_fail_percentage 15330 1726882286.50827: checking to see if all hosts have failed and the running result is not ok 15330 1726882286.50828: done checking to see if all hosts have failed 15330 1726882286.50829: getting the remaining hosts for this loop 15330 1726882286.50830: done getting the remaining hosts for this loop 15330 1726882286.50833: getting the next task for host managed_node3 15330 1726882286.50839: done getting next task for host managed_node3 15330 1726882286.50842: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15330 1726882286.50844: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882286.50856: getting variables 15330 1726882286.50857: in VariableManager get_vars() 15330 1726882286.50889: Calling all_inventory to load vars for managed_node3 15330 1726882286.50891: Calling groups_inventory to load vars for managed_node3 15330 1726882286.50895: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882286.50903: Calling all_plugins_play to load vars for managed_node3 15330 1726882286.50905: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882286.50908: Calling groups_plugins_play to load vars for managed_node3 15330 1726882286.51825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882286.52852: done with get_vars() 15330 1726882286.52866: done getting variables 15330 1726882286.52908: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:26 -0400 (0:00:00.033) 0:00:35.735 ****** 15330 1726882286.52931: entering _queue_task() for managed_node3/copy 15330 1726882286.53124: worker is 1 (out of 1 available) 15330 1726882286.53137: exiting _queue_task() for managed_node3/copy 15330 1726882286.53147: done queuing things up, now waiting for results queue to drain 15330 1726882286.53148: waiting for pending results... 
15330 1726882286.53336: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15330 1726882286.53397: in run() - task 12673a56-9f93-e4fe-1358-000000000069 15330 1726882286.53408: variable 'ansible_search_path' from source: unknown 15330 1726882286.53411: variable 'ansible_search_path' from source: unknown 15330 1726882286.53439: calling self._execute() 15330 1726882286.53517: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882286.53522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882286.53531: variable 'omit' from source: magic vars 15330 1726882286.53809: variable 'ansible_distribution_major_version' from source: facts 15330 1726882286.53814: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882286.53881: variable 'network_provider' from source: set_fact 15330 1726882286.53888: Evaluated conditional (network_provider == "initscripts"): False 15330 1726882286.53891: when evaluation is False, skipping this task 15330 1726882286.53896: _execute() done 15330 1726882286.53899: dumping result to json 15330 1726882286.53901: done dumping result, returning 15330 1726882286.53908: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-e4fe-1358-000000000069] 15330 1726882286.53912: sending task result for task 12673a56-9f93-e4fe-1358-000000000069 15330 1726882286.54000: done sending task result for task 12673a56-9f93-e4fe-1358-000000000069 15330 1726882286.54003: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15330 1726882286.54072: no more pending results, returning what we have 15330 1726882286.54076: results queue empty 15330 1726882286.54077: checking for 
any_errors_fatal 15330 1726882286.54081: done checking for any_errors_fatal 15330 1726882286.54082: checking for max_fail_percentage 15330 1726882286.54083: done checking for max_fail_percentage 15330 1726882286.54084: checking to see if all hosts have failed and the running result is not ok 15330 1726882286.54084: done checking to see if all hosts have failed 15330 1726882286.54085: getting the remaining hosts for this loop 15330 1726882286.54088: done getting the remaining hosts for this loop 15330 1726882286.54091: getting the next task for host managed_node3 15330 1726882286.54097: done getting next task for host managed_node3 15330 1726882286.54100: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15330 1726882286.54102: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882286.54114: getting variables 15330 1726882286.54116: in VariableManager get_vars() 15330 1726882286.54145: Calling all_inventory to load vars for managed_node3 15330 1726882286.54147: Calling groups_inventory to load vars for managed_node3 15330 1726882286.54149: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882286.54157: Calling all_plugins_play to load vars for managed_node3 15330 1726882286.54159: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882286.54162: Calling groups_plugins_play to load vars for managed_node3 15330 1726882286.58921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882286.59842: done with get_vars() 15330 1726882286.59859: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:26 -0400 (0:00:00.070) 0:00:35.805 ****** 15330 1726882286.59946: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15330 1726882286.60347: worker is 1 (out of 1 available) 15330 1726882286.60362: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 15330 1726882286.60377: done queuing things up, now waiting for results queue to drain 15330 1726882286.60379: waiting for pending results... 
15330 1726882286.60634: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15330 1726882286.60799: in run() - task 12673a56-9f93-e4fe-1358-00000000006a 15330 1726882286.60805: variable 'ansible_search_path' from source: unknown 15330 1726882286.60808: variable 'ansible_search_path' from source: unknown 15330 1726882286.60819: calling self._execute() 15330 1726882286.60947: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882286.60967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882286.60973: variable 'omit' from source: magic vars 15330 1726882286.61598: variable 'ansible_distribution_major_version' from source: facts 15330 1726882286.61603: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882286.61609: variable 'omit' from source: magic vars 15330 1726882286.61613: variable 'omit' from source: magic vars 15330 1726882286.61742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15330 1726882286.63703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15330 1726882286.63774: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15330 1726882286.63817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15330 1726882286.63853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15330 1726882286.63879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15330 1726882286.63956: variable 'network_provider' from source: set_fact 15330 1726882286.64318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15330 1726882286.64361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15330 1726882286.64397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15330 1726882286.64452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15330 1726882286.64478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15330 1726882286.64615: variable 'omit' from source: magic vars 15330 1726882286.64775: variable 'omit' from source: magic vars 15330 1726882286.64875: variable 'network_connections' from source: play vars 15330 1726882286.64891: variable 'profile' from source: play vars 15330 1726882286.64938: variable 'profile' from source: play vars 15330 1726882286.64942: variable 'interface' from source: set_fact 15330 1726882286.64983: variable 'interface' from source: set_fact 15330 1726882286.65090: variable 'omit' from source: magic vars 15330 1726882286.65100: variable '__lsr_ansible_managed' from source: task vars 15330 1726882286.65151: variable '__lsr_ansible_managed' from source: task vars 15330 1726882286.65271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15330 1726882286.65414: Loaded config def from plugin (lookup/template) 15330 1726882286.65419: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15330 1726882286.65444: File lookup term: get_ansible_managed.j2 15330 1726882286.65447: variable 'ansible_search_path' from source: unknown 15330 1726882286.65450: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15330 1726882286.65459: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15330 1726882286.65472: variable 'ansible_search_path' from source: unknown 15330 1726882286.70203: variable 'ansible_managed' from source: unknown 15330 1726882286.70207: variable 'omit' from source: magic vars 15330 1726882286.70209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882286.70217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882286.70239: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882286.70260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882286.70274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882286.70313: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882286.70323: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882286.70332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882286.70429: Set connection var ansible_pipelining to False 15330 1726882286.70449: Set connection var ansible_timeout to 10 15330 1726882286.70457: Set connection var ansible_connection to ssh 15330 1726882286.70465: Set connection var ansible_shell_type to sh 15330 1726882286.70476: Set connection var ansible_shell_executable to /bin/sh 15330 1726882286.70489: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882286.70517: variable 'ansible_shell_executable' from source: unknown 15330 1726882286.70526: variable 'ansible_connection' from source: unknown 15330 1726882286.70536: variable 'ansible_module_compression' from source: unknown 15330 1726882286.70544: variable 'ansible_shell_type' from source: unknown 15330 1726882286.70600: variable 'ansible_shell_executable' from source: unknown 15330 1726882286.70603: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882286.70605: variable 'ansible_pipelining' from source: unknown 15330 1726882286.70608: variable 'ansible_timeout' from source: unknown 15330 1726882286.70610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882286.70715: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882286.70738: variable 'omit' from source: magic vars 15330 1726882286.70755: starting attempt loop 15330 1726882286.70764: running the handler 15330 1726882286.70782: _low_level_execute_command(): starting 15330 1726882286.70857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882286.71521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882286.71617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.71644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882286.71662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882286.71684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.71835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.73790: stdout chunk (state=3): >>>/root <<< 15330 1726882286.73951: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.73955: stdout chunk (state=3): >>><<< 15330 1726882286.73957: stderr chunk (state=3): >>><<< 15330 1726882286.73977: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882286.74077: _low_level_execute_command(): starting 15330 1726882286.74081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675 `" && echo ansible-tmp-1726882286.7398221-16924-188825227357675="` echo /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675 `" ) && sleep 0' 15330 1726882286.74668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 
1726882286.74684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882286.74706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882286.74746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882286.74760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882286.74810: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.74869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882286.74890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882286.74916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.75000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.76818: stdout chunk (state=3): >>>ansible-tmp-1726882286.7398221-16924-188825227357675=/root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675 <<< 15330 1726882286.76978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.76981: stdout chunk (state=3): >>><<< 15330 1726882286.76984: stderr chunk (state=3): >>><<< 15330 1726882286.77099: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882286.7398221-16924-188825227357675=/root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882286.77102: variable 'ansible_module_compression' from source: unknown 15330 1726882286.77105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15330 1726882286.77144: variable 'ansible_facts' from source: unknown 15330 1726882286.77401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py 15330 1726882286.77520: Sending initial data 15330 1726882286.77524: Sent initial data (168 bytes) 15330 
1726882286.78172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882286.78207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.78289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.79949: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882286.79997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882286.80044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpzcc_xhmm /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py <<< 15330 1726882286.80228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py" <<< 15330 1726882286.80232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpzcc_xhmm" to remote "/root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py" <<< 15330 1726882286.81899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.81902: stdout chunk (state=3): >>><<< 15330 1726882286.81904: stderr chunk (state=3): >>><<< 15330 1726882286.81906: done transferring module to remote 15330 1726882286.81910: _low_level_execute_command(): starting 15330 1726882286.81918: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/ /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py && sleep 0' 15330 1726882286.82663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882286.82691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882286.82700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882286.82758: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.82814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882286.82833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882286.82846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.82929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882286.84677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882286.84736: stderr chunk (state=3): >>><<< 15330 1726882286.84739: stdout chunk (state=3): >>><<< 15330 1726882286.85200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882286.85204: _low_level_execute_command(): starting 15330 1726882286.85207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/AnsiballZ_network_connections.py && sleep 0' 15330 1726882286.86146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882286.86150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882286.86152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.86154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882286.86157: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882286.86253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882286.86305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.12607: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6r8pez0a/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6r8pez0a/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/da01a2c2-cda1-473e-9566-db3ae75e453e: error=unknown <<< 15330 1726882287.12751: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}} <<< 15330 1726882287.14602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882287.14634: stderr chunk (state=3): >>><<< 15330 1726882287.14639: stdout chunk (state=3): >>><<< 15330 1726882287.14784: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6r8pez0a/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6r8pez0a/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/da01a2c2-cda1-473e-9566-db3ae75e453e: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882287.14792: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882287.14797: _low_level_execute_command(): starting 15330 
1726882287.14800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882286.7398221-16924-188825227357675/ > /dev/null 2>&1 && sleep 0' 15330 1726882287.15401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882287.15508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882287.15522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882287.15540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.15623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.17516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882287.17519: stdout chunk (state=3): >>><<< 15330 1726882287.17522: stderr chunk (state=3): >>><<< 15330 1726882287.17537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882287.17576: handler run complete 15330 1726882287.17605: attempt loop complete, returning result 15330 1726882287.17613: _execute() done 15330 1726882287.17699: dumping result to json 15330 1726882287.17703: done dumping result, returning 15330 1726882287.17705: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-e4fe-1358-00000000006a] 15330 1726882287.17707: sending task result for task 12673a56-9f93-e4fe-1358-00000000006a 15330 1726882287.17779: done sending task result for task 12673a56-9f93-e4fe-1358-00000000006a 15330 1726882287.17782: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": 
false, "provider": "nm" } }, "changed": true } STDERR: 15330 1726882287.17890: no more pending results, returning what we have 15330 1726882287.17896: results queue empty 15330 1726882287.17897: checking for any_errors_fatal 15330 1726882287.17907: done checking for any_errors_fatal 15330 1726882287.17908: checking for max_fail_percentage 15330 1726882287.17910: done checking for max_fail_percentage 15330 1726882287.17911: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.17911: done checking to see if all hosts have failed 15330 1726882287.17912: getting the remaining hosts for this loop 15330 1726882287.17914: done getting the remaining hosts for this loop 15330 1726882287.17918: getting the next task for host managed_node3 15330 1726882287.17924: done getting next task for host managed_node3 15330 1726882287.17928: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15330 1726882287.17930: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882287.17940: getting variables 15330 1726882287.17942: in VariableManager get_vars() 15330 1726882287.17981: Calling all_inventory to load vars for managed_node3 15330 1726882287.17983: Calling groups_inventory to load vars for managed_node3 15330 1726882287.17988: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.18212: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.18222: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.18226: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.20631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.23329: done with get_vars() 15330 1726882287.23351: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:31:27 -0400 (0:00:00.635) 0:00:36.440 ****** 15330 1726882287.23451: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15330 1726882287.23914: worker is 1 (out of 1 available) 15330 1726882287.24106: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 15330 1726882287.24118: done queuing things up, now waiting for results queue to drain 15330 1726882287.24119: waiting for pending results... 
15330 1726882287.24362: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 15330 1726882287.24369: in run() - task 12673a56-9f93-e4fe-1358-00000000006b 15330 1726882287.24414: variable 'ansible_search_path' from source: unknown 15330 1726882287.24418: variable 'ansible_search_path' from source: unknown 15330 1726882287.24458: calling self._execute() 15330 1726882287.24599: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.24603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.24606: variable 'omit' from source: magic vars 15330 1726882287.25040: variable 'ansible_distribution_major_version' from source: facts 15330 1726882287.25067: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882287.25290: variable 'network_state' from source: role '' defaults 15330 1726882287.25298: Evaluated conditional (network_state != {}): False 15330 1726882287.25300: when evaluation is False, skipping this task 15330 1726882287.25302: _execute() done 15330 1726882287.25304: dumping result to json 15330 1726882287.25306: done dumping result, returning 15330 1726882287.25308: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-e4fe-1358-00000000006b] 15330 1726882287.25310: sending task result for task 12673a56-9f93-e4fe-1358-00000000006b 15330 1726882287.25381: done sending task result for task 12673a56-9f93-e4fe-1358-00000000006b skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15330 1726882287.25560: no more pending results, returning what we have 15330 1726882287.25565: results queue empty 15330 1726882287.25566: checking for any_errors_fatal 15330 1726882287.25579: done checking for any_errors_fatal 15330 1726882287.25580: checking for 
max_fail_percentage 15330 1726882287.25582: done checking for max_fail_percentage 15330 1726882287.25583: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.25584: done checking to see if all hosts have failed 15330 1726882287.25584: getting the remaining hosts for this loop 15330 1726882287.25589: done getting the remaining hosts for this loop 15330 1726882287.25595: getting the next task for host managed_node3 15330 1726882287.25604: done getting next task for host managed_node3 15330 1726882287.25611: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15330 1726882287.25616: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882287.25632: getting variables 15330 1726882287.25634: in VariableManager get_vars() 15330 1726882287.25797: Calling all_inventory to load vars for managed_node3 15330 1726882287.25800: Calling groups_inventory to load vars for managed_node3 15330 1726882287.25803: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.25810: WORKER PROCESS EXITING 15330 1726882287.25821: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.25825: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.25828: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.27676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.29327: done with get_vars() 15330 1726882287.29342: done getting variables 15330 1726882287.29389: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:27 -0400 (0:00:00.059) 0:00:36.500 ****** 15330 1726882287.29413: entering _queue_task() for managed_node3/debug 15330 1726882287.29646: worker is 1 (out of 1 available) 15330 1726882287.29661: exiting _queue_task() for managed_node3/debug 15330 1726882287.29672: done queuing things up, now waiting for results queue to drain 15330 1726882287.29673: waiting for pending results... 15330 1726882287.29858: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15330 1726882287.29935: in run() - task 12673a56-9f93-e4fe-1358-00000000006c 15330 1726882287.29945: variable 'ansible_search_path' from source: unknown 15330 1726882287.29948: variable 'ansible_search_path' from source: unknown 15330 1726882287.29977: calling self._execute() 15330 1726882287.30057: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.30061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.30071: variable 'omit' from source: magic vars 15330 1726882287.30448: variable 'ansible_distribution_major_version' from source: facts 15330 1726882287.30460: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882287.30464: variable 'omit' from source: magic vars 15330 1726882287.30502: variable 'omit' from source: magic vars 15330 1726882287.30522: variable 'omit' from source: magic vars 15330 1726882287.30570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882287.30590: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882287.30619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882287.30698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.30702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.30705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882287.30708: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.30712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.30771: Set connection var ansible_pipelining to False 15330 1726882287.30782: Set connection var ansible_timeout to 10 15330 1726882287.30785: Set connection var ansible_connection to ssh 15330 1726882287.30791: Set connection var ansible_shell_type to sh 15330 1726882287.30796: Set connection var ansible_shell_executable to /bin/sh 15330 1726882287.30798: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882287.30901: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.30905: variable 'ansible_connection' from source: unknown 15330 1726882287.30908: variable 'ansible_module_compression' from source: unknown 15330 1726882287.30910: variable 'ansible_shell_type' from source: unknown 15330 1726882287.30913: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.30915: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.30917: variable 'ansible_pipelining' from source: unknown 15330 1726882287.30920: variable 'ansible_timeout' from source: unknown 15330 1726882287.30922: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 15330 1726882287.30970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882287.30980: variable 'omit' from source: magic vars 15330 1726882287.30989: starting attempt loop 15330 1726882287.30992: running the handler 15330 1726882287.31116: variable '__network_connections_result' from source: set_fact 15330 1726882287.31158: handler run complete 15330 1726882287.31175: attempt loop complete, returning result 15330 1726882287.31178: _execute() done 15330 1726882287.31181: dumping result to json 15330 1726882287.31184: done dumping result, returning 15330 1726882287.31197: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-e4fe-1358-00000000006c] 15330 1726882287.31200: sending task result for task 12673a56-9f93-e4fe-1358-00000000006c 15330 1726882287.31326: done sending task result for task 12673a56-9f93-e4fe-1358-00000000006c 15330 1726882287.31330: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 15330 1726882287.31528: no more pending results, returning what we have 15330 1726882287.31531: results queue empty 15330 1726882287.31532: checking for any_errors_fatal 15330 1726882287.31537: done checking for any_errors_fatal 15330 1726882287.31537: checking for max_fail_percentage 15330 1726882287.31539: done checking for max_fail_percentage 15330 1726882287.31539: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.31540: done checking to see if all hosts have failed 15330 1726882287.31541: getting the remaining hosts for this loop 15330 1726882287.31542: done getting the remaining hosts for this loop 
15330 1726882287.31545: getting the next task for host managed_node3 15330 1726882287.31550: done getting next task for host managed_node3 15330 1726882287.31553: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15330 1726882287.31554: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882287.31562: getting variables 15330 1726882287.31564: in VariableManager get_vars() 15330 1726882287.31610: Calling all_inventory to load vars for managed_node3 15330 1726882287.31614: Calling groups_inventory to load vars for managed_node3 15330 1726882287.31616: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.31649: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.31653: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.31657: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.33142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.34430: done with get_vars() 15330 1726882287.34445: done getting variables 15330 1726882287.34494: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:31:27 -0400 (0:00:00.051) 0:00:36.551 
****** 15330 1726882287.34532: entering _queue_task() for managed_node3/debug 15330 1726882287.34877: worker is 1 (out of 1 available) 15330 1726882287.34892: exiting _queue_task() for managed_node3/debug 15330 1726882287.34907: done queuing things up, now waiting for results queue to drain 15330 1726882287.34908: waiting for pending results... 15330 1726882287.35077: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15330 1726882287.35155: in run() - task 12673a56-9f93-e4fe-1358-00000000006d 15330 1726882287.35201: variable 'ansible_search_path' from source: unknown 15330 1726882287.35206: variable 'ansible_search_path' from source: unknown 15330 1726882287.35233: calling self._execute() 15330 1726882287.35343: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.35351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.35391: variable 'omit' from source: magic vars 15330 1726882287.35901: variable 'ansible_distribution_major_version' from source: facts 15330 1726882287.35908: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882287.35912: variable 'omit' from source: magic vars 15330 1726882287.35915: variable 'omit' from source: magic vars 15330 1726882287.35936: variable 'omit' from source: magic vars 15330 1726882287.35984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882287.36041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882287.36076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882287.36109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.36129: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.36173: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882287.36183: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.36500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.36683: Set connection var ansible_pipelining to False 15330 1726882287.36689: Set connection var ansible_timeout to 10 15330 1726882287.36697: Set connection var ansible_connection to ssh 15330 1726882287.36701: Set connection var ansible_shell_type to sh 15330 1726882287.36703: Set connection var ansible_shell_executable to /bin/sh 15330 1726882287.36705: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882287.36707: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.36709: variable 'ansible_connection' from source: unknown 15330 1726882287.36715: variable 'ansible_module_compression' from source: unknown 15330 1726882287.36717: variable 'ansible_shell_type' from source: unknown 15330 1726882287.36720: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.36722: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.36724: variable 'ansible_pipelining' from source: unknown 15330 1726882287.36726: variable 'ansible_timeout' from source: unknown 15330 1726882287.36728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.36885: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882287.36914: variable 'omit' from source: magic vars 15330 1726882287.36929: starting attempt 
loop 15330 1726882287.36940: running the handler 15330 1726882287.37002: variable '__network_connections_result' from source: set_fact 15330 1726882287.37079: variable '__network_connections_result' from source: set_fact 15330 1726882287.37191: handler run complete 15330 1726882287.37230: attempt loop complete, returning result 15330 1726882287.37244: _execute() done 15330 1726882287.37344: dumping result to json 15330 1726882287.37348: done dumping result, returning 15330 1726882287.37351: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-e4fe-1358-00000000006d] 15330 1726882287.37353: sending task result for task 12673a56-9f93-e4fe-1358-00000000006d ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15330 1726882287.37578: no more pending results, returning what we have 15330 1726882287.37581: results queue empty 15330 1726882287.37582: checking for any_errors_fatal 15330 1726882287.37592: done checking for any_errors_fatal 15330 1726882287.37595: checking for max_fail_percentage 15330 1726882287.37597: done checking for max_fail_percentage 15330 1726882287.37599: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.37600: done checking to see if all hosts have failed 15330 1726882287.37601: getting the remaining hosts for this loop 15330 1726882287.37603: done getting the remaining hosts for this loop 15330 1726882287.37607: getting the next task for host managed_node3 15330 1726882287.37614: done getting next task for host managed_node3 15330 1726882287.37617: ^ task is: 
TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15330 1726882287.37619: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882287.37635: getting variables 15330 1726882287.37637: in VariableManager get_vars() 15330 1726882287.37674: Calling all_inventory to load vars for managed_node3 15330 1726882287.37676: Calling groups_inventory to load vars for managed_node3 15330 1726882287.37681: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.37692: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.37792: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.37915: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.38435: done sending task result for task 12673a56-9f93-e4fe-1358-00000000006d 15330 1726882287.38439: WORKER PROCESS EXITING 15330 1726882287.38877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.39866: done with get_vars() 15330 1726882287.39881: done getting variables 15330 1726882287.39923: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:31:27 -0400 (0:00:00.054) 0:00:36.605 ****** 15330 1726882287.39949: entering 
_queue_task() for managed_node3/debug 15330 1726882287.40166: worker is 1 (out of 1 available) 15330 1726882287.40179: exiting _queue_task() for managed_node3/debug 15330 1726882287.40196: done queuing things up, now waiting for results queue to drain 15330 1726882287.40198: waiting for pending results... 15330 1726882287.40366: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15330 1726882287.40440: in run() - task 12673a56-9f93-e4fe-1358-00000000006e 15330 1726882287.40452: variable 'ansible_search_path' from source: unknown 15330 1726882287.40455: variable 'ansible_search_path' from source: unknown 15330 1726882287.40482: calling self._execute() 15330 1726882287.40561: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.40564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.40573: variable 'omit' from source: magic vars 15330 1726882287.40890: variable 'ansible_distribution_major_version' from source: facts 15330 1726882287.40896: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882287.41010: variable 'network_state' from source: role '' defaults 15330 1726882287.41014: Evaluated conditional (network_state != {}): False 15330 1726882287.41017: when evaluation is False, skipping this task 15330 1726882287.41020: _execute() done 15330 1726882287.41022: dumping result to json 15330 1726882287.41024: done dumping result, returning 15330 1726882287.41027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-e4fe-1358-00000000006e] 15330 1726882287.41029: sending task result for task 12673a56-9f93-e4fe-1358-00000000006e skipping: [managed_node3] => { "false_condition": "network_state != {}" } 15330 1726882287.41156: no more pending results, returning what we have 15330 1726882287.41160: results 
queue empty 15330 1726882287.41161: checking for any_errors_fatal 15330 1726882287.41170: done checking for any_errors_fatal 15330 1726882287.41170: checking for max_fail_percentage 15330 1726882287.41172: done checking for max_fail_percentage 15330 1726882287.41173: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.41174: done checking to see if all hosts have failed 15330 1726882287.41174: getting the remaining hosts for this loop 15330 1726882287.41176: done getting the remaining hosts for this loop 15330 1726882287.41178: getting the next task for host managed_node3 15330 1726882287.41183: done getting next task for host managed_node3 15330 1726882287.41189: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15330 1726882287.41191: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882287.41205: getting variables 15330 1726882287.41207: in VariableManager get_vars() 15330 1726882287.41239: Calling all_inventory to load vars for managed_node3 15330 1726882287.41241: Calling groups_inventory to load vars for managed_node3 15330 1726882287.41243: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.41251: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.41253: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.41256: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.42032: done sending task result for task 12673a56-9f93-e4fe-1358-00000000006e 15330 1726882287.42036: WORKER PROCESS EXITING 15330 1726882287.42512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.44149: done with get_vars() 15330 1726882287.44171: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:31:27 -0400 (0:00:00.043) 0:00:36.648 ****** 15330 1726882287.44268: entering _queue_task() for managed_node3/ping 15330 1726882287.44582: worker is 1 (out of 1 available) 15330 1726882287.44601: exiting _queue_task() for managed_node3/ping 15330 1726882287.44616: done queuing things up, now waiting for results queue to drain 15330 1726882287.44617: waiting for pending results... 
15330 1726882287.44832: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 15330 1726882287.44921: in run() - task 12673a56-9f93-e4fe-1358-00000000006f 15330 1726882287.44930: variable 'ansible_search_path' from source: unknown 15330 1726882287.44934: variable 'ansible_search_path' from source: unknown 15330 1726882287.44964: calling self._execute() 15330 1726882287.45041: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.45045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.45068: variable 'omit' from source: magic vars 15330 1726882287.45482: variable 'ansible_distribution_major_version' from source: facts 15330 1726882287.45495: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882287.45513: variable 'omit' from source: magic vars 15330 1726882287.45549: variable 'omit' from source: magic vars 15330 1726882287.45579: variable 'omit' from source: magic vars 15330 1726882287.45620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882287.45662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882287.45698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882287.45717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.45751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.45763: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882287.45766: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.45769: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 15330 1726882287.45867: Set connection var ansible_pipelining to False 15330 1726882287.45871: Set connection var ansible_timeout to 10 15330 1726882287.45873: Set connection var ansible_connection to ssh 15330 1726882287.45922: Set connection var ansible_shell_type to sh 15330 1726882287.45925: Set connection var ansible_shell_executable to /bin/sh 15330 1726882287.45928: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882287.45930: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.45932: variable 'ansible_connection' from source: unknown 15330 1726882287.45935: variable 'ansible_module_compression' from source: unknown 15330 1726882287.45943: variable 'ansible_shell_type' from source: unknown 15330 1726882287.45947: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.45949: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.45951: variable 'ansible_pipelining' from source: unknown 15330 1726882287.45953: variable 'ansible_timeout' from source: unknown 15330 1726882287.45955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.46165: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882287.46170: variable 'omit' from source: magic vars 15330 1726882287.46172: starting attempt loop 15330 1726882287.46175: running the handler 15330 1726882287.46190: _low_level_execute_command(): starting 15330 1726882287.46199: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882287.46841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 
1726882287.46846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882287.46849: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.46927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882287.46931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882287.46934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.46975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.48642: stdout chunk (state=3): >>>/root <<< 15330 1726882287.48744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882287.48789: stderr chunk (state=3): >>><<< 15330 1726882287.48791: stdout chunk (state=3): >>><<< 15330 1726882287.48818: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882287.48835: _low_level_execute_command(): starting 15330 1726882287.48838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643 `" && echo ansible-tmp-1726882287.4881728-16973-128038414277643="` echo /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643 `" ) && sleep 0' 15330 1726882287.49467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882287.49471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.49474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882287.49489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.49542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882287.49545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.49595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.51436: stdout chunk (state=3): >>>ansible-tmp-1726882287.4881728-16973-128038414277643=/root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643 <<< 15330 1726882287.51566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882287.51617: stderr chunk (state=3): >>><<< 15330 1726882287.51620: stdout chunk (state=3): >>><<< 15330 1726882287.51653: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882287.4881728-16973-128038414277643=/root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882287.51721: variable 'ansible_module_compression' from source: unknown 15330 1726882287.51757: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15330 1726882287.51812: variable 'ansible_facts' from source: unknown 15330 1726882287.51953: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py 15330 1726882287.52183: Sending initial data 15330 1726882287.52186: Sent initial data (153 bytes) 15330 1726882287.53048: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882287.53052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882287.53065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.53113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.54647: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15330 1726882287.54651: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15330 1726882287.54655: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882287.54690: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882287.54746: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpzchvaqjq /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py <<< 15330 1726882287.54750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py" <<< 15330 1726882287.54784: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpzchvaqjq" to remote "/root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py" <<< 15330 1726882287.55583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882287.55617: stderr chunk (state=3): >>><<< 15330 1726882287.55620: stdout chunk (state=3): >>><<< 15330 1726882287.55683: done transferring module to remote 15330 1726882287.55688: _low_level_execute_command(): starting 15330 1726882287.55700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/ /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py && sleep 0' 15330 1726882287.56238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882287.56290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.56368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882287.56371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.56447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.58129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882287.58159: stderr chunk (state=3): >>><<< 15330 1726882287.58162: stdout chunk (state=3): >>><<< 15330 1726882287.58176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882287.58178: _low_level_execute_command(): starting 15330 1726882287.58183: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/AnsiballZ_ping.py && sleep 0' 15330 1726882287.58694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882287.58730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882287.58732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882287.58735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.58737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882287.58739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882287.58740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.58788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: 
fd 3 setting O_NONBLOCK <<< 15330 1726882287.58791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.58852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.73457: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15330 1726882287.74698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882287.74725: stderr chunk (state=3): >>><<< 15330 1726882287.74728: stdout chunk (state=3): >>><<< 15330 1726882287.74742: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
15330 1726882287.74766: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882287.74772: _low_level_execute_command(): starting 15330 1726882287.74777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882287.4881728-16973-128038414277643/ > /dev/null 2>&1 && sleep 0' 15330 1726882287.75244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882287.75247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.75250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882287.75252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882287.75254: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882287.75309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882287.75316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882287.75361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882287.77167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882287.77199: stderr chunk (state=3): >>><<< 15330 1726882287.77202: stdout chunk (state=3): >>><<< 15330 1726882287.77214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15330 1726882287.77222: handler run complete 15330 1726882287.77234: attempt loop complete, returning result 15330 1726882287.77237: _execute() done 15330 1726882287.77240: dumping result to json 15330 1726882287.77244: done dumping result, returning 15330 1726882287.77253: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-e4fe-1358-00000000006f] 15330 1726882287.77256: sending task result for task 12673a56-9f93-e4fe-1358-00000000006f 15330 1726882287.77343: done sending task result for task 12673a56-9f93-e4fe-1358-00000000006f 15330 1726882287.77346: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 15330 1726882287.77415: no more pending results, returning what we have 15330 1726882287.77419: results queue empty 15330 1726882287.77420: checking for any_errors_fatal 15330 1726882287.77428: done checking for any_errors_fatal 15330 1726882287.77429: checking for max_fail_percentage 15330 1726882287.77430: done checking for max_fail_percentage 15330 1726882287.77431: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.77432: done checking to see if all hosts have failed 15330 1726882287.77433: getting the remaining hosts for this loop 15330 1726882287.77434: done getting the remaining hosts for this loop 15330 1726882287.77438: getting the next task for host managed_node3 15330 1726882287.77445: done getting next task for host managed_node3 15330 1726882287.77447: ^ task is: TASK: meta (role_complete) 15330 1726882287.77449: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882287.77465: getting variables 15330 1726882287.77467: in VariableManager get_vars() 15330 1726882287.77514: Calling all_inventory to load vars for managed_node3 15330 1726882287.77517: Calling groups_inventory to load vars for managed_node3 15330 1726882287.77519: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.77528: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.77531: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.77533: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.78369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.79259: done with get_vars() 15330 1726882287.79277: done getting variables 15330 1726882287.79339: done queuing things up, now waiting for results queue to drain 15330 1726882287.79342: results queue empty 15330 1726882287.79342: checking for any_errors_fatal 15330 1726882287.79344: done checking for any_errors_fatal 15330 1726882287.79344: checking for max_fail_percentage 15330 1726882287.79345: done checking for max_fail_percentage 15330 1726882287.79346: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.79346: done checking to see if all hosts have failed 15330 1726882287.79346: getting the remaining hosts for this loop 15330 1726882287.79347: done getting the remaining hosts for this loop 15330 1726882287.79349: getting the next task for host managed_node3 15330 1726882287.79351: done getting next task for host managed_node3 15330 1726882287.79352: ^ task is: TASK: meta (flush_handlers) 15330 1726882287.79353: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15330 1726882287.79355: getting variables 15330 1726882287.79356: in VariableManager get_vars() 15330 1726882287.79364: Calling all_inventory to load vars for managed_node3 15330 1726882287.79366: Calling groups_inventory to load vars for managed_node3 15330 1726882287.79367: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.79370: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.79372: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.79373: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.80107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.81139: done with get_vars() 15330 1726882287.81154: done getting variables 15330 1726882287.81191: in VariableManager get_vars() 15330 1726882287.81202: Calling all_inventory to load vars for managed_node3 15330 1726882287.81204: Calling groups_inventory to load vars for managed_node3 15330 1726882287.81205: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.81209: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.81210: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.81212: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.82027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.83753: done with get_vars() 15330 1726882287.83777: done queuing things up, now waiting for results queue to drain 15330 1726882287.83779: results queue empty 15330 1726882287.83780: checking for any_errors_fatal 15330 1726882287.83782: done checking for any_errors_fatal 15330 1726882287.83782: checking for max_fail_percentage 15330 1726882287.83783: done checking for max_fail_percentage 15330 1726882287.83784: checking to see if all hosts have failed and 
the running result is not ok 15330 1726882287.83784: done checking to see if all hosts have failed 15330 1726882287.83788: getting the remaining hosts for this loop 15330 1726882287.83789: done getting the remaining hosts for this loop 15330 1726882287.83792: getting the next task for host managed_node3 15330 1726882287.83841: done getting next task for host managed_node3 15330 1726882287.83844: ^ task is: TASK: meta (flush_handlers) 15330 1726882287.83845: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882287.83848: getting variables 15330 1726882287.83849: in VariableManager get_vars() 15330 1726882287.83861: Calling all_inventory to load vars for managed_node3 15330 1726882287.83864: Calling groups_inventory to load vars for managed_node3 15330 1726882287.83866: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.83872: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.83875: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.83878: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.85037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.87241: done with get_vars() 15330 1726882287.87268: done getting variables 15330 1726882287.87436: in VariableManager get_vars() 15330 1726882287.87451: Calling all_inventory to load vars for managed_node3 15330 1726882287.87453: Calling groups_inventory to load vars for managed_node3 15330 1726882287.87455: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.87460: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.87463: Calling 
groups_plugins_inventory to load vars for managed_node3 15330 1726882287.87465: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.89937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.92970: done with get_vars() 15330 1726882287.93209: done queuing things up, now waiting for results queue to drain 15330 1726882287.93212: results queue empty 15330 1726882287.93213: checking for any_errors_fatal 15330 1726882287.93214: done checking for any_errors_fatal 15330 1726882287.93215: checking for max_fail_percentage 15330 1726882287.93216: done checking for max_fail_percentage 15330 1726882287.93217: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.93218: done checking to see if all hosts have failed 15330 1726882287.93218: getting the remaining hosts for this loop 15330 1726882287.93219: done getting the remaining hosts for this loop 15330 1726882287.93223: getting the next task for host managed_node3 15330 1726882287.93226: done getting next task for host managed_node3 15330 1726882287.93227: ^ task is: None 15330 1726882287.93228: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882287.93229: done queuing things up, now waiting for results queue to drain 15330 1726882287.93230: results queue empty 15330 1726882287.93230: checking for any_errors_fatal 15330 1726882287.93231: done checking for any_errors_fatal 15330 1726882287.93232: checking for max_fail_percentage 15330 1726882287.93232: done checking for max_fail_percentage 15330 1726882287.93233: checking to see if all hosts have failed and the running result is not ok 15330 1726882287.93234: done checking to see if all hosts have failed 15330 1726882287.93235: getting the next task for host managed_node3 15330 1726882287.93237: done getting next task for host managed_node3 15330 1726882287.93237: ^ task is: None 15330 1726882287.93238: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882287.93281: in VariableManager get_vars() 15330 1726882287.93304: done with get_vars() 15330 1726882287.93311: in VariableManager get_vars() 15330 1726882287.93320: done with get_vars() 15330 1726882287.93324: variable 'omit' from source: magic vars 15330 1726882287.93542: variable 'task' from source: play vars 15330 1726882287.93572: in VariableManager get_vars() 15330 1726882287.93584: done with get_vars() 15330 1726882287.93711: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15330 1726882287.94142: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882287.94217: getting the remaining hosts for this loop 15330 1726882287.94219: done getting the remaining hosts for this loop 15330 1726882287.94221: getting the next task for host managed_node3 15330 1726882287.94223: done getting next task for host managed_node3 15330 1726882287.94225: ^ task is: TASK: Gathering Facts 15330 1726882287.94227: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882287.94229: getting variables 15330 1726882287.94229: in VariableManager get_vars() 15330 1726882287.94237: Calling all_inventory to load vars for managed_node3 15330 1726882287.94239: Calling groups_inventory to load vars for managed_node3 15330 1726882287.94241: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882287.94246: Calling all_plugins_play to load vars for managed_node3 15330 1726882287.94248: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882287.94251: Calling groups_plugins_play to load vars for managed_node3 15330 1726882287.95913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882287.97520: done with get_vars() 15330 1726882287.97549: done getting variables 15330 1726882287.97599: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:31:27 -0400 (0:00:00.533) 0:00:37.182 ****** 15330 1726882287.97627: entering _queue_task() for managed_node3/gather_facts 15330 1726882287.97975: worker is 1 (out of 1 available) 15330 1726882287.97990: exiting _queue_task() for managed_node3/gather_facts 15330 1726882287.98109: done queuing things up, now waiting for results queue to drain 15330 1726882287.98111: waiting for pending results... 
15330 1726882287.98305: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882287.98419: in run() - task 12673a56-9f93-e4fe-1358-00000000046e 15330 1726882287.98445: variable 'ansible_search_path' from source: unknown 15330 1726882287.98500: calling self._execute() 15330 1726882287.98599: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.98700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.98704: variable 'omit' from source: magic vars 15330 1726882287.99029: variable 'ansible_distribution_major_version' from source: facts 15330 1726882287.99049: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882287.99062: variable 'omit' from source: magic vars 15330 1726882287.99099: variable 'omit' from source: magic vars 15330 1726882287.99144: variable 'omit' from source: magic vars 15330 1726882287.99198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882287.99233: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882287.99313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882287.99319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.99322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882287.99335: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882287.99338: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.99342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.99445: Set connection var ansible_pipelining to False 15330 1726882287.99454: Set 
connection var ansible_timeout to 10 15330 1726882287.99457: Set connection var ansible_connection to ssh 15330 1726882287.99459: Set connection var ansible_shell_type to sh 15330 1726882287.99465: Set connection var ansible_shell_executable to /bin/sh 15330 1726882287.99470: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882287.99489: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.99492: variable 'ansible_connection' from source: unknown 15330 1726882287.99495: variable 'ansible_module_compression' from source: unknown 15330 1726882287.99498: variable 'ansible_shell_type' from source: unknown 15330 1726882287.99505: variable 'ansible_shell_executable' from source: unknown 15330 1726882287.99508: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882287.99513: variable 'ansible_pipelining' from source: unknown 15330 1726882287.99515: variable 'ansible_timeout' from source: unknown 15330 1726882287.99520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882287.99665: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882287.99673: variable 'omit' from source: magic vars 15330 1726882287.99677: starting attempt loop 15330 1726882287.99680: running the handler 15330 1726882287.99695: variable 'ansible_facts' from source: unknown 15330 1726882287.99712: _low_level_execute_command(): starting 15330 1726882287.99723: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882288.00219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15330 1726882288.00222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.00225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882288.00227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.00268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882288.00284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882288.00349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882288.02080: stdout chunk (state=3): >>>/root <<< 15330 1726882288.02277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882288.02281: stdout chunk (state=3): >>><<< 15330 1726882288.02284: stderr chunk (state=3): >>><<< 15330 1726882288.02288: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882288.02291: _low_level_execute_command(): starting 15330 1726882288.02296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046 `" && echo ansible-tmp-1726882288.0219622-16993-59506483071046="` echo /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046 `" ) && sleep 0' 15330 1726882288.02898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882288.02901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882288.02904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882288.02907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882288.02910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882288.02958: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882288.02965: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.02999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882288.03015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882288.03085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882288.04964: stdout chunk (state=3): >>>ansible-tmp-1726882288.0219622-16993-59506483071046=/root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046 <<< 15330 1726882288.05088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882288.05092: stdout chunk (state=3): >>><<< 15330 1726882288.05096: stderr chunk (state=3): >>><<< 15330 1726882288.05303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882288.0219622-16993-59506483071046=/root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882288.05306: variable 'ansible_module_compression' from source: unknown 15330 1726882288.05309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882288.05310: variable 'ansible_facts' from source: unknown 15330 1726882288.05847: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py 15330 1726882288.06216: Sending initial data 15330 1726882288.06219: Sent initial data (153 bytes) 15330 1726882288.07036: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.07112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882288.07148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882288.07361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882288.08827: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15330 1726882288.08840: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15330 1726882288.08858: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882288.08924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882288.08973: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpjo5dsmuj /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py <<< 15330 1726882288.08984: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py" <<< 15330 1726882288.09018: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpjo5dsmuj" to remote "/root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py" <<< 15330 1726882288.10823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882288.10826: stdout chunk (state=3): >>><<< 15330 1726882288.10829: stderr chunk (state=3): >>><<< 15330 1726882288.10974: done transferring module to remote 15330 1726882288.10977: _low_level_execute_command(): starting 15330 1726882288.10980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/ /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py && sleep 0' 15330 1726882288.11821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882288.11834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882288.11846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.11926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882288.11929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882288.11973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882288.13699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882288.13718: stdout chunk (state=3): >>><<< 15330 1726882288.13721: stderr chunk (state=3): >>><<< 15330 1726882288.13809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882288.13813: _low_level_execute_command(): starting 15330 1726882288.13816: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/AnsiballZ_setup.py && sleep 0' 15330 1726882288.14284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882288.14376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882288.14380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882288.14382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.14384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882288.14386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882288.14388: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.14482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882288.14531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882288.78398: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.935546875, "5m": 0.50830078125, "15m": 0.2392578125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "28", "epoch": "1726882288", "epoch_int": "1726882288", "date": "2024-09-20", "time": "21:31:28", "iso8601_micro": "2024-09-21T01:31:28.407829Z", "iso8601": "2024-09-21T01:31:28Z", "iso8601_basic": "20240920T213128407829", "iso8601_basic_short": "20240920T213128", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "exe<<< 15330 1726882288.78460: stdout chunk (state=3): >>>cutable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": 
"off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 595, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803098112, "block_size": 4096, "block_total": 65519099, "block_available": 63916772, "block_used": 1602327, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882288.80511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882288.80544: stderr chunk (state=3): >>><<< 15330 1726882288.80563: stdout chunk (state=3): >>><<< 15330 1726882288.80609: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.935546875, "5m": 0.50830078125, "15m": 0.2392578125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", 
"minute": "31", "second": "28", "epoch": "1726882288", "epoch_int": "1726882288", "date": "2024-09-20", "time": "21:31:28", "iso8601_micro": "2024-09-21T01:31:28.407829Z", "iso8601": "2024-09-21T01:31:28Z", "iso8601_basic": "20240920T213128407829", "iso8601_basic_short": "20240920T213128", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 595, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803098112, "block_size": 4096, "block_total": 65519099, "block_available": 63916772, "block_used": 1602327, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882288.80991: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882288.81024: _low_level_execute_command(): starting 15330 1726882288.81034: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882288.0219622-16993-59506483071046/ > /dev/null 2>&1 && sleep 0' 15330 1726882288.81717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882288.81730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882288.81751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882288.81777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 
1726882288.81812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.81828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882288.81862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882288.81942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882288.82003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882288.82057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882288.83918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882288.83921: stdout chunk (state=3): >>><<< 15330 1726882288.83923: stderr chunk (state=3): >>><<< 15330 1726882288.84099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882288.84102: handler run complete 15330 1726882288.84105: variable 'ansible_facts' from source: unknown 15330 1726882288.84361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882288.84691: variable 'ansible_facts' from source: unknown 15330 1726882288.84790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882288.84929: attempt loop complete, returning result 15330 1726882288.84937: _execute() done 15330 1726882288.84943: dumping result to json 15330 1726882288.84978: done dumping result, returning 15330 1726882288.85097: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-00000000046e] 15330 1726882288.85100: sending task result for task 12673a56-9f93-e4fe-1358-00000000046e ok: [managed_node3] 15330 1726882288.85975: no more pending results, returning what we have 15330 1726882288.85978: results queue empty 15330 1726882288.85979: checking for any_errors_fatal 15330 1726882288.85980: done checking for any_errors_fatal 15330 1726882288.85981: checking for max_fail_percentage 15330 1726882288.85982: done checking for max_fail_percentage 15330 1726882288.85983: checking to see if all 
hosts have failed and the running result is not ok 15330 1726882288.85983: done checking to see if all hosts have failed 15330 1726882288.85984: getting the remaining hosts for this loop 15330 1726882288.85985: done getting the remaining hosts for this loop 15330 1726882288.85988: getting the next task for host managed_node3 15330 1726882288.85995: done getting next task for host managed_node3 15330 1726882288.85996: ^ task is: TASK: meta (flush_handlers) 15330 1726882288.85998: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882288.86002: getting variables 15330 1726882288.86003: in VariableManager get_vars() 15330 1726882288.86025: Calling all_inventory to load vars for managed_node3 15330 1726882288.86027: Calling groups_inventory to load vars for managed_node3 15330 1726882288.86030: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882288.86036: done sending task result for task 12673a56-9f93-e4fe-1358-00000000046e 15330 1726882288.86046: WORKER PROCESS EXITING 15330 1726882288.86055: Calling all_plugins_play to load vars for managed_node3 15330 1726882288.86058: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882288.86061: Calling groups_plugins_play to load vars for managed_node3 15330 1726882288.86962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882288.87828: done with get_vars() 15330 1726882288.87843: done getting variables 15330 1726882288.87895: in VariableManager get_vars() 15330 1726882288.87903: Calling all_inventory to load vars for managed_node3 15330 1726882288.87904: Calling groups_inventory to load vars for managed_node3 15330 1726882288.87906: Calling 
all_plugins_inventory to load vars for managed_node3 15330 1726882288.87909: Calling all_plugins_play to load vars for managed_node3 15330 1726882288.87910: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882288.87912: Calling groups_plugins_play to load vars for managed_node3 15330 1726882288.88897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882288.90373: done with get_vars() 15330 1726882288.90395: done queuing things up, now waiting for results queue to drain 15330 1726882288.90397: results queue empty 15330 1726882288.90397: checking for any_errors_fatal 15330 1726882288.90400: done checking for any_errors_fatal 15330 1726882288.90404: checking for max_fail_percentage 15330 1726882288.90405: done checking for max_fail_percentage 15330 1726882288.90405: checking to see if all hosts have failed and the running result is not ok 15330 1726882288.90406: done checking to see if all hosts have failed 15330 1726882288.90406: getting the remaining hosts for this loop 15330 1726882288.90407: done getting the remaining hosts for this loop 15330 1726882288.90409: getting the next task for host managed_node3 15330 1726882288.90411: done getting next task for host managed_node3 15330 1726882288.90414: ^ task is: TASK: Include the task '{{ task }}' 15330 1726882288.90415: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882288.90416: getting variables 15330 1726882288.90417: in VariableManager get_vars() 15330 1726882288.90423: Calling all_inventory to load vars for managed_node3 15330 1726882288.90425: Calling groups_inventory to load vars for managed_node3 15330 1726882288.90431: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882288.90439: Calling all_plugins_play to load vars for managed_node3 15330 1726882288.90442: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882288.90445: Calling groups_plugins_play to load vars for managed_node3 15330 1726882288.91100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882288.92811: done with get_vars() 15330 1726882288.92832: done getting variables 15330 1726882288.92982: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:31:28 -0400 (0:00:00.953) 0:00:38.136 ****** 15330 1726882288.93014: entering _queue_task() for managed_node3/include_tasks 15330 1726882288.93294: worker is 1 (out of 1 available) 15330 1726882288.93306: exiting _queue_task() for managed_node3/include_tasks 15330 1726882288.93318: done queuing things up, now waiting for results queue to drain 15330 1726882288.93319: waiting for pending results... 
15330 1726882288.93717: running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_profile_absent.yml' 15330 1726882288.93722: in run() - task 12673a56-9f93-e4fe-1358-000000000073 15330 1726882288.93726: variable 'ansible_search_path' from source: unknown 15330 1726882288.93756: calling self._execute() 15330 1726882288.93826: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882288.93831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882288.93845: variable 'omit' from source: magic vars 15330 1726882288.94108: variable 'ansible_distribution_major_version' from source: facts 15330 1726882288.94117: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882288.94124: variable 'task' from source: play vars 15330 1726882288.94175: variable 'task' from source: play vars 15330 1726882288.94182: _execute() done 15330 1726882288.94185: dumping result to json 15330 1726882288.94191: done dumping result, returning 15330 1726882288.94201: done running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_profile_absent.yml' [12673a56-9f93-e4fe-1358-000000000073] 15330 1726882288.94204: sending task result for task 12673a56-9f93-e4fe-1358-000000000073 15330 1726882288.94297: done sending task result for task 12673a56-9f93-e4fe-1358-000000000073 15330 1726882288.94300: WORKER PROCESS EXITING 15330 1726882288.94325: no more pending results, returning what we have 15330 1726882288.94330: in VariableManager get_vars() 15330 1726882288.94361: Calling all_inventory to load vars for managed_node3 15330 1726882288.94363: Calling groups_inventory to load vars for managed_node3 15330 1726882288.94366: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882288.94380: Calling all_plugins_play to load vars for managed_node3 15330 1726882288.94383: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882288.94385: Calling 
groups_plugins_play to load vars for managed_node3 15330 1726882288.96901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882288.98710: done with get_vars() 15330 1726882288.98738: variable 'ansible_search_path' from source: unknown 15330 1726882288.98764: we have included files to process 15330 1726882288.98765: generating all_blocks data 15330 1726882288.98767: done generating all_blocks data 15330 1726882288.98768: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15330 1726882288.98770: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15330 1726882288.98773: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15330 1726882288.98959: in VariableManager get_vars() 15330 1726882288.98985: done with get_vars() 15330 1726882288.99105: done processing included file 15330 1726882288.99107: iterating over new_blocks loaded from include file 15330 1726882288.99109: in VariableManager get_vars() 15330 1726882288.99120: done with get_vars() 15330 1726882288.99122: filtering new block on tags 15330 1726882288.99139: done filtering new block on tags 15330 1726882288.99142: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 15330 1726882288.99147: extending task lists for all hosts with included blocks 15330 1726882288.99175: done extending task lists 15330 1726882288.99177: done processing included files 15330 1726882288.99178: results queue empty 15330 1726882288.99178: checking for any_errors_fatal 15330 1726882288.99180: done checking for any_errors_fatal 15330 
1726882288.99180: checking for max_fail_percentage 15330 1726882288.99181: done checking for max_fail_percentage 15330 1726882288.99182: checking to see if all hosts have failed and the running result is not ok 15330 1726882288.99183: done checking to see if all hosts have failed 15330 1726882288.99183: getting the remaining hosts for this loop 15330 1726882288.99184: done getting the remaining hosts for this loop 15330 1726882288.99187: getting the next task for host managed_node3 15330 1726882288.99190: done getting next task for host managed_node3 15330 1726882288.99192: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15330 1726882288.99202: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882288.99205: getting variables 15330 1726882288.99206: in VariableManager get_vars() 15330 1726882288.99215: Calling all_inventory to load vars for managed_node3 15330 1726882288.99217: Calling groups_inventory to load vars for managed_node3 15330 1726882288.99220: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882288.99225: Calling all_plugins_play to load vars for managed_node3 15330 1726882288.99228: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882288.99230: Calling groups_plugins_play to load vars for managed_node3 15330 1726882289.01325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882289.02869: done with get_vars() 15330 1726882289.02889: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:31:29 -0400 (0:00:00.099) 0:00:38.235 ****** 15330 1726882289.02947: entering _queue_task() for managed_node3/include_tasks 15330 1726882289.03205: worker is 1 (out of 1 available) 15330 1726882289.03219: exiting _queue_task() for managed_node3/include_tasks 15330 1726882289.03230: done queuing things up, now waiting for results queue to drain 15330 1726882289.03232: waiting for pending results... 
15330 1726882289.03410: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 15330 1726882289.03479: in run() - task 12673a56-9f93-e4fe-1358-00000000047f 15330 1726882289.03492: variable 'ansible_search_path' from source: unknown 15330 1726882289.03497: variable 'ansible_search_path' from source: unknown 15330 1726882289.03523: calling self._execute() 15330 1726882289.03595: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.03603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.03611: variable 'omit' from source: magic vars 15330 1726882289.03977: variable 'ansible_distribution_major_version' from source: facts 15330 1726882289.03987: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882289.04081: _execute() done 15330 1726882289.04085: dumping result to json 15330 1726882289.04087: done dumping result, returning 15330 1726882289.04089: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-e4fe-1358-00000000047f] 15330 1726882289.04091: sending task result for task 12673a56-9f93-e4fe-1358-00000000047f 15330 1726882289.04289: done sending task result for task 12673a56-9f93-e4fe-1358-00000000047f 15330 1726882289.04292: WORKER PROCESS EXITING 15330 1726882289.04325: no more pending results, returning what we have 15330 1726882289.04364: in VariableManager get_vars() 15330 1726882289.04403: Calling all_inventory to load vars for managed_node3 15330 1726882289.04406: Calling groups_inventory to load vars for managed_node3 15330 1726882289.04412: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882289.04429: Calling all_plugins_play to load vars for managed_node3 15330 1726882289.04436: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882289.04442: Calling groups_plugins_play to load vars for managed_node3 15330 
1726882289.05531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882289.07051: done with get_vars() 15330 1726882289.07066: variable 'ansible_search_path' from source: unknown 15330 1726882289.07067: variable 'ansible_search_path' from source: unknown 15330 1726882289.07073: variable 'task' from source: play vars 15330 1726882289.07164: variable 'task' from source: play vars 15330 1726882289.07190: we have included files to process 15330 1726882289.07191: generating all_blocks data 15330 1726882289.07192: done generating all_blocks data 15330 1726882289.07194: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15330 1726882289.07197: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15330 1726882289.07199: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15330 1726882289.08140: done processing included file 15330 1726882289.08142: iterating over new_blocks loaded from include file 15330 1726882289.08143: in VariableManager get_vars() 15330 1726882289.08164: done with get_vars() 15330 1726882289.08168: filtering new block on tags 15330 1726882289.08199: done filtering new block on tags 15330 1726882289.08203: in VariableManager get_vars() 15330 1726882289.08213: done with get_vars() 15330 1726882289.08214: filtering new block on tags 15330 1726882289.08227: done filtering new block on tags 15330 1726882289.08228: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 15330 1726882289.08232: extending task lists for all hosts with included blocks 15330 1726882289.08289: done extending 
task lists 15330 1726882289.08290: done processing included files 15330 1726882289.08291: results queue empty 15330 1726882289.08291: checking for any_errors_fatal 15330 1726882289.08295: done checking for any_errors_fatal 15330 1726882289.08296: checking for max_fail_percentage 15330 1726882289.08297: done checking for max_fail_percentage 15330 1726882289.08298: checking to see if all hosts have failed and the running result is not ok 15330 1726882289.08299: done checking to see if all hosts have failed 15330 1726882289.08299: getting the remaining hosts for this loop 15330 1726882289.08300: done getting the remaining hosts for this loop 15330 1726882289.08302: getting the next task for host managed_node3 15330 1726882289.08305: done getting next task for host managed_node3 15330 1726882289.08306: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15330 1726882289.08308: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882289.08310: getting variables 15330 1726882289.08310: in VariableManager get_vars() 15330 1726882289.08316: Calling all_inventory to load vars for managed_node3 15330 1726882289.08318: Calling groups_inventory to load vars for managed_node3 15330 1726882289.08319: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882289.08323: Calling all_plugins_play to load vars for managed_node3 15330 1726882289.08325: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882289.08326: Calling groups_plugins_play to load vars for managed_node3 15330 1726882289.09210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882289.13943: done with get_vars() 15330 1726882289.13959: done getting variables 15330 1726882289.13995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:31:29 -0400 (0:00:00.110) 0:00:38.346 ****** 15330 1726882289.14029: entering _queue_task() for managed_node3/set_fact 15330 1726882289.14379: worker is 1 (out of 1 available) 15330 1726882289.14397: exiting _queue_task() for managed_node3/set_fact 15330 1726882289.14410: done queuing things up, now waiting for results queue to drain 15330 1726882289.14411: waiting for pending results... 
15330 1726882289.14626: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 15330 1726882289.14704: in run() - task 12673a56-9f93-e4fe-1358-00000000048a 15330 1726882289.14714: variable 'ansible_search_path' from source: unknown 15330 1726882289.14718: variable 'ansible_search_path' from source: unknown 15330 1726882289.14747: calling self._execute() 15330 1726882289.14818: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.14822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.14831: variable 'omit' from source: magic vars 15330 1726882289.15133: variable 'ansible_distribution_major_version' from source: facts 15330 1726882289.15136: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882289.15140: variable 'omit' from source: magic vars 15330 1726882289.15163: variable 'omit' from source: magic vars 15330 1726882289.15190: variable 'omit' from source: magic vars 15330 1726882289.15225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882289.15250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882289.15267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882289.15280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882289.15299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882289.15320: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882289.15323: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.15326: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 15330 1726882289.15389: Set connection var ansible_pipelining to False 15330 1726882289.15404: Set connection var ansible_timeout to 10 15330 1726882289.15408: Set connection var ansible_connection to ssh 15330 1726882289.15410: Set connection var ansible_shell_type to sh 15330 1726882289.15413: Set connection var ansible_shell_executable to /bin/sh 15330 1726882289.15420: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882289.15436: variable 'ansible_shell_executable' from source: unknown 15330 1726882289.15439: variable 'ansible_connection' from source: unknown 15330 1726882289.15441: variable 'ansible_module_compression' from source: unknown 15330 1726882289.15444: variable 'ansible_shell_type' from source: unknown 15330 1726882289.15446: variable 'ansible_shell_executable' from source: unknown 15330 1726882289.15448: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.15452: variable 'ansible_pipelining' from source: unknown 15330 1726882289.15456: variable 'ansible_timeout' from source: unknown 15330 1726882289.15458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.15561: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882289.15569: variable 'omit' from source: magic vars 15330 1726882289.15574: starting attempt loop 15330 1726882289.15577: running the handler 15330 1726882289.15586: handler run complete 15330 1726882289.15599: attempt loop complete, returning result 15330 1726882289.15601: _execute() done 15330 1726882289.15604: dumping result to json 15330 1726882289.15606: done dumping result, returning 15330 1726882289.15614: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-e4fe-1358-00000000048a] 15330 1726882289.15620: sending task result for task 12673a56-9f93-e4fe-1358-00000000048a 15330 1726882289.15781: done sending task result for task 12673a56-9f93-e4fe-1358-00000000048a 15330 1726882289.15784: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15330 1726882289.15859: no more pending results, returning what we have 15330 1726882289.15862: results queue empty 15330 1726882289.15863: checking for any_errors_fatal 15330 1726882289.15865: done checking for any_errors_fatal 15330 1726882289.15866: checking for max_fail_percentage 15330 1726882289.15867: done checking for max_fail_percentage 15330 1726882289.15868: checking to see if all hosts have failed and the running result is not ok 15330 1726882289.15869: done checking to see if all hosts have failed 15330 1726882289.15870: getting the remaining hosts for this loop 15330 1726882289.15871: done getting the remaining hosts for this loop 15330 1726882289.15874: getting the next task for host managed_node3 15330 1726882289.15880: done getting next task for host managed_node3 15330 1726882289.15882: ^ task is: TASK: Stat profile file 15330 1726882289.15886: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882289.15891: getting variables 15330 1726882289.15896: in VariableManager get_vars() 15330 1726882289.15927: Calling all_inventory to load vars for managed_node3 15330 1726882289.15930: Calling groups_inventory to load vars for managed_node3 15330 1726882289.15933: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882289.15945: Calling all_plugins_play to load vars for managed_node3 15330 1726882289.15949: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882289.15952: Calling groups_plugins_play to load vars for managed_node3 15330 1726882289.16822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882289.17856: done with get_vars() 15330 1726882289.17872: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:31:29 -0400 (0:00:00.039) 0:00:38.385 ****** 15330 1726882289.17950: entering _queue_task() for managed_node3/stat 15330 1726882289.18185: worker is 1 (out of 1 available) 15330 1726882289.18205: exiting _queue_task() for managed_node3/stat 15330 1726882289.18219: done queuing things up, now waiting for results queue to drain 15330 1726882289.18220: waiting for pending results... 
15330 1726882289.18441: running TaskExecutor() for managed_node3/TASK: Stat profile file 15330 1726882289.18580: in run() - task 12673a56-9f93-e4fe-1358-00000000048b 15330 1726882289.18584: variable 'ansible_search_path' from source: unknown 15330 1726882289.18614: variable 'ansible_search_path' from source: unknown 15330 1726882289.18630: calling self._execute() 15330 1726882289.18716: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.18720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.18731: variable 'omit' from source: magic vars 15330 1726882289.19087: variable 'ansible_distribution_major_version' from source: facts 15330 1726882289.19125: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882289.19129: variable 'omit' from source: magic vars 15330 1726882289.19241: variable 'omit' from source: magic vars 15330 1726882289.19262: variable 'profile' from source: play vars 15330 1726882289.19269: variable 'interface' from source: set_fact 15330 1726882289.19337: variable 'interface' from source: set_fact 15330 1726882289.19351: variable 'omit' from source: magic vars 15330 1726882289.19403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882289.19432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882289.19448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882289.19463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882289.19484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882289.19575: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 
1726882289.19578: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.19580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.19648: Set connection var ansible_pipelining to False 15330 1726882289.19682: Set connection var ansible_timeout to 10 15330 1726882289.19688: Set connection var ansible_connection to ssh 15330 1726882289.19691: Set connection var ansible_shell_type to sh 15330 1726882289.19695: Set connection var ansible_shell_executable to /bin/sh 15330 1726882289.19698: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882289.19705: variable 'ansible_shell_executable' from source: unknown 15330 1726882289.19708: variable 'ansible_connection' from source: unknown 15330 1726882289.19710: variable 'ansible_module_compression' from source: unknown 15330 1726882289.19713: variable 'ansible_shell_type' from source: unknown 15330 1726882289.19716: variable 'ansible_shell_executable' from source: unknown 15330 1726882289.19719: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.19723: variable 'ansible_pipelining' from source: unknown 15330 1726882289.19726: variable 'ansible_timeout' from source: unknown 15330 1726882289.19728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.19887: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882289.19895: variable 'omit' from source: magic vars 15330 1726882289.19901: starting attempt loop 15330 1726882289.19904: running the handler 15330 1726882289.19915: _low_level_execute_command(): starting 15330 1726882289.19922: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882289.20572: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.20577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882289.20581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.20639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.20698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.22369: stdout chunk (state=3): >>>/root <<< 15330 1726882289.22467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.22508: stderr chunk (state=3): >>><<< 15330 1726882289.22511: stdout chunk (state=3): >>><<< 15330 1726882289.22536: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882289.22564: _low_level_execute_command(): starting 15330 1726882289.22568: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761 `" && echo ansible-tmp-1726882289.2253466-17043-272536939096761="` echo /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761 `" ) && sleep 0' 15330 1726882289.23043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.23047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882289.23049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.23058: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.23061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.23118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.23171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.25026: stdout chunk (state=3): >>>ansible-tmp-1726882289.2253466-17043-272536939096761=/root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761 <<< 15330 1726882289.25300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.25308: stdout chunk (state=3): >>><<< 15330 1726882289.25315: stderr chunk (state=3): >>><<< 15330 1726882289.25319: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882289.2253466-17043-272536939096761=/root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882289.25337: variable 'ansible_module_compression' from source: unknown 15330 1726882289.25402: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15330 1726882289.25472: variable 'ansible_facts' from source: unknown 15330 1726882289.25521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py 15330 1726882289.25855: Sending initial data 15330 1726882289.25863: Sent initial data (153 bytes) 15330 1726882289.26591: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.26609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.26621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.26633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882289.26645: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.26702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.26715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.26781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.28275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882289.28317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882289.28360: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdg9xmhws /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py <<< 15330 1726882289.28364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py" <<< 15330 1726882289.28402: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpdg9xmhws" to remote "/root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py" <<< 15330 1726882289.29030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.29100: stderr chunk (state=3): >>><<< 15330 1726882289.29103: stdout chunk (state=3): >>><<< 15330 1726882289.29106: done transferring module to remote 15330 1726882289.29108: _low_level_execute_command(): starting 15330 1726882289.29110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/ /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py && sleep 0' 15330 1726882289.29588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.29621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882289.29628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.29676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882289.29680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.29697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.29738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.31828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.31831: stdout chunk (state=3): >>><<< 15330 1726882289.31833: stderr chunk (state=3): >>><<< 15330 1726882289.31836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882289.31838: _low_level_execute_command(): starting 15330 1726882289.31841: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/AnsiballZ_stat.py && sleep 0' 15330 1726882289.32898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.32901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.32962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.33010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882289.33023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.33058: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15330 1726882289.33169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.48231: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15330 1726882289.49509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882289.49513: stdout chunk (state=3): >>><<< 15330 1726882289.49515: stderr chunk (state=3): >>><<< 15330 1726882289.49519: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882289.49522: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882289.49524: _low_level_execute_command(): starting 15330 1726882289.49527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882289.2253466-17043-272536939096761/ > /dev/null 2>&1 && sleep 0' 15330 1726882289.50664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.50668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.50670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.50672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882289.50674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882289.50768: stderr chunk 
(state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882289.51028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.51072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.52929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.52932: stdout chunk (state=3): >>><<< 15330 1726882289.52934: stderr chunk (state=3): >>><<< 15330 1726882289.52951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882289.53098: handler run complete 15330 1726882289.53102: attempt loop complete, returning result 15330 1726882289.53104: _execute() done 15330 1726882289.53107: dumping result to json 15330 1726882289.53109: done dumping result, returning 15330 1726882289.53111: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-e4fe-1358-00000000048b] 15330 1726882289.53113: sending task result for task 12673a56-9f93-e4fe-1358-00000000048b 15330 1726882289.53182: done sending task result for task 12673a56-9f93-e4fe-1358-00000000048b 15330 1726882289.53185: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15330 1726882289.53246: no more pending results, returning what we have 15330 1726882289.53249: results queue empty 15330 1726882289.53251: checking for any_errors_fatal 15330 1726882289.53259: done checking for any_errors_fatal 15330 1726882289.53260: checking for max_fail_percentage 15330 1726882289.53261: done checking for max_fail_percentage 15330 1726882289.53262: checking to see if all hosts have failed and the running result is not ok 15330 1726882289.53263: done checking to see if all hosts have failed 15330 1726882289.53264: getting the remaining hosts for this loop 15330 1726882289.53266: done getting the remaining hosts for this loop 15330 1726882289.53269: getting the next task for host managed_node3 15330 1726882289.53276: done getting next task for host managed_node3 15330 1726882289.53279: ^ task is: TASK: Set NM profile exist flag based on the profile files 15330 
1726882289.53283: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882289.53287: getting variables 15330 1726882289.53289: in VariableManager get_vars() 15330 1726882289.53321: Calling all_inventory to load vars for managed_node3 15330 1726882289.53324: Calling groups_inventory to load vars for managed_node3 15330 1726882289.53328: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882289.53340: Calling all_plugins_play to load vars for managed_node3 15330 1726882289.53343: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882289.53346: Calling groups_plugins_play to load vars for managed_node3 15330 1726882289.57240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882289.60532: done with get_vars() 15330 1726882289.60671: done getting variables 15330 1726882289.60736: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:31:29 -0400 (0:00:00.429) 0:00:38.814 ****** 15330 1726882289.60885: entering _queue_task() for managed_node3/set_fact 15330 1726882289.61571: worker is 1 (out of 1 available) 15330 1726882289.61583: exiting _queue_task() for managed_node3/set_fact 15330 1726882289.61701: done queuing things up, now waiting for results queue to drain 15330 1726882289.61703: waiting for pending results... 15330 1726882289.62114: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 15330 1726882289.62600: in run() - task 12673a56-9f93-e4fe-1358-00000000048c 15330 1726882289.62604: variable 'ansible_search_path' from source: unknown 15330 1726882289.62607: variable 'ansible_search_path' from source: unknown 15330 1726882289.62610: calling self._execute() 15330 1726882289.62612: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.62615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.62619: variable 'omit' from source: magic vars 15330 1726882289.63307: variable 'ansible_distribution_major_version' from source: facts 15330 1726882289.63599: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882289.63999: variable 'profile_stat' from source: set_fact 15330 1726882289.64002: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882289.64004: when evaluation is False, skipping this task 15330 1726882289.64007: _execute() done 15330 1726882289.64009: dumping result to json 15330 1726882289.64011: done dumping result, returning 15330 1726882289.64015: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile 
files [12673a56-9f93-e4fe-1358-00000000048c] 15330 1726882289.64017: sending task result for task 12673a56-9f93-e4fe-1358-00000000048c skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15330 1726882289.64147: no more pending results, returning what we have 15330 1726882289.64150: results queue empty 15330 1726882289.64152: checking for any_errors_fatal 15330 1726882289.64161: done checking for any_errors_fatal 15330 1726882289.64162: checking for max_fail_percentage 15330 1726882289.64164: done checking for max_fail_percentage 15330 1726882289.64165: checking to see if all hosts have failed and the running result is not ok 15330 1726882289.64166: done checking to see if all hosts have failed 15330 1726882289.64167: getting the remaining hosts for this loop 15330 1726882289.64168: done getting the remaining hosts for this loop 15330 1726882289.64172: getting the next task for host managed_node3 15330 1726882289.64180: done getting next task for host managed_node3 15330 1726882289.64183: ^ task is: TASK: Get NM profile info 15330 1726882289.64191: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15330 1726882289.64197: getting variables 15330 1726882289.64199: in VariableManager get_vars() 15330 1726882289.64230: Calling all_inventory to load vars for managed_node3 15330 1726882289.64233: Calling groups_inventory to load vars for managed_node3 15330 1726882289.64354: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882289.64361: done sending task result for task 12673a56-9f93-e4fe-1358-00000000048c 15330 1726882289.64365: WORKER PROCESS EXITING 15330 1726882289.64375: Calling all_plugins_play to load vars for managed_node3 15330 1726882289.64378: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882289.64381: Calling groups_plugins_play to load vars for managed_node3 15330 1726882289.67637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882289.70876: done with get_vars() 15330 1726882289.70918: done getting variables 15330 1726882289.70976: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:31:29 -0400 (0:00:00.101) 0:00:38.916 ****** 15330 1726882289.71012: entering _queue_task() for managed_node3/shell 15330 1726882289.71426: worker is 1 (out of 1 available) 15330 1726882289.71437: exiting _queue_task() for managed_node3/shell 15330 1726882289.71447: done queuing things up, now waiting for results queue to drain 15330 1726882289.71448: waiting for pending results... 
15330 1726882289.71763: running TaskExecutor() for managed_node3/TASK: Get NM profile info 15330 1726882289.71875: in run() - task 12673a56-9f93-e4fe-1358-00000000048d 15330 1726882289.71898: variable 'ansible_search_path' from source: unknown 15330 1726882289.71911: variable 'ansible_search_path' from source: unknown 15330 1726882289.71954: calling self._execute() 15330 1726882289.72052: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.72065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.72079: variable 'omit' from source: magic vars 15330 1726882289.72453: variable 'ansible_distribution_major_version' from source: facts 15330 1726882289.72471: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882289.72484: variable 'omit' from source: magic vars 15330 1726882289.72536: variable 'omit' from source: magic vars 15330 1726882289.72639: variable 'profile' from source: play vars 15330 1726882289.72650: variable 'interface' from source: set_fact 15330 1726882289.72717: variable 'interface' from source: set_fact 15330 1726882289.72747: variable 'omit' from source: magic vars 15330 1726882289.73398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882289.73402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882289.73404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882289.73407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882289.73409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882289.73411: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 
1726882289.73416: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.73418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.73600: Set connection var ansible_pipelining to False 15330 1726882289.73616: Set connection var ansible_timeout to 10 15330 1726882289.73619: Set connection var ansible_connection to ssh 15330 1726882289.73621: Set connection var ansible_shell_type to sh 15330 1726882289.73628: Set connection var ansible_shell_executable to /bin/sh 15330 1726882289.73631: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882289.73654: variable 'ansible_shell_executable' from source: unknown 15330 1726882289.73657: variable 'ansible_connection' from source: unknown 15330 1726882289.73660: variable 'ansible_module_compression' from source: unknown 15330 1726882289.73662: variable 'ansible_shell_type' from source: unknown 15330 1726882289.73665: variable 'ansible_shell_executable' from source: unknown 15330 1726882289.73667: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882289.73669: variable 'ansible_pipelining' from source: unknown 15330 1726882289.73672: variable 'ansible_timeout' from source: unknown 15330 1726882289.73674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882289.74015: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882289.74028: variable 'omit' from source: magic vars 15330 1726882289.74033: starting attempt loop 15330 1726882289.74036: running the handler 15330 1726882289.74047: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882289.74066: _low_level_execute_command(): starting 15330 1726882289.74078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882289.75229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.75233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.75235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.75255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882289.75272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882289.75305: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882289.75316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.75527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15330 1726882289.75536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882289.75544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15330 1726882289.75552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.75562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.75574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882289.75581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882289.75596: stderr chunk (state=3): >>>debug2: match found <<< 15330 
1726882289.75608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.75681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.75784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.76023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.77676: stdout chunk (state=3): >>>/root <<< 15330 1726882289.77821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.77824: stdout chunk (state=3): >>><<< 15330 1726882289.77825: stderr chunk (state=3): >>><<< 15330 1726882289.77944: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 15330 1726882289.77955: _low_level_execute_command(): starting 15330 1726882289.77958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616 `" && echo ansible-tmp-1726882289.7784903-17069-2218725005616="` echo /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616 `" ) && sleep 0' 15330 1726882289.78515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.78529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.78544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.78561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882289.78578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882289.78597: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882289.78689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.78778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 
1726882289.78924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.80791: stdout chunk (state=3): >>>ansible-tmp-1726882289.7784903-17069-2218725005616=/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616 <<< 15330 1726882289.80938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.80949: stdout chunk (state=3): >>><<< 15330 1726882289.80968: stderr chunk (state=3): >>><<< 15330 1726882289.80997: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882289.7784903-17069-2218725005616=/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882289.81037: variable 'ansible_module_compression' from source: unknown 15330 1726882289.81106: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15330 1726882289.81151: variable 'ansible_facts' from source: unknown 15330 1726882289.81337: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py 15330 1726882289.81475: Sending initial data 15330 1726882289.81478: Sent initial data (154 bytes) 15330 1726882289.82229: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.82254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.82375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.82422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.82514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.84043: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882289.84149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15330 1726882289.84205: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp21j_pyoa /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py <<< 15330 1726882289.84209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp21j_pyoa" to remote "/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py" <<< 15330 1726882289.85554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.85722: stderr chunk (state=3): >>><<< 15330 1726882289.85725: stdout chunk (state=3): >>><<< 15330 1726882289.85727: done transferring module to remote 15330 1726882289.85728: _low_level_execute_command(): starting 15330 1726882289.85730: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/ 
/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py && sleep 0' 15330 1726882289.86845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882289.86853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.87013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.87036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882289.87053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.87069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.87242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882289.88860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882289.88939: stderr chunk (state=3): >>><<< 15330 1726882289.88942: stdout chunk (state=3): >>><<< 15330 1726882289.89039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882289.89042: _low_level_execute_command(): starting 15330 1726882289.89044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/AnsiballZ_command.py && sleep 0' 15330 1726882289.89590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882289.89610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882289.89638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882289.89658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882289.89679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882289.89699: stderr chunk (state=3): >>>debug2: match not found <<< 
15330 1726882289.89756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882289.89970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882289.90086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882289.90239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882290.06585: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:31:30.048484", "end": "2024-09-20 21:31:30.064264", "delta": "0:00:00.015780", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15330 1726882290.08104: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. 
<<< 15330 1726882290.08108: stdout chunk (state=3): >>><<< 15330 1726882290.08110: stderr chunk (state=3): >>><<< 15330 1726882290.08113: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:31:30.048484", "end": "2024-09-20 21:31:30.064264", "delta": "0:00:00.015780", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.10.229 closed. 15330 1726882290.08245: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882290.08252: _low_level_execute_command(): starting 15330 1726882290.08255: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882289.7784903-17069-2218725005616/ > /dev/null 2>&1 && sleep 0' 15330 1726882290.08766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882290.08820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 15330 1726882290.08824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882290.08850: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.08902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882290.08906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882290.08932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882290.08977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882290.10838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882290.10853: stderr chunk (state=3): >>><<< 15330 1726882290.10857: stdout chunk (state=3): >>><<< 15330 1726882290.10909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882290.10928: handler run complete 15330 1726882290.10931: Evaluated conditional (False): False 15330 1726882290.10936: attempt loop complete, returning result 15330 1726882290.10938: _execute() done 15330 1726882290.10940: dumping result to json 15330 1726882290.10978: done dumping result, returning 15330 1726882290.10986: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-e4fe-1358-00000000048d] 15330 1726882290.10988: sending task result for task 12673a56-9f93-e4fe-1358-00000000048d 15330 1726882290.11122: done sending task result for task 12673a56-9f93-e4fe-1358-00000000048d 15330 1726882290.11126: WORKER PROCESS EXITING
fatal: [managed_node3]: FAILED! => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc",
    "delta": "0:00:00.015780",
    "end": "2024-09-20 21:31:30.064264",
    "rc": 1,
    "start": "2024-09-20 21:31:30.048484"
}

MSG:

non-zero return code
...ignoring
15330 1726882290.11245: no more pending results, returning what we have 15330 1726882290.11250: results queue empty 15330 1726882290.11251: checking for any_errors_fatal 15330 1726882290.11264: done checking for any_errors_fatal 15330 1726882290.11265: checking for max_fail_percentage 15330 1726882290.11267: done checking for max_fail_percentage 15330 1726882290.11270: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.11271: done checking to see if all hosts have failed 15330 1726882290.11273: getting the remaining hosts for this loop 15330 1726882290.11274: done getting the remaining hosts for this loop 15330 1726882290.11278: getting the next task for host managed_node3 15330 1726882290.11284: done getting next task for host managed_node3 15330 1726882290.11287: ^ task is: TASK: Set NM profile exist flag 
and ansible_managed flag true based on the nmcli output 15330 1726882290.11291: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882290.11297: getting variables 15330 1726882290.11298: in VariableManager get_vars() 15330 1726882290.11331: Calling all_inventory to load vars for managed_node3 15330 1726882290.11334: Calling groups_inventory to load vars for managed_node3 15330 1726882290.11337: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.11348: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.11350: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.11353: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.13205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.14549: done with get_vars() 15330 1726882290.14567: done getting variables 15330 1726882290.14613: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:31:30 -0400 (0:00:00.436) 0:00:39.352 ****** 15330 1726882290.14643: entering _queue_task() for managed_node3/set_fact 15330 1726882290.14918: worker is 1 (out of 1 available) 15330 1726882290.14929: exiting _queue_task() for managed_node3/set_fact 15330 1726882290.14942: done queuing things up, now waiting for results queue to drain 15330 1726882290.14944: waiting for pending results... 15330 1726882290.15129: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15330 1726882290.15237: in run() - task 12673a56-9f93-e4fe-1358-00000000048e 15330 1726882290.15247: variable 'ansible_search_path' from source: unknown 15330 1726882290.15251: variable 'ansible_search_path' from source: unknown 15330 1726882290.15321: calling self._execute() 15330 1726882290.15390: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.15397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.15421: variable 'omit' from source: magic vars 15330 1726882290.15776: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.15786: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.15875: variable 'nm_profile_exists' from source: set_fact 15330 1726882290.15884: Evaluated conditional (nm_profile_exists.rc == 0): False 15330 1726882290.15887: when evaluation is False, skipping this task 15330 1726882290.15895: _execute() done 15330 1726882290.15899: dumping result to 
json 15330 1726882290.15901: done dumping result, returning 15330 1726882290.15909: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-e4fe-1358-00000000048e] 15330 1726882290.15912: sending task result for task 12673a56-9f93-e4fe-1358-00000000048e 15330 1726882290.15995: done sending task result for task 12673a56-9f93-e4fe-1358-00000000048e 15330 1726882290.15998: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
15330 1726882290.16046: no more pending results, returning what we have 15330 1726882290.16049: results queue empty 15330 1726882290.16050: checking for any_errors_fatal 15330 1726882290.16059: done checking for any_errors_fatal 15330 1726882290.16060: checking for max_fail_percentage 15330 1726882290.16061: done checking for max_fail_percentage 15330 1726882290.16062: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.16063: done checking to see if all hosts have failed 15330 1726882290.16064: getting the remaining hosts for this loop 15330 1726882290.16065: done getting the remaining hosts for this loop 15330 1726882290.16068: getting the next task for host managed_node3 15330 1726882290.16078: done getting next task for host managed_node3 15330 1726882290.16081: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15330 1726882290.16085: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882290.16088: getting variables 15330 1726882290.16089: in VariableManager get_vars() 15330 1726882290.16118: Calling all_inventory to load vars for managed_node3 15330 1726882290.16120: Calling groups_inventory to load vars for managed_node3 15330 1726882290.16123: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.16132: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.16135: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.16137: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.16892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.17877: done with get_vars() 15330 1726882290.17895: done getting variables 15330 1726882290.17936: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882290.18019: variable 'profile' from source: play vars 15330 1726882290.18022: variable 'interface' from source: set_fact 15330 1726882290.18063: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:31:30 -0400 (0:00:00.034) 0:00:39.386 ****** 15330 1726882290.18088: entering _queue_task() for managed_node3/command 15330 1726882290.18348: worker is 1 (out of 1 available) 15330 1726882290.18361: exiting _queue_task() for managed_node3/command 15330 1726882290.18375: done queuing things up, now waiting for results queue to drain 15330 1726882290.18377: waiting for pending results... 15330 1726882290.18620: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15330 1726882290.18768: in run() - task 12673a56-9f93-e4fe-1358-000000000490 15330 1726882290.18780: variable 'ansible_search_path' from source: unknown 15330 1726882290.18783: variable 'ansible_search_path' from source: unknown 15330 1726882290.18845: calling self._execute() 15330 1726882290.18910: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.18914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.18950: variable 'omit' from source: magic vars 15330 1726882290.19316: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.19334: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.19467: variable 'profile_stat' from source: set_fact 15330 1726882290.19472: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882290.19503: when evaluation is False, skipping this task 15330 1726882290.19507: _execute() done 15330 1726882290.19511: dumping result to json 15330 1726882290.19514: done dumping result, returning 15330 1726882290.19517: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000490] 15330 1726882290.19519: sending task result for task 
12673a56-9f93-e4fe-1358-000000000490 15330 1726882290.19625: done sending task result for task 12673a56-9f93-e4fe-1358-000000000490 15330 1726882290.19632: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15330 1726882290.19694: no more pending results, returning what we have 15330 1726882290.19698: results queue empty 15330 1726882290.19699: checking for any_errors_fatal 15330 1726882290.19710: done checking for any_errors_fatal 15330 1726882290.19711: checking for max_fail_percentage 15330 1726882290.19714: done checking for max_fail_percentage 15330 1726882290.19715: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.19715: done checking to see if all hosts have failed 15330 1726882290.19716: getting the remaining hosts for this loop 15330 1726882290.19717: done getting the remaining hosts for this loop 15330 1726882290.19721: getting the next task for host managed_node3 15330 1726882290.19727: done getting next task for host managed_node3 15330 1726882290.19729: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15330 1726882290.19733: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882290.19736: getting variables 15330 1726882290.19737: in VariableManager get_vars() 15330 1726882290.19764: Calling all_inventory to load vars for managed_node3 15330 1726882290.19767: Calling groups_inventory to load vars for managed_node3 15330 1726882290.19770: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.19780: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.19783: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.19787: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.20955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.21866: done with get_vars() 15330 1726882290.21882: done getting variables 15330 1726882290.21926: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882290.22001: variable 'profile' from source: play vars 15330 1726882290.22004: variable 'interface' from source: set_fact 15330 1726882290.22044: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:31:30 -0400 (0:00:00.039) 0:00:39.426 ****** 15330 1726882290.22065: entering _queue_task() for managed_node3/set_fact 15330 1726882290.22346: worker is 1 (out of 1 available) 15330 1726882290.22360: exiting _queue_task() for managed_node3/set_fact 15330 1726882290.22372: done queuing things up, now 
waiting for results queue to drain 15330 1726882290.22374: waiting for pending results... 15330 1726882290.22604: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15330 1726882290.22685: in run() - task 12673a56-9f93-e4fe-1358-000000000491 15330 1726882290.22699: variable 'ansible_search_path' from source: unknown 15330 1726882290.22703: variable 'ansible_search_path' from source: unknown 15330 1726882290.22733: calling self._execute() 15330 1726882290.22833: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.22837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.22851: variable 'omit' from source: magic vars 15330 1726882290.23114: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.23123: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.23213: variable 'profile_stat' from source: set_fact 15330 1726882290.23224: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882290.23227: when evaluation is False, skipping this task 15330 1726882290.23232: _execute() done 15330 1726882290.23235: dumping result to json 15330 1726882290.23238: done dumping result, returning 15330 1726882290.23271: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000491] 15330 1726882290.23274: sending task result for task 12673a56-9f93-e4fe-1358-000000000491 15330 1726882290.23348: done sending task result for task 12673a56-9f93-e4fe-1358-000000000491 15330 1726882290.23350: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15330 1726882290.23423: no more pending results, returning what we have 15330 1726882290.23426: results queue empty 15330 1726882290.23427: checking for 
any_errors_fatal 15330 1726882290.23431: done checking for any_errors_fatal 15330 1726882290.23432: checking for max_fail_percentage 15330 1726882290.23433: done checking for max_fail_percentage 15330 1726882290.23434: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.23435: done checking to see if all hosts have failed 15330 1726882290.23436: getting the remaining hosts for this loop 15330 1726882290.23437: done getting the remaining hosts for this loop 15330 1726882290.23440: getting the next task for host managed_node3 15330 1726882290.23446: done getting next task for host managed_node3 15330 1726882290.23448: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15330 1726882290.23452: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.23455: getting variables 15330 1726882290.23456: in VariableManager get_vars() 15330 1726882290.23479: Calling all_inventory to load vars for managed_node3 15330 1726882290.23481: Calling groups_inventory to load vars for managed_node3 15330 1726882290.23484: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.23498: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.23500: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.23508: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.24603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.25798: done with get_vars() 15330 1726882290.25812: done getting variables 15330 1726882290.25880: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882290.25984: variable 'profile' from source: play vars 15330 1726882290.25989: variable 'interface' from source: set_fact 15330 1726882290.26040: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:31:30 -0400 (0:00:00.040) 0:00:39.466 ****** 15330 1726882290.26075: entering _queue_task() for managed_node3/command 15330 1726882290.26406: worker is 1 (out of 1 available) 15330 1726882290.26418: exiting _queue_task() for managed_node3/command 15330 1726882290.26433: done queuing things up, now waiting for results queue to drain 15330 1726882290.26434: waiting for pending results... 
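[Editor's note] The skips in this stretch of the log all trace back to the earlier "Get NM profile info" task: `nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc` exited with rc=1, so `nm_profile_exists.rc == 0` evaluated false and the ifcfg follow-up checks short-circuit. That rc=1 is ordinary grep semantics, not an nmcli error; a minimal sketch (the sample nmcli output line below is invented for illustration):

```shell
# grep exits 0 on a match and 1 on no match; the pipeline's status is the
# last grep's, so rc=1 here means "no LSR-TST-br31 profile under /etc",
# which the play reads as "profile absent" (and ignores as a failure).
printf 'eth0  /run/NetworkManager/system-connections/eth0.nmconnection\n' \
  | grep LSR-TST-br31 | grep /etc
echo "rc=$?"   # -> rc=1
```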
15330 1726882290.26639: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15330 1726882290.26733: in run() - task 12673a56-9f93-e4fe-1358-000000000492 15330 1726882290.26743: variable 'ansible_search_path' from source: unknown 15330 1726882290.26747: variable 'ansible_search_path' from source: unknown 15330 1726882290.26773: calling self._execute() 15330 1726882290.26844: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.26848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.26855: variable 'omit' from source: magic vars 15330 1726882290.27114: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.27125: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.27208: variable 'profile_stat' from source: set_fact 15330 1726882290.27219: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882290.27223: when evaluation is False, skipping this task 15330 1726882290.27226: _execute() done 15330 1726882290.27230: dumping result to json 15330 1726882290.27233: done dumping result, returning 15330 1726882290.27236: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000492] 15330 1726882290.27242: sending task result for task 12673a56-9f93-e4fe-1358-000000000492 15330 1726882290.27329: done sending task result for task 12673a56-9f93-e4fe-1358-000000000492 15330 1726882290.27331: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15330 1726882290.27380: no more pending results, returning what we have 15330 1726882290.27384: results queue empty 15330 1726882290.27385: checking for any_errors_fatal 15330 1726882290.27392: done checking for any_errors_fatal 15330 1726882290.27395: 
checking for max_fail_percentage 15330 1726882290.27396: done checking for max_fail_percentage 15330 1726882290.27397: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.27398: done checking to see if all hosts have failed 15330 1726882290.27398: getting the remaining hosts for this loop 15330 1726882290.27400: done getting the remaining hosts for this loop 15330 1726882290.27404: getting the next task for host managed_node3 15330 1726882290.27410: done getting next task for host managed_node3 15330 1726882290.27412: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15330 1726882290.27416: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.27419: getting variables 15330 1726882290.27420: in VariableManager get_vars() 15330 1726882290.27443: Calling all_inventory to load vars for managed_node3 15330 1726882290.27445: Calling groups_inventory to load vars for managed_node3 15330 1726882290.27448: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.27457: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.27459: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.27461: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.28234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.29139: done with get_vars() 15330 1726882290.29167: done getting variables 15330 1726882290.29248: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882290.29350: variable 'profile' from source: play vars 15330 1726882290.29353: variable 'interface' from source: set_fact 15330 1726882290.29396: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:31:30 -0400 (0:00:00.033) 0:00:39.500 ****** 15330 1726882290.29417: entering _queue_task() for managed_node3/set_fact 15330 1726882290.29622: worker is 1 (out of 1 available) 15330 1726882290.29634: exiting _queue_task() for managed_node3/set_fact 15330 1726882290.29649: done queuing things up, now waiting for results queue to drain 15330 1726882290.29651: waiting for pending results... 
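[Editor's note] One caveat about the `nmcli ... | grep ... | grep ...` pipeline being probed in these tasks (general shell behavior, not something this particular run demonstrates): by default a pipeline reports only the last command's exit status, so a failure of the producer itself would be masked whenever the greps still matched. A sketch using `sh -c` as a hypothetical stand-in for a failing producer (`set -o pipefail` is bash-specific):

```shell
# Producer fails with rc=3 but still emits a matching line.
sh -c 'echo "LSR-TST-br31 ..."; exit 3' | grep LSR-TST-br31 >/dev/null
echo "default rc=$?"    # 0: only the final grep's status is reported

set -o pipefail          # bash: report the rightmost non-zero status instead
sh -c 'echo "LSR-TST-br31 ..."; exit 3' | grep LSR-TST-br31 >/dev/null
echo "pipefail rc=$?"    # 3: the producer's failure now surfaces
```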
15330 1726882290.30022: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15330 1726882290.30070: in run() - task 12673a56-9f93-e4fe-1358-000000000493 15330 1726882290.30082: variable 'ansible_search_path' from source: unknown 15330 1726882290.30085: variable 'ansible_search_path' from source: unknown 15330 1726882290.30117: calling self._execute() 15330 1726882290.30182: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.30189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.30197: variable 'omit' from source: magic vars 15330 1726882290.30690: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.30898: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.30902: variable 'profile_stat' from source: set_fact 15330 1726882290.30915: Evaluated conditional (profile_stat.stat.exists): False 15330 1726882290.30918: when evaluation is False, skipping this task 15330 1726882290.30921: _execute() done 15330 1726882290.30923: dumping result to json 15330 1726882290.30932: done dumping result, returning 15330 1726882290.30938: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [12673a56-9f93-e4fe-1358-000000000493] 15330 1726882290.30943: sending task result for task 12673a56-9f93-e4fe-1358-000000000493 15330 1726882290.31029: done sending task result for task 12673a56-9f93-e4fe-1358-000000000493
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15330 1726882290.31089: no more pending results, returning what we have 15330 1726882290.31095: results queue empty 15330 1726882290.31096: checking for any_errors_fatal 15330 1726882290.31107: done checking for any_errors_fatal 15330 1726882290.31108: checking for max_fail_percentage 15330 
1726882290.31112: done checking for max_fail_percentage 15330 1726882290.31113: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.31115: done checking to see if all hosts have failed 15330 1726882290.31115: getting the remaining hosts for this loop 15330 1726882290.31117: done getting the remaining hosts for this loop 15330 1726882290.31122: getting the next task for host managed_node3 15330 1726882290.31134: done getting next task for host managed_node3 15330 1726882290.31138: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15330 1726882290.31141: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.31147: getting variables 15330 1726882290.31149: in VariableManager get_vars() 15330 1726882290.31177: Calling all_inventory to load vars for managed_node3 15330 1726882290.31179: Calling groups_inventory to load vars for managed_node3 15330 1726882290.31182: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.31196: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.31199: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.31204: WORKER PROCESS EXITING 15330 1726882290.31208: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.32483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.34139: done with get_vars() 15330 1726882290.34163: done getting variables 15330 1726882290.34227: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882290.34335: variable 'profile' from source: play vars 15330 1726882290.34338: variable 'interface' from source: set_fact 15330 1726882290.34392: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:31:30 -0400 (0:00:00.050) 0:00:39.550 ****** 15330 1726882290.34426: entering _queue_task() for managed_node3/assert 15330 1726882290.34734: worker is 1 (out of 1 available) 15330 1726882290.34747: exiting _queue_task() for managed_node3/assert 15330 1726882290.34765: done queuing things up, now waiting for results queue to drain 15330 1726882290.34767: 
waiting for pending results... 15330 1726882290.35052: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'LSR-TST-br31' 15330 1726882290.35176: in run() - task 12673a56-9f93-e4fe-1358-000000000480 15330 1726882290.35200: variable 'ansible_search_path' from source: unknown 15330 1726882290.35210: variable 'ansible_search_path' from source: unknown 15330 1726882290.35253: calling self._execute() 15330 1726882290.35345: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.35356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.35369: variable 'omit' from source: magic vars 15330 1726882290.35725: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.35742: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.35753: variable 'omit' from source: magic vars 15330 1726882290.35800: variable 'omit' from source: magic vars 15330 1726882290.35907: variable 'profile' from source: play vars 15330 1726882290.35916: variable 'interface' from source: set_fact 15330 1726882290.35984: variable 'interface' from source: set_fact 15330 1726882290.36013: variable 'omit' from source: magic vars 15330 1726882290.36065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882290.36111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882290.36134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882290.36156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882290.36172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882290.36210: variable 'inventory_hostname' 
from source: host vars for 'managed_node3' 15330 1726882290.36218: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.36224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.36328: Set connection var ansible_pipelining to False 15330 1726882290.36345: Set connection var ansible_timeout to 10 15330 1726882290.36397: Set connection var ansible_connection to ssh 15330 1726882290.36400: Set connection var ansible_shell_type to sh 15330 1726882290.36402: Set connection var ansible_shell_executable to /bin/sh 15330 1726882290.36405: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882290.36407: variable 'ansible_shell_executable' from source: unknown 15330 1726882290.36410: variable 'ansible_connection' from source: unknown 15330 1726882290.36413: variable 'ansible_module_compression' from source: unknown 15330 1726882290.36415: variable 'ansible_shell_type' from source: unknown 15330 1726882290.36416: variable 'ansible_shell_executable' from source: unknown 15330 1726882290.36598: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.36602: variable 'ansible_pipelining' from source: unknown 15330 1726882290.36604: variable 'ansible_timeout' from source: unknown 15330 1726882290.36606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.36609: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882290.36611: variable 'omit' from source: magic vars 15330 1726882290.36613: starting attempt loop 15330 1726882290.36615: running the handler 15330 1726882290.36716: variable 'lsr_net_profile_exists' from source: set_fact 15330 1726882290.36730: 
Evaluated conditional (not lsr_net_profile_exists): True 15330 1726882290.36739: handler run complete 15330 1726882290.36757: attempt loop complete, returning result 15330 1726882290.36763: _execute() done 15330 1726882290.36769: dumping result to json 15330 1726882290.36776: done dumping result, returning 15330 1726882290.36786: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'LSR-TST-br31' [12673a56-9f93-e4fe-1358-000000000480] 15330 1726882290.36796: sending task result for task 12673a56-9f93-e4fe-1358-000000000480 15330 1726882290.37099: done sending task result for task 12673a56-9f93-e4fe-1358-000000000480 15330 1726882290.37103: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882290.37145: no more pending results, returning what we have 15330 1726882290.37148: results queue empty 15330 1726882290.37149: checking for any_errors_fatal 15330 1726882290.37156: done checking for any_errors_fatal 15330 1726882290.37157: checking for max_fail_percentage 15330 1726882290.37159: done checking for max_fail_percentage 15330 1726882290.37159: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.37160: done checking to see if all hosts have failed 15330 1726882290.37161: getting the remaining hosts for this loop 15330 1726882290.37162: done getting the remaining hosts for this loop 15330 1726882290.37166: getting the next task for host managed_node3 15330 1726882290.37175: done getting next task for host managed_node3 15330 1726882290.37177: ^ task is: TASK: meta (flush_handlers) 15330 1726882290.37178: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.37183: getting variables 15330 1726882290.37184: in VariableManager get_vars() 15330 1726882290.37215: Calling all_inventory to load vars for managed_node3 15330 1726882290.37218: Calling groups_inventory to load vars for managed_node3 15330 1726882290.37222: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.37232: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.37235: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.37238: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.38343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.39401: done with get_vars() 15330 1726882290.39427: done getting variables 15330 1726882290.39517: in VariableManager get_vars() 15330 1726882290.39528: Calling all_inventory to load vars for managed_node3 15330 1726882290.39533: Calling groups_inventory to load vars for managed_node3 15330 1726882290.39536: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.39542: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.39545: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.39548: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.40947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.42918: done with get_vars() 15330 1726882290.42949: done queuing things up, now waiting for results queue to drain 15330 1726882290.42951: results queue empty 15330 1726882290.42952: checking for any_errors_fatal 15330 1726882290.42954: done checking for any_errors_fatal 15330 1726882290.42955: checking for max_fail_percentage 15330 1726882290.42956: done checking for max_fail_percentage 15330 1726882290.42957: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882290.42962: done checking to see if all hosts have failed 15330 1726882290.42963: getting the remaining hosts for this loop 15330 1726882290.42964: done getting the remaining hosts for this loop 15330 1726882290.42967: getting the next task for host managed_node3 15330 1726882290.42971: done getting next task for host managed_node3 15330 1726882290.42972: ^ task is: TASK: meta (flush_handlers) 15330 1726882290.42973: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882290.42976: getting variables 15330 1726882290.42977: in VariableManager get_vars() 15330 1726882290.42986: Calling all_inventory to load vars for managed_node3 15330 1726882290.42988: Calling groups_inventory to load vars for managed_node3 15330 1726882290.42990: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.43172: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.43176: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.43179: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.44359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.46008: done with get_vars() 15330 1726882290.46029: done getting variables 15330 1726882290.46078: in VariableManager get_vars() 15330 1726882290.46088: Calling all_inventory to load vars for managed_node3 15330 1726882290.46090: Calling groups_inventory to load vars for managed_node3 15330 1726882290.46094: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.46099: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.46102: Calling groups_plugins_inventory to load vars for 
managed_node3 15330 1726882290.46104: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.47162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.48676: done with get_vars() 15330 1726882290.48702: done queuing things up, now waiting for results queue to drain 15330 1726882290.48704: results queue empty 15330 1726882290.48705: checking for any_errors_fatal 15330 1726882290.48706: done checking for any_errors_fatal 15330 1726882290.48707: checking for max_fail_percentage 15330 1726882290.48708: done checking for max_fail_percentage 15330 1726882290.48709: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.48709: done checking to see if all hosts have failed 15330 1726882290.48710: getting the remaining hosts for this loop 15330 1726882290.48711: done getting the remaining hosts for this loop 15330 1726882290.48714: getting the next task for host managed_node3 15330 1726882290.48717: done getting next task for host managed_node3 15330 1726882290.48717: ^ task is: None 15330 1726882290.48719: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.48720: done queuing things up, now waiting for results queue to drain 15330 1726882290.48721: results queue empty 15330 1726882290.48721: checking for any_errors_fatal 15330 1726882290.48722: done checking for any_errors_fatal 15330 1726882290.48723: checking for max_fail_percentage 15330 1726882290.48724: done checking for max_fail_percentage 15330 1726882290.48724: checking to see if all hosts have failed and the running result is not ok 15330 1726882290.48725: done checking to see if all hosts have failed 15330 1726882290.48726: getting the next task for host managed_node3 15330 1726882290.48728: done getting next task for host managed_node3 15330 1726882290.48729: ^ task is: None 15330 1726882290.48731: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.48772: in VariableManager get_vars() 15330 1726882290.48787: done with get_vars() 15330 1726882290.48795: in VariableManager get_vars() 15330 1726882290.48805: done with get_vars() 15330 1726882290.48810: variable 'omit' from source: magic vars 15330 1726882290.48920: variable 'task' from source: play vars 15330 1726882290.48949: in VariableManager get_vars() 15330 1726882290.48959: done with get_vars() 15330 1726882290.48976: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15330 1726882290.49204: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882290.49228: getting the remaining hosts for this loop 15330 1726882290.49230: done getting the remaining hosts for this loop 15330 1726882290.49232: getting the next task for host managed_node3 15330 1726882290.49235: done getting next task for host managed_node3 15330 1726882290.49237: ^ task is: TASK: Gathering Facts 15330 1726882290.49238: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882290.49240: getting variables 15330 1726882290.49241: in VariableManager get_vars() 15330 1726882290.49248: Calling all_inventory to load vars for managed_node3 15330 1726882290.49251: Calling groups_inventory to load vars for managed_node3 15330 1726882290.49253: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882290.49257: Calling all_plugins_play to load vars for managed_node3 15330 1726882290.49260: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882290.49262: Calling groups_plugins_play to load vars for managed_node3 15330 1726882290.50500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882290.51977: done with get_vars() 15330 1726882290.51997: done getting variables 15330 1726882290.52035: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:31:30 -0400 (0:00:00.176) 0:00:39.726 ****** 15330 1726882290.52059: entering _queue_task() for managed_node3/gather_facts 15330 1726882290.52368: worker is 1 (out of 1 available) 15330 1726882290.52379: exiting _queue_task() for managed_node3/gather_facts 15330 1726882290.52391: done queuing things up, now waiting for results queue to drain 15330 1726882290.52392: waiting for pending results... 
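For context, the assertion that passed above (task path `.../tests/network/playbooks/tasks/assert_profile_absent.yml:5`, conditional `not lsr_net_profile_exists`) plausibly corresponds to a task shaped like the following. This is a hedged reconstruction from the task name and variable sources visible in the log, not the actual file contents; the real task file may differ.

```yaml
# Hypothetical sketch of tasks/assert_profile_absent.yml, inferred from the log.
# 'profile' comes from play vars and 'lsr_net_profile_exists' from an earlier
# set_fact, per the "variable ... from source:" entries above.
- name: "Assert that the profile is absent - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists
```

With `lsr_net_profile_exists` false, the action module returns `changed: false` and the default "All assertions passed" message, matching the `ok: [managed_node3]` result recorded in the log.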
15330 1726882290.52655: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882290.52800: in run() - task 12673a56-9f93-e4fe-1358-0000000004c5 15330 1726882290.52803: variable 'ansible_search_path' from source: unknown 15330 1726882290.52823: calling self._execute() 15330 1726882290.52917: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.52936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.52998: variable 'omit' from source: magic vars 15330 1726882290.53309: variable 'ansible_distribution_major_version' from source: facts 15330 1726882290.53325: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882290.53335: variable 'omit' from source: magic vars 15330 1726882290.53372: variable 'omit' from source: magic vars 15330 1726882290.53415: variable 'omit' from source: magic vars 15330 1726882290.53458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882290.53506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882290.53532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882290.53581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882290.53584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882290.53605: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882290.53614: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.53621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.53799: Set connection var ansible_pipelining to False 15330 1726882290.53803: Set 
connection var ansible_timeout to 10 15330 1726882290.53805: Set connection var ansible_connection to ssh 15330 1726882290.53808: Set connection var ansible_shell_type to sh 15330 1726882290.53810: Set connection var ansible_shell_executable to /bin/sh 15330 1726882290.53812: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882290.53814: variable 'ansible_shell_executable' from source: unknown 15330 1726882290.53816: variable 'ansible_connection' from source: unknown 15330 1726882290.53818: variable 'ansible_module_compression' from source: unknown 15330 1726882290.53820: variable 'ansible_shell_type' from source: unknown 15330 1726882290.53822: variable 'ansible_shell_executable' from source: unknown 15330 1726882290.53824: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882290.53825: variable 'ansible_pipelining' from source: unknown 15330 1726882290.53827: variable 'ansible_timeout' from source: unknown 15330 1726882290.53829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882290.54003: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882290.54024: variable 'omit' from source: magic vars 15330 1726882290.54033: starting attempt loop 15330 1726882290.54038: running the handler 15330 1726882290.54054: variable 'ansible_facts' from source: unknown 15330 1726882290.54073: _low_level_execute_command(): starting 15330 1726882290.54083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882290.54902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882290.54916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882290.54934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882290.55133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882290.56807: stdout chunk (state=3): >>>/root <<< 15330 1726882290.56970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882290.56973: stdout chunk (state=3): >>><<< 15330 1726882290.56976: stderr chunk (state=3): >>><<< 15330 1726882290.57185: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882290.57191: _low_level_execute_command(): starting 15330 1726882290.57196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744 `" && echo ansible-tmp-1726882290.5709763-17103-109401875821744="` echo /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744 `" ) && sleep 0' 15330 1726882290.58142: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882290.58157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882290.58200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882290.58219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882290.58241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882290.58252: stderr chunk (state=3): >>>debug2: match not found <<< 15330 1726882290.58265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.58297: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 15330 1726882290.58313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 15330 1726882290.58348: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.58435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882290.58467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882290.58507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882290.58573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882290.60710: stdout chunk (state=3): >>>ansible-tmp-1726882290.5709763-17103-109401875821744=/root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744 <<< 15330 1726882290.60916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882290.60919: stdout chunk (state=3): >>><<< 15330 1726882290.60922: stderr chunk (state=3): >>><<< 15330 1726882290.60925: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882290.5709763-17103-109401875821744=/root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882290.60928: variable 'ansible_module_compression' from source: unknown 15330 1726882290.60931: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882290.61146: variable 'ansible_facts' from source: unknown 15330 1726882290.61538: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py 15330 1726882290.61814: Sending initial data 15330 1726882290.61825: Sent initial data (154 bytes) 15330 1726882290.62974: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882290.62990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.63005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.63139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882290.63152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882290.63217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882290.63255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882290.64819: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15330 1726882290.64835: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882290.64917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882290.64979: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp8jdzjkfh /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py <<< 15330 1726882290.65005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py" <<< 15330 1726882290.65046: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp8jdzjkfh" to remote "/root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py" <<< 15330 1726882290.65058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py" <<< 15330 1726882290.67983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882290.68196: stderr chunk (state=3): >>><<< 15330 1726882290.68199: stdout chunk (state=3): >>><<< 15330 1726882290.68202: done transferring module to remote 15330 1726882290.68204: _low_level_execute_command(): starting 15330 1726882290.68206: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/ /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py && sleep 0' 15330 1726882290.69437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.69440: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882290.69514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882290.69708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882290.69809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882290.71530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882290.71562: stderr chunk (state=3): >>><<< 15330 1726882290.71572: stdout chunk (state=3): >>><<< 15330 1726882290.71611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882290.71776: _low_level_execute_command(): starting 15330 1726882290.71780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/AnsiballZ_setup.py && sleep 0' 15330 1726882290.72813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882290.72843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882291.34025: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3286, "used": 245}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 598, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803069440, "block_size": 4096, "block_total": 65519099, "block_available": 63916765, "block_used": 1602334, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "31", "epoch": "1726882291", "epoch_int": "1726882291", "date": "2024-09-20", "time": "21:31:31", "iso8601_micro": "2024-09-21T01:31:31.301224Z", "iso8601": "2024-09-21T01:31:31Z", "iso8601_basic": "20240920T213131301224", "iso8601_basic_short": "20240920T213131", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on <<< 15330 1726882291.34119: stdout chunk (state=3): >>>[fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", 
"pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_loadavg": {"1m": 0.935546875, "5m": 0.50830078125, "15m": 0.2392578125}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882291.36002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882291.36300: stderr chunk (state=3): >>><<< 15330 1726882291.36304: stdout chunk (state=3): >>><<< 15330 1726882291.36310: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 
1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3286, "used": 245}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": 
{}}, "ansible_uptime_seconds": 598, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803069440, "block_size": 4096, "block_total": 65519099, "block_available": 63916765, "block_used": 1602334, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "31", "epoch": "1726882291", "epoch_int": "1726882291", "date": "2024-09-20", "time": "21:31:31", "iso8601_micro": "2024-09-21T01:31:31.301224Z", "iso8601": "2024-09-21T01:31:31Z", "iso8601_basic": "20240920T213131301224", "iso8601_basic_short": "20240920T213131", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": 
"off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_loadavg": {"1m": 0.935546875, "5m": 0.50830078125, "15m": 0.2392578125}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", 
"ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882291.36918: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882291.36950: _low_level_execute_command(): starting 15330 1726882291.36961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882290.5709763-17103-109401875821744/ > /dev/null 2>&1 && sleep 0' 15330 1726882291.38137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882291.38154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882291.38262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882291.38290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882291.38310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882291.38496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882291.40252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882291.40262: stdout chunk (state=3): >>><<< 15330 1726882291.40272: stderr chunk (state=3): >>><<< 15330 1726882291.40701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882291.40705: handler run complete 15330 1726882291.40707: variable 'ansible_facts' from source: unknown 15330 1726882291.40772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.41485: variable 'ansible_facts' from source: unknown 15330 1726882291.41698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.41944: attempt loop complete, returning result 15330 1726882291.42007: _execute() done 15330 1726882291.42016: dumping result to json 15330 1726882291.42052: done dumping result, returning 15330 1726882291.42120: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-0000000004c5] 15330 1726882291.42131: sending task result for task 12673a56-9f93-e4fe-1358-0000000004c5 ok: [managed_node3] 15330 1726882291.43187: no more pending results, returning what we have 15330 1726882291.43190: results queue empty 15330 1726882291.43191: checking for any_errors_fatal 15330 1726882291.43194: done checking for any_errors_fatal 15330 1726882291.43195: checking for max_fail_percentage 15330 1726882291.43197: done checking for max_fail_percentage 15330 1726882291.43197: checking to see if all hosts have failed and the running result is not ok 15330 1726882291.43198: done checking to see if all hosts have failed 15330 1726882291.43199: getting the remaining hosts for this loop 15330 1726882291.43200: done getting the remaining hosts for this loop 15330 1726882291.43204: getting the next task for host managed_node3 15330 1726882291.43209: done getting next task for host managed_node3 15330 1726882291.43210: ^ task is: TASK: meta (flush_handlers) 15330 1726882291.43213: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882291.43216: getting variables 15330 1726882291.43218: in VariableManager get_vars() 15330 1726882291.43240: Calling all_inventory to load vars for managed_node3 15330 1726882291.43242: Calling groups_inventory to load vars for managed_node3 15330 1726882291.43245: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.43504: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.43508: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.43511: Calling groups_plugins_play to load vars for managed_node3 15330 1726882291.44207: done sending task result for task 12673a56-9f93-e4fe-1358-0000000004c5 15330 1726882291.44211: WORKER PROCESS EXITING 15330 1726882291.46075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.49606: done with get_vars() 15330 1726882291.49629: done getting variables 15330 1726882291.49903: in VariableManager get_vars() 15330 1726882291.49914: Calling all_inventory to load vars for managed_node3 15330 1726882291.49916: Calling groups_inventory to load vars for managed_node3 15330 1726882291.49919: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.49924: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.49927: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.49930: Calling groups_plugins_play to load vars for managed_node3 15330 1726882291.52111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.55496: done with get_vars() 15330 1726882291.55526: done queuing things up, now waiting for results queue to drain 15330 
1726882291.55528: results queue empty 15330 1726882291.55529: checking for any_errors_fatal 15330 1726882291.55534: done checking for any_errors_fatal 15330 1726882291.55534: checking for max_fail_percentage 15330 1726882291.55535: done checking for max_fail_percentage 15330 1726882291.55536: checking to see if all hosts have failed and the running result is not ok 15330 1726882291.55541: done checking to see if all hosts have failed 15330 1726882291.55542: getting the remaining hosts for this loop 15330 1726882291.55543: done getting the remaining hosts for this loop 15330 1726882291.55660: getting the next task for host managed_node3 15330 1726882291.55666: done getting next task for host managed_node3 15330 1726882291.55669: ^ task is: TASK: Include the task '{{ task }}' 15330 1726882291.55670: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882291.55673: getting variables 15330 1726882291.55674: in VariableManager get_vars() 15330 1726882291.55683: Calling all_inventory to load vars for managed_node3 15330 1726882291.55688: Calling groups_inventory to load vars for managed_node3 15330 1726882291.55691: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.55698: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.55701: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.55704: Calling groups_plugins_play to load vars for managed_node3 15330 1726882291.58453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.60362: done with get_vars() 15330 1726882291.60382: done getting variables 15330 1726882291.60552: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:31:31 -0400 (0:00:01.085) 0:00:40.811 ****** 15330 1726882291.60588: entering _queue_task() for managed_node3/include_tasks 15330 1726882291.61125: worker is 1 (out of 1 available) 15330 1726882291.61132: exiting _queue_task() for managed_node3/include_tasks 15330 1726882291.61142: done queuing things up, now waiting for results queue to drain 15330 1726882291.61143: waiting for pending results... 
15330 1726882291.61257: running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_device_absent.yml' 15330 1726882291.61396: in run() - task 12673a56-9f93-e4fe-1358-000000000077 15330 1726882291.61418: variable 'ansible_search_path' from source: unknown 15330 1726882291.61463: calling self._execute() 15330 1726882291.61574: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882291.61596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882291.61618: variable 'omit' from source: magic vars 15330 1726882291.62266: variable 'ansible_distribution_major_version' from source: facts 15330 1726882291.62290: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882291.62306: variable 'task' from source: play vars 15330 1726882291.62389: variable 'task' from source: play vars 15330 1726882291.62459: _execute() done 15330 1726882291.62463: dumping result to json 15330 1726882291.62471: done dumping result, returning 15330 1726882291.62474: done running TaskExecutor() for managed_node3/TASK: Include the task 'tasks/assert_device_absent.yml' [12673a56-9f93-e4fe-1358-000000000077] 15330 1726882291.62477: sending task result for task 12673a56-9f93-e4fe-1358-000000000077 15330 1726882291.62555: done sending task result for task 12673a56-9f93-e4fe-1358-000000000077 15330 1726882291.62559: WORKER PROCESS EXITING 15330 1726882291.62606: no more pending results, returning what we have 15330 1726882291.62611: in VariableManager get_vars() 15330 1726882291.62646: Calling all_inventory to load vars for managed_node3 15330 1726882291.62650: Calling groups_inventory to load vars for managed_node3 15330 1726882291.62653: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.62668: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.62676: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.62679: Calling 
groups_plugins_play to load vars for managed_node3 15330 1726882291.64491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.66716: done with get_vars() 15330 1726882291.66737: variable 'ansible_search_path' from source: unknown 15330 1726882291.66751: we have included files to process 15330 1726882291.66752: generating all_blocks data 15330 1726882291.66754: done generating all_blocks data 15330 1726882291.66754: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15330 1726882291.66756: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15330 1726882291.66758: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15330 1726882291.67070: in VariableManager get_vars() 15330 1726882291.67091: done with get_vars() 15330 1726882291.67406: done processing included file 15330 1726882291.67408: iterating over new_blocks loaded from include file 15330 1726882291.67409: in VariableManager get_vars() 15330 1726882291.67421: done with get_vars() 15330 1726882291.67422: filtering new block on tags 15330 1726882291.67440: done filtering new block on tags 15330 1726882291.67442: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 15330 1726882291.67448: extending task lists for all hosts with included blocks 15330 1726882291.67477: done extending task lists 15330 1726882291.67478: done processing included files 15330 1726882291.67479: results queue empty 15330 1726882291.67479: checking for any_errors_fatal 15330 1726882291.67481: done checking for any_errors_fatal 15330 
1726882291.67481: checking for max_fail_percentage 15330 1726882291.67482: done checking for max_fail_percentage 15330 1726882291.67483: checking to see if all hosts have failed and the running result is not ok 15330 1726882291.67484: done checking to see if all hosts have failed 15330 1726882291.67485: getting the remaining hosts for this loop 15330 1726882291.67489: done getting the remaining hosts for this loop 15330 1726882291.67491: getting the next task for host managed_node3 15330 1726882291.67497: done getting next task for host managed_node3 15330 1726882291.67499: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15330 1726882291.67502: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882291.67504: getting variables 15330 1726882291.67505: in VariableManager get_vars() 15330 1726882291.67513: Calling all_inventory to load vars for managed_node3 15330 1726882291.67515: Calling groups_inventory to load vars for managed_node3 15330 1726882291.67517: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.67523: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.67525: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.67527: Calling groups_plugins_play to load vars for managed_node3 15330 1726882291.78612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.81561: done with get_vars() 15330 1726882291.81590: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:31:31 -0400 (0:00:00.212) 0:00:41.024 ****** 15330 1726882291.81862: entering _queue_task() for managed_node3/include_tasks 15330 1726882291.82631: worker is 1 (out of 1 available) 15330 1726882291.82643: exiting _queue_task() for managed_node3/include_tasks 15330 1726882291.82654: done queuing things up, now waiting for results queue to drain 15330 1726882291.82656: waiting for pending results... 
15330 1726882291.82966: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 15330 1726882291.83301: in run() - task 12673a56-9f93-e4fe-1358-0000000004d6 15330 1726882291.83310: variable 'ansible_search_path' from source: unknown 15330 1726882291.83316: variable 'ansible_search_path' from source: unknown 15330 1726882291.83320: calling self._execute() 15330 1726882291.83474: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882291.83603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882291.83607: variable 'omit' from source: magic vars 15330 1726882291.84400: variable 'ansible_distribution_major_version' from source: facts 15330 1726882291.84403: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882291.84406: _execute() done 15330 1726882291.84408: dumping result to json 15330 1726882291.84411: done dumping result, returning 15330 1726882291.84414: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-e4fe-1358-0000000004d6] 15330 1726882291.84417: sending task result for task 12673a56-9f93-e4fe-1358-0000000004d6 15330 1726882291.84491: done sending task result for task 12673a56-9f93-e4fe-1358-0000000004d6 15330 1726882291.84522: no more pending results, returning what we have 15330 1726882291.84527: in VariableManager get_vars() 15330 1726882291.84561: Calling all_inventory to load vars for managed_node3 15330 1726882291.84564: Calling groups_inventory to load vars for managed_node3 15330 1726882291.84567: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.84574: WORKER PROCESS EXITING 15330 1726882291.84795: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.84799: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.84803: Calling groups_plugins_play to load vars for managed_node3 15330 
1726882291.87188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.88926: done with get_vars() 15330 1726882291.88982: variable 'ansible_search_path' from source: unknown 15330 1726882291.88983: variable 'ansible_search_path' from source: unknown 15330 1726882291.88997: variable 'task' from source: play vars 15330 1726882291.89239: variable 'task' from source: play vars 15330 1726882291.89273: we have included files to process 15330 1726882291.89274: generating all_blocks data 15330 1726882291.89276: done generating all_blocks data 15330 1726882291.89278: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882291.89404: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882291.89409: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15330 1726882291.89598: done processing included file 15330 1726882291.89600: iterating over new_blocks loaded from include file 15330 1726882291.89601: in VariableManager get_vars() 15330 1726882291.89624: done with get_vars() 15330 1726882291.89626: filtering new block on tags 15330 1726882291.89642: done filtering new block on tags 15330 1726882291.89644: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 15330 1726882291.89649: extending task lists for all hosts with included blocks 15330 1726882291.89763: done extending task lists 15330 1726882291.89765: done processing included files 15330 1726882291.89766: results queue empty 15330 1726882291.89767: checking for any_errors_fatal 15330 1726882291.89770: done checking 
for any_errors_fatal 15330 1726882291.89771: checking for max_fail_percentage 15330 1726882291.89772: done checking for max_fail_percentage 15330 1726882291.89773: checking to see if all hosts have failed and the running result is not ok 15330 1726882291.89774: done checking to see if all hosts have failed 15330 1726882291.89774: getting the remaining hosts for this loop 15330 1726882291.89775: done getting the remaining hosts for this loop 15330 1726882291.89778: getting the next task for host managed_node3 15330 1726882291.89782: done getting next task for host managed_node3 15330 1726882291.89784: ^ task is: TASK: Get stat for interface {{ interface }} 15330 1726882291.89787: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882291.89789: getting variables 15330 1726882291.89790: in VariableManager get_vars() 15330 1726882291.89801: Calling all_inventory to load vars for managed_node3 15330 1726882291.89804: Calling groups_inventory to load vars for managed_node3 15330 1726882291.89806: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882291.89811: Calling all_plugins_play to load vars for managed_node3 15330 1726882291.89814: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882291.89816: Calling groups_plugins_play to load vars for managed_node3 15330 1726882291.91248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882291.93141: done with get_vars() 15330 1726882291.93161: done getting variables 15330 1726882291.93383: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:31:31 -0400 (0:00:00.115) 0:00:41.140 ****** 15330 1726882291.93459: entering _queue_task() for managed_node3/stat 15330 1726882291.94022: worker is 1 (out of 1 available) 15330 1726882291.94033: exiting _queue_task() for managed_node3/stat 15330 1726882291.94045: done queuing things up, now waiting for results queue to drain 15330 1726882291.94046: waiting for pending results... 
15330 1726882291.94284: running TaskExecutor() for managed_node3/TASK: Get stat for interface LSR-TST-br31 15330 1726882291.94395: in run() - task 12673a56-9f93-e4fe-1358-0000000004e1 15330 1726882291.94425: variable 'ansible_search_path' from source: unknown 15330 1726882291.94527: variable 'ansible_search_path' from source: unknown 15330 1726882291.94532: calling self._execute() 15330 1726882291.94580: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882291.94635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882291.94640: variable 'omit' from source: magic vars 15330 1726882291.95053: variable 'ansible_distribution_major_version' from source: facts 15330 1726882291.95075: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882291.95098: variable 'omit' from source: magic vars 15330 1726882291.95231: variable 'omit' from source: magic vars 15330 1726882291.95342: variable 'interface' from source: set_fact 15330 1726882291.95400: variable 'omit' from source: magic vars 15330 1726882291.95436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882291.95509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882291.95512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882291.95532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882291.95547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882291.95581: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882291.95591: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882291.95617: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882291.95716: Set connection var ansible_pipelining to False 15330 1726882291.95798: Set connection var ansible_timeout to 10 15330 1726882291.95801: Set connection var ansible_connection to ssh 15330 1726882291.95803: Set connection var ansible_shell_type to sh 15330 1726882291.95805: Set connection var ansible_shell_executable to /bin/sh 15330 1726882291.95807: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882291.95810: variable 'ansible_shell_executable' from source: unknown 15330 1726882291.95812: variable 'ansible_connection' from source: unknown 15330 1726882291.95814: variable 'ansible_module_compression' from source: unknown 15330 1726882291.95816: variable 'ansible_shell_type' from source: unknown 15330 1726882291.95818: variable 'ansible_shell_executable' from source: unknown 15330 1726882291.95820: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882291.95822: variable 'ansible_pipelining' from source: unknown 15330 1726882291.95824: variable 'ansible_timeout' from source: unknown 15330 1726882291.95826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882291.96035: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15330 1726882291.96064: variable 'omit' from source: magic vars 15330 1726882291.96162: starting attempt loop 15330 1726882291.96165: running the handler 15330 1726882291.96167: _low_level_execute_command(): starting 15330 1726882291.96171: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882291.96917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882291.96971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882291.96990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882291.97025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882291.97151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882291.98834: stdout chunk (state=3): >>>/root <<< 15330 1726882291.98964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882291.98988: stderr chunk (state=3): >>><<< 15330 1726882291.98991: stdout chunk (state=3): >>><<< 15330 1726882291.99145: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882291.99153: _low_level_execute_command(): starting 15330 1726882291.99156: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857 `" && echo ansible-tmp-1726882291.99061-17157-68010658384857="` echo /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857 `" ) && sleep 0' 15330 1726882291.99713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882291.99716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882291.99822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882291.99836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882291.99890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.01777: stdout chunk (state=3): >>>ansible-tmp-1726882291.99061-17157-68010658384857=/root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857 <<< 15330 1726882292.01874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.02202: stderr chunk (state=3): >>><<< 15330 1726882292.02206: stdout chunk (state=3): >>><<< 15330 1726882292.02209: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882291.99061-17157-68010658384857=/root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882292.02212: variable 'ansible_module_compression' from source: unknown 15330 1726882292.02223: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15330 1726882292.02267: variable 'ansible_facts' from source: unknown 15330 1726882292.02548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py 15330 1726882292.02786: Sending initial data 15330 1726882292.02886: Sent initial data (150 bytes) 15330 1726882292.03540: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.03574: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882292.03591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.03617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.03702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.05258: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882292.05312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882292.05413: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp0y4fz4oi /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py <<< 15330 1726882292.05455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py" <<< 15330 1726882292.05547: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp0y4fz4oi" to remote "/root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py" <<< 15330 1726882292.06588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.06648: stderr chunk (state=3): >>><<< 15330 1726882292.06652: stdout chunk (state=3): >>><<< 15330 1726882292.06727: done transferring module to remote 15330 1726882292.06731: _low_level_execute_command(): starting 15330 1726882292.06734: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/ /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py && sleep 0' 15330 1726882292.07336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882292.07339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.07368: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882292.07372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.07448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.07489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.09219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.09222: stdout chunk (state=3): >>><<< 15330 1726882292.09224: stderr chunk (state=3): >>><<< 15330 1726882292.09341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882292.09350: _low_level_execute_command(): starting 15330 1726882292.09353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/AnsiballZ_stat.py && sleep 0' 15330 1726882292.10036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.10039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882292.10041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.10064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.10067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.24943: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15330 1726882292.26251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882292.26269: stderr chunk (state=3): >>><<< 15330 1726882292.26298: stdout chunk (state=3): >>><<< 15330 1726882292.26313: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.10.229 closed. 15330 1726882292.26335: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882292.26344: _low_level_execute_command(): starting 15330 1726882292.26352: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882291.99061-17157-68010658384857/ > /dev/null 2>&1 && sleep 0' 15330 1726882292.26837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882292.26841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882292.26843: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.26845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882292.26847: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.26899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882292.26906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.26909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.26954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.28747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.28770: stderr chunk (state=3): >>><<< 15330 1726882292.28774: stdout chunk (state=3): >>><<< 15330 1726882292.28789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882292.28796: handler run complete 15330 1726882292.28812: attempt loop complete, returning result 15330 1726882292.28815: _execute() done 15330 1726882292.28817: dumping result to json 15330 1726882292.28823: done dumping result, returning 15330 1726882292.28830: done running TaskExecutor() for managed_node3/TASK: Get stat for interface LSR-TST-br31 [12673a56-9f93-e4fe-1358-0000000004e1] 15330 1726882292.28833: sending task result for task 12673a56-9f93-e4fe-1358-0000000004e1 15330 1726882292.28925: done sending task result for task 12673a56-9f93-e4fe-1358-0000000004e1 15330 1726882292.28927: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 15330 1726882292.28984: no more pending results, returning what we have 15330 1726882292.28989: results queue empty 15330 1726882292.28990: checking for any_errors_fatal 15330 1726882292.28992: done checking for any_errors_fatal 15330 1726882292.28994: checking for max_fail_percentage 15330 1726882292.28996: done checking for max_fail_percentage 15330 1726882292.28997: checking to see if all hosts have failed and the running result is not ok 15330 1726882292.28998: done checking to see if all hosts have failed 15330 1726882292.28998: getting the remaining hosts for this loop 15330 1726882292.28999: done getting the remaining hosts for this loop 15330 1726882292.29003: getting the next task for host managed_node3 15330 1726882292.29011: done getting next task for host managed_node3 15330 1726882292.29014: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15330 1726882292.29016: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882292.29020: getting variables 15330 1726882292.29022: in VariableManager get_vars() 15330 1726882292.29053: Calling all_inventory to load vars for managed_node3 15330 1726882292.29055: Calling groups_inventory to load vars for managed_node3 15330 1726882292.29059: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882292.29069: Calling all_plugins_play to load vars for managed_node3 15330 1726882292.29072: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882292.29074: Calling groups_plugins_play to load vars for managed_node3 15330 1726882292.30067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882292.31449: done with get_vars() 15330 1726882292.31464: done getting variables 15330 1726882292.31541: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15330 1726882292.31679: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:31:32 -0400 (0:00:00.382) 0:00:41.523 ****** 15330 1726882292.31715: entering _queue_task() for managed_node3/assert 
15330 1726882292.31945: worker is 1 (out of 1 available) 15330 1726882292.31956: exiting _queue_task() for managed_node3/assert 15330 1726882292.31967: done queuing things up, now waiting for results queue to drain 15330 1726882292.31968: waiting for pending results... 15330 1726882292.32181: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15330 1726882292.32289: in run() - task 12673a56-9f93-e4fe-1358-0000000004d7 15330 1726882292.32311: variable 'ansible_search_path' from source: unknown 15330 1726882292.32315: variable 'ansible_search_path' from source: unknown 15330 1726882292.32352: calling self._execute() 15330 1726882292.32438: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882292.32442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882292.32468: variable 'omit' from source: magic vars 15330 1726882292.32764: variable 'ansible_distribution_major_version' from source: facts 15330 1726882292.32775: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882292.32778: variable 'omit' from source: magic vars 15330 1726882292.32809: variable 'omit' from source: magic vars 15330 1726882292.32901: variable 'interface' from source: set_fact 15330 1726882292.32938: variable 'omit' from source: magic vars 15330 1726882292.33005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882292.33027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882292.33051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882292.33081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882292.33084: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882292.33143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882292.33150: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882292.33157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882292.33254: Set connection var ansible_pipelining to False 15330 1726882292.33278: Set connection var ansible_timeout to 10 15330 1726882292.33282: Set connection var ansible_connection to ssh 15330 1726882292.33284: Set connection var ansible_shell_type to sh 15330 1726882292.33289: Set connection var ansible_shell_executable to /bin/sh 15330 1726882292.33292: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882292.33320: variable 'ansible_shell_executable' from source: unknown 15330 1726882292.33323: variable 'ansible_connection' from source: unknown 15330 1726882292.33326: variable 'ansible_module_compression' from source: unknown 15330 1726882292.33328: variable 'ansible_shell_type' from source: unknown 15330 1726882292.33330: variable 'ansible_shell_executable' from source: unknown 15330 1726882292.33336: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882292.33339: variable 'ansible_pipelining' from source: unknown 15330 1726882292.33365: variable 'ansible_timeout' from source: unknown 15330 1726882292.33368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882292.33528: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882292.33550: variable 'omit' from source: magic vars 15330 1726882292.33553: starting 
attempt loop 15330 1726882292.33555: running the handler 15330 1726882292.33689: variable 'interface_stat' from source: set_fact 15330 1726882292.33694: Evaluated conditional (not interface_stat.stat.exists): True 15330 1726882292.33697: handler run complete 15330 1726882292.33718: attempt loop complete, returning result 15330 1726882292.33721: _execute() done 15330 1726882292.33723: dumping result to json 15330 1726882292.33726: done dumping result, returning 15330 1726882292.33728: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' [12673a56-9f93-e4fe-1358-0000000004d7] 15330 1726882292.33735: sending task result for task 12673a56-9f93-e4fe-1358-0000000004d7 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 15330 1726882292.33883: no more pending results, returning what we have 15330 1726882292.33888: results queue empty 15330 1726882292.33889: checking for any_errors_fatal 15330 1726882292.33897: done checking for any_errors_fatal 15330 1726882292.33898: checking for max_fail_percentage 15330 1726882292.33904: done checking for max_fail_percentage 15330 1726882292.33905: checking to see if all hosts have failed and the running result is not ok 15330 1726882292.33906: done checking to see if all hosts have failed 15330 1726882292.33907: getting the remaining hosts for this loop 15330 1726882292.33908: done getting the remaining hosts for this loop 15330 1726882292.33911: getting the next task for host managed_node3 15330 1726882292.33918: done getting next task for host managed_node3 15330 1726882292.33921: ^ task is: TASK: meta (flush_handlers) 15330 1726882292.33924: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882292.33928: getting variables 15330 1726882292.33930: in VariableManager get_vars() 15330 1726882292.33953: Calling all_inventory to load vars for managed_node3 15330 1726882292.33955: Calling groups_inventory to load vars for managed_node3 15330 1726882292.33958: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882292.33967: Calling all_plugins_play to load vars for managed_node3 15330 1726882292.33969: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882292.33975: Calling groups_plugins_play to load vars for managed_node3 15330 1726882292.34506: done sending task result for task 12673a56-9f93-e4fe-1358-0000000004d7 15330 1726882292.34510: WORKER PROCESS EXITING 15330 1726882292.34979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882292.36063: done with get_vars() 15330 1726882292.36090: done getting variables 15330 1726882292.36162: in VariableManager get_vars() 15330 1726882292.36169: Calling all_inventory to load vars for managed_node3 15330 1726882292.36170: Calling groups_inventory to load vars for managed_node3 15330 1726882292.36172: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882292.36175: Calling all_plugins_play to load vars for managed_node3 15330 1726882292.36176: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882292.36178: Calling groups_plugins_play to load vars for managed_node3 15330 1726882292.37136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882292.38230: done with get_vars() 15330 1726882292.38261: done queuing things up, now waiting for results queue to drain 15330 1726882292.38262: results queue empty 15330 1726882292.38266: checking for any_errors_fatal 15330 1726882292.38269: done checking for any_errors_fatal 15330 1726882292.38271: checking for max_fail_percentage 15330 
1726882292.38272: done checking for max_fail_percentage 15330 1726882292.38273: checking to see if all hosts have failed and the running result is not ok 15330 1726882292.38273: done checking to see if all hosts have failed 15330 1726882292.38278: getting the remaining hosts for this loop 15330 1726882292.38279: done getting the remaining hosts for this loop 15330 1726882292.38287: getting the next task for host managed_node3 15330 1726882292.38291: done getting next task for host managed_node3 15330 1726882292.38292: ^ task is: TASK: meta (flush_handlers) 15330 1726882292.38295: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882292.38298: getting variables 15330 1726882292.38299: in VariableManager get_vars() 15330 1726882292.38307: Calling all_inventory to load vars for managed_node3 15330 1726882292.38309: Calling groups_inventory to load vars for managed_node3 15330 1726882292.38311: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882292.38315: Calling all_plugins_play to load vars for managed_node3 15330 1726882292.38317: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882292.38319: Calling groups_plugins_play to load vars for managed_node3 15330 1726882292.39109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882292.40045: done with get_vars() 15330 1726882292.40058: done getting variables 15330 1726882292.40092: in VariableManager get_vars() 15330 1726882292.40099: Calling all_inventory to load vars for managed_node3 15330 1726882292.40101: Calling groups_inventory to load vars for managed_node3 15330 1726882292.40102: Calling all_plugins_inventory to load vars for managed_node3 15330 
1726882292.40105: Calling all_plugins_play to load vars for managed_node3 15330 1726882292.40107: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882292.40108: Calling groups_plugins_play to load vars for managed_node3 15330 1726882292.40827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882292.41873: done with get_vars() 15330 1726882292.41892: done queuing things up, now waiting for results queue to drain 15330 1726882292.41895: results queue empty 15330 1726882292.41895: checking for any_errors_fatal 15330 1726882292.41896: done checking for any_errors_fatal 15330 1726882292.41896: checking for max_fail_percentage 15330 1726882292.41897: done checking for max_fail_percentage 15330 1726882292.41898: checking to see if all hosts have failed and the running result is not ok 15330 1726882292.41898: done checking to see if all hosts have failed 15330 1726882292.41899: getting the remaining hosts for this loop 15330 1726882292.41899: done getting the remaining hosts for this loop 15330 1726882292.41901: getting the next task for host managed_node3 15330 1726882292.41903: done getting next task for host managed_node3 15330 1726882292.41903: ^ task is: None 15330 1726882292.41904: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882292.41905: done queuing things up, now waiting for results queue to drain 15330 1726882292.41905: results queue empty 15330 1726882292.41906: checking for any_errors_fatal 15330 1726882292.41906: done checking for any_errors_fatal 15330 1726882292.41906: checking for max_fail_percentage 15330 1726882292.41907: done checking for max_fail_percentage 15330 1726882292.41907: checking to see if all hosts have failed and the running result is not ok 15330 1726882292.41908: done checking to see if all hosts have failed 15330 1726882292.41909: getting the next task for host managed_node3 15330 1726882292.41910: done getting next task for host managed_node3 15330 1726882292.41910: ^ task is: None 15330 1726882292.41911: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882292.41941: in VariableManager get_vars() 15330 1726882292.41951: done with get_vars() 15330 1726882292.41957: in VariableManager get_vars() 15330 1726882292.41963: done with get_vars() 15330 1726882292.41966: variable 'omit' from source: magic vars 15330 1726882292.41985: in VariableManager get_vars() 15330 1726882292.41994: done with get_vars() 15330 1726882292.42016: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15330 1726882292.42173: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15330 1726882292.42199: getting the remaining hosts for this loop 15330 1726882292.42201: done getting the remaining hosts for this loop 15330 1726882292.42207: getting the next task for host managed_node3 15330 1726882292.42209: done getting next task for host managed_node3 15330 1726882292.42211: ^ task is: TASK: Gathering Facts 15330 1726882292.42215: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882292.42217: getting variables 15330 1726882292.42218: in VariableManager get_vars() 15330 1726882292.42226: Calling all_inventory to load vars for managed_node3 15330 1726882292.42231: Calling groups_inventory to load vars for managed_node3 15330 1726882292.42234: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882292.42239: Calling all_plugins_play to load vars for managed_node3 15330 1726882292.42241: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882292.42244: Calling groups_plugins_play to load vars for managed_node3 15330 1726882292.43099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882292.44241: done with get_vars() 15330 1726882292.44259: done getting variables 15330 1726882292.44297: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Friday 20 September 2024 21:31:32 -0400 (0:00:00.125) 0:00:41.649 ****** 15330 1726882292.44314: entering _queue_task() for managed_node3/gather_facts 15330 1726882292.44614: worker is 1 (out of 1 available) 15330 1726882292.44626: exiting _queue_task() for managed_node3/gather_facts 15330 1726882292.44637: done queuing things up, now waiting for results queue to drain 15330 1726882292.44639: waiting for pending results... 
15330 1726882292.44818: running TaskExecutor() for managed_node3/TASK: Gathering Facts 15330 1726882292.44901: in run() - task 12673a56-9f93-e4fe-1358-0000000004fa 15330 1726882292.44933: variable 'ansible_search_path' from source: unknown 15330 1726882292.44953: calling self._execute() 15330 1726882292.45049: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882292.45063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882292.45092: variable 'omit' from source: magic vars 15330 1726882292.45417: variable 'ansible_distribution_major_version' from source: facts 15330 1726882292.45480: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882292.45483: variable 'omit' from source: magic vars 15330 1726882292.45485: variable 'omit' from source: magic vars 15330 1726882292.45506: variable 'omit' from source: magic vars 15330 1726882292.45538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882292.45566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882292.45591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882292.45614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882292.45620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882292.45644: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882292.45647: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882292.45651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882292.45724: Set connection var ansible_pipelining to False 15330 1726882292.45735: Set 
connection var ansible_timeout to 10 15330 1726882292.45749: Set connection var ansible_connection to ssh 15330 1726882292.45752: Set connection var ansible_shell_type to sh 15330 1726882292.45755: Set connection var ansible_shell_executable to /bin/sh 15330 1726882292.45757: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882292.45777: variable 'ansible_shell_executable' from source: unknown 15330 1726882292.45781: variable 'ansible_connection' from source: unknown 15330 1726882292.45783: variable 'ansible_module_compression' from source: unknown 15330 1726882292.45786: variable 'ansible_shell_type' from source: unknown 15330 1726882292.45789: variable 'ansible_shell_executable' from source: unknown 15330 1726882292.45792: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882292.45799: variable 'ansible_pipelining' from source: unknown 15330 1726882292.45801: variable 'ansible_timeout' from source: unknown 15330 1726882292.45803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882292.45936: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882292.45944: variable 'omit' from source: magic vars 15330 1726882292.45949: starting attempt loop 15330 1726882292.45952: running the handler 15330 1726882292.45964: variable 'ansible_facts' from source: unknown 15330 1726882292.45980: _low_level_execute_command(): starting 15330 1726882292.45989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882292.46511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15330 1726882292.46515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.46518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882292.46520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.46575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882292.46578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.46583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.46629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.48235: stdout chunk (state=3): >>>/root <<< 15330 1726882292.48334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.48363: stderr chunk (state=3): >>><<< 15330 1726882292.48366: stdout chunk (state=3): >>><<< 15330 1726882292.48394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882292.48406: _low_level_execute_command(): starting 15330 1726882292.48411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246 `" && echo ansible-tmp-1726882292.483923-17180-234755727859246="` echo /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246 `" ) && sleep 0' 15330 1726882292.48990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882292.48994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882292.48997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.49005: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882292.49008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.49111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.49155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.51037: stdout chunk (state=3): >>>ansible-tmp-1726882292.483923-17180-234755727859246=/root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246 <<< 15330 1726882292.51148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.51401: stderr chunk (state=3): >>><<< 15330 1726882292.51405: stdout chunk (state=3): >>><<< 15330 1726882292.51409: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882292.483923-17180-234755727859246=/root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882292.51411: variable 'ansible_module_compression' from source: unknown 15330 1726882292.51413: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15330 1726882292.51552: variable 'ansible_facts' from source: unknown 15330 1726882292.51802: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py 15330 1726882292.51908: Sending initial data 15330 1726882292.51911: Sent initial data (153 bytes) 15330 1726882292.52343: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882292.52346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882292.52348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15330 1726882292.52352: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.52410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882292.52417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.52419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.52464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.54128: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882292.54177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882292.54229: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmpvtvuq7vd /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py <<< 15330 1726882292.54233: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py" <<< 15330 1726882292.54268: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmpvtvuq7vd" to remote "/root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py" <<< 15330 1726882292.55745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.55815: stderr chunk (state=3): >>><<< 15330 1726882292.55825: stdout chunk (state=3): >>><<< 15330 1726882292.55937: done transferring module to remote 15330 1726882292.55940: _low_level_execute_command(): starting 15330 1726882292.55942: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/ /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py && sleep 0' 15330 1726882292.56464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882292.56514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882292.56598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.56640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882292.56711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882292.58492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882292.58498: stdout chunk (state=3): >>><<< 15330 1726882292.58500: stderr chunk (state=3): >>><<< 15330 1726882292.58601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882292.58605: _low_level_execute_command(): starting 15330 1726882292.58607: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/AnsiballZ_setup.py && sleep 0' 15330 1726882292.59168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882292.59180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882292.59196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882292.59261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882292.59316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882292.59331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882292.59355: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15330 1726882292.59439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.22988: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "32", "epoch": "1726882292", "epoch_int": "1726882292", "date": "2024-09-20", "time": "21:31:32", "iso8601_micro": "2024-09-21T01:31:32.868042Z", "iso8601": "2024-09-21T01:31:32Z", "iso8601_basic": "20240920T213132868042", 
"iso8601_basic_short": "20240920T213132", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixe<<< 15330 1726882293.23006: stdout chunk (state=3): >>>d]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": 
"ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2973, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 558, "free": 2973}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 600, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803069440, "block_size": 4096, "block_total": 65519099, "block_available": 63916765, "block_used": 1602334, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 1.02099609375, "5m": 0.533203125, "15m": 0.2490234375}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15330 1726882293.24990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882293.25024: stderr chunk (state=3): >>><<< 15330 1726882293.25028: stdout chunk (state=3): >>><<< 15330 1726882293.25063: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super 
User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "31", "second": "32", "epoch": "1726882292", "epoch_int": "1726882292", "date": "2024-09-20", "time": "21:31:32", "iso8601_micro": "2024-09-21T01:31:32.868042Z", "iso8601": "2024-09-21T01:31:32Z", "iso8601_basic": "20240920T213132868042", "iso8601_basic_short": "20240920T213132", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2973, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 558, "free": 2973}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", 
"ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 600, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261803069440, "block_size": 4096, "block_total": 65519099, "block_available": 63916765, "block_used": 1602334, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 1.02099609375, "5m": 0.533203125, "15m": 0.2490234375}, "ansible_env": {"SHELL": 
"/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882293.25427: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882293.25432: _low_level_execute_command(): starting 15330 1726882293.25456: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882292.483923-17180-234755727859246/ > /dev/null 2>&1 && sleep 0' 15330 1726882293.25995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 15330 1726882293.26012: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.26015: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 15330 1726882293.26018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.26071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.26075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.26077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.26153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.27945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.27974: stderr chunk (state=3): >>><<< 15330 1726882293.27977: stdout chunk (state=3): >>><<< 15330 1726882293.27995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.28004: handler run complete 15330 1726882293.28075: variable 'ansible_facts' from source: unknown 15330 1726882293.28148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.28323: variable 'ansible_facts' from source: unknown 15330 1726882293.28375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.28454: attempt loop complete, returning result 15330 1726882293.28457: _execute() done 15330 1726882293.28460: dumping result to json 15330 1726882293.28479: done dumping result, returning 15330 1726882293.28488: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-e4fe-1358-0000000004fa] 15330 1726882293.28491: sending task result for task 12673a56-9f93-e4fe-1358-0000000004fa ok: [managed_node3] 15330 1726882293.29041: no more pending results, returning what we have 15330 1726882293.29043: results queue empty 15330 1726882293.29044: checking for any_errors_fatal 15330 1726882293.29044: done checking for any_errors_fatal 15330 1726882293.29045: checking for max_fail_percentage 15330 1726882293.29046: done checking for max_fail_percentage 15330 1726882293.29047: checking to see if all hosts have failed and the running result is not ok 15330 1726882293.29047: done checking to see if all hosts have failed 15330 1726882293.29048: getting the remaining hosts for this loop 15330 1726882293.29048: done getting the remaining hosts for this loop 15330 1726882293.29050: getting the next task for host managed_node3 15330 1726882293.29054: done getting next task for host managed_node3 15330 1726882293.29055: ^ task is: TASK: meta 
(flush_handlers) 15330 1726882293.29056: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882293.29058: getting variables 15330 1726882293.29059: in VariableManager get_vars() 15330 1726882293.29077: Calling all_inventory to load vars for managed_node3 15330 1726882293.29079: Calling groups_inventory to load vars for managed_node3 15330 1726882293.29081: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882293.29089: Calling all_plugins_play to load vars for managed_node3 15330 1726882293.29091: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882293.29097: done sending task result for task 12673a56-9f93-e4fe-1358-0000000004fa 15330 1726882293.29100: WORKER PROCESS EXITING 15330 1726882293.29104: Calling groups_plugins_play to load vars for managed_node3 15330 1726882293.29861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.31179: done with get_vars() 15330 1726882293.31208: done getting variables 15330 1726882293.31294: in VariableManager get_vars() 15330 1726882293.31301: Calling all_inventory to load vars for managed_node3 15330 1726882293.31303: Calling groups_inventory to load vars for managed_node3 15330 1726882293.31304: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882293.31307: Calling all_plugins_play to load vars for managed_node3 15330 1726882293.31309: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882293.31310: Calling groups_plugins_play to load vars for managed_node3 15330 1726882293.32303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.33849: 
done with get_vars() 15330 1726882293.33877: done queuing things up, now waiting for results queue to drain 15330 1726882293.33879: results queue empty 15330 1726882293.33879: checking for any_errors_fatal 15330 1726882293.33882: done checking for any_errors_fatal 15330 1726882293.33882: checking for max_fail_percentage 15330 1726882293.33889: done checking for max_fail_percentage 15330 1726882293.33890: checking to see if all hosts have failed and the running result is not ok 15330 1726882293.33891: done checking to see if all hosts have failed 15330 1726882293.33891: getting the remaining hosts for this loop 15330 1726882293.33892: done getting the remaining hosts for this loop 15330 1726882293.33895: getting the next task for host managed_node3 15330 1726882293.33898: done getting next task for host managed_node3 15330 1726882293.33900: ^ task is: TASK: Verify network state restored to default 15330 1726882293.33901: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882293.33902: getting variables 15330 1726882293.33903: in VariableManager get_vars() 15330 1726882293.33909: Calling all_inventory to load vars for managed_node3 15330 1726882293.33910: Calling groups_inventory to load vars for managed_node3 15330 1726882293.33912: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882293.33916: Calling all_plugins_play to load vars for managed_node3 15330 1726882293.33917: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882293.33919: Calling groups_plugins_play to load vars for managed_node3 15330 1726882293.34556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.35397: done with get_vars() 15330 1726882293.35410: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Friday 20 September 2024 21:31:33 -0400 (0:00:00.911) 0:00:42.560 ****** 15330 1726882293.35459: entering _queue_task() for managed_node3/include_tasks 15330 1726882293.35736: worker is 1 (out of 1 available) 15330 1726882293.35748: exiting _queue_task() for managed_node3/include_tasks 15330 1726882293.35760: done queuing things up, now waiting for results queue to drain 15330 1726882293.35762: waiting for pending results... 
15330 1726882293.36212: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 15330 1726882293.36217: in run() - task 12673a56-9f93-e4fe-1358-00000000007a 15330 1726882293.36221: variable 'ansible_search_path' from source: unknown 15330 1726882293.36224: calling self._execute() 15330 1726882293.36301: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.36312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.36324: variable 'omit' from source: magic vars 15330 1726882293.36695: variable 'ansible_distribution_major_version' from source: facts 15330 1726882293.36710: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882293.36719: _execute() done 15330 1726882293.36726: dumping result to json 15330 1726882293.36732: done dumping result, returning 15330 1726882293.36740: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [12673a56-9f93-e4fe-1358-00000000007a] 15330 1726882293.36749: sending task result for task 12673a56-9f93-e4fe-1358-00000000007a 15330 1726882293.36906: no more pending results, returning what we have 15330 1726882293.36912: in VariableManager get_vars() 15330 1726882293.36948: Calling all_inventory to load vars for managed_node3 15330 1726882293.36950: Calling groups_inventory to load vars for managed_node3 15330 1726882293.36953: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882293.36969: Calling all_plugins_play to load vars for managed_node3 15330 1726882293.36972: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882293.36976: Calling groups_plugins_play to load vars for managed_node3 15330 1726882293.37807: done sending task result for task 12673a56-9f93-e4fe-1358-00000000007a 15330 1726882293.37810: WORKER PROCESS EXITING 15330 1726882293.38619: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.40183: done with get_vars() 15330 1726882293.40210: variable 'ansible_search_path' from source: unknown 15330 1726882293.40226: we have included files to process 15330 1726882293.40227: generating all_blocks data 15330 1726882293.40229: done generating all_blocks data 15330 1726882293.40230: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15330 1726882293.40231: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15330 1726882293.40233: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15330 1726882293.40653: done processing included file 15330 1726882293.40655: iterating over new_blocks loaded from include file 15330 1726882293.40657: in VariableManager get_vars() 15330 1726882293.40669: done with get_vars() 15330 1726882293.40670: filtering new block on tags 15330 1726882293.40691: done filtering new block on tags 15330 1726882293.40696: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 15330 1726882293.40702: extending task lists for all hosts with included blocks 15330 1726882293.40736: done extending task lists 15330 1726882293.40737: done processing included files 15330 1726882293.40738: results queue empty 15330 1726882293.40739: checking for any_errors_fatal 15330 1726882293.40740: done checking for any_errors_fatal 15330 1726882293.40741: checking for max_fail_percentage 15330 1726882293.40742: done checking for max_fail_percentage 15330 1726882293.40742: checking to see if all hosts have failed and the running 
result is not ok 15330 1726882293.40743: done checking to see if all hosts have failed 15330 1726882293.40744: getting the remaining hosts for this loop 15330 1726882293.40745: done getting the remaining hosts for this loop 15330 1726882293.40748: getting the next task for host managed_node3 15330 1726882293.40752: done getting next task for host managed_node3 15330 1726882293.40754: ^ task is: TASK: Check routes and DNS 15330 1726882293.40757: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882293.40759: getting variables 15330 1726882293.40760: in VariableManager get_vars() 15330 1726882293.40769: Calling all_inventory to load vars for managed_node3 15330 1726882293.40772: Calling groups_inventory to load vars for managed_node3 15330 1726882293.40774: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882293.40780: Calling all_plugins_play to load vars for managed_node3 15330 1726882293.40782: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882293.40788: Calling groups_plugins_play to load vars for managed_node3 15330 1726882293.41933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.43571: done with get_vars() 15330 1726882293.43596: done getting variables 15330 1726882293.43639: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:31:33 -0400 (0:00:00.082) 0:00:42.642 ****** 15330 1726882293.43669: entering _queue_task() for managed_node3/shell 15330 1726882293.44011: worker is 1 (out of 1 available) 15330 1726882293.44023: exiting _queue_task() for managed_node3/shell 15330 1726882293.44035: done queuing things up, now waiting for results queue to drain 15330 1726882293.44036: waiting for pending results... 
15330 1726882293.44297: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 15330 1726882293.44422: in run() - task 12673a56-9f93-e4fe-1358-00000000050b 15330 1726882293.44440: variable 'ansible_search_path' from source: unknown 15330 1726882293.44447: variable 'ansible_search_path' from source: unknown 15330 1726882293.44484: calling self._execute() 15330 1726882293.44582: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.44602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.44619: variable 'omit' from source: magic vars 15330 1726882293.44999: variable 'ansible_distribution_major_version' from source: facts 15330 1726882293.45014: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882293.45026: variable 'omit' from source: magic vars 15330 1726882293.45067: variable 'omit' from source: magic vars 15330 1726882293.45106: variable 'omit' from source: magic vars 15330 1726882293.45144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882293.45192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882293.45226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882293.45251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882293.45269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882293.45314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882293.45323: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.45330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.45443: 
Set connection var ansible_pipelining to False 15330 1726882293.45462: Set connection var ansible_timeout to 10 15330 1726882293.45469: Set connection var ansible_connection to ssh 15330 1726882293.45475: Set connection var ansible_shell_type to sh 15330 1726882293.45488: Set connection var ansible_shell_executable to /bin/sh 15330 1726882293.45505: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882293.45532: variable 'ansible_shell_executable' from source: unknown 15330 1726882293.45540: variable 'ansible_connection' from source: unknown 15330 1726882293.45548: variable 'ansible_module_compression' from source: unknown 15330 1726882293.45554: variable 'ansible_shell_type' from source: unknown 15330 1726882293.45561: variable 'ansible_shell_executable' from source: unknown 15330 1726882293.45568: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.45608: variable 'ansible_pipelining' from source: unknown 15330 1726882293.45612: variable 'ansible_timeout' from source: unknown 15330 1726882293.45615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.45751: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882293.45768: variable 'omit' from source: magic vars 15330 1726882293.45778: starting attempt loop 15330 1726882293.45825: running the handler 15330 1726882293.45829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882293.45832: 
_low_level_execute_command(): starting 15330 1726882293.45842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882293.46713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.46719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.46741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.46834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.48498: stdout chunk (state=3): >>>/root <<< 15330 1726882293.48657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.48661: stdout chunk (state=3): >>><<< 15330 1726882293.48663: stderr chunk (state=3): >>><<< 15330 1726882293.48682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.48707: _low_level_execute_command(): starting 15330 1726882293.48795: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400 `" && echo ansible-tmp-1726882293.486915-17209-83433500997400="` echo /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400 `" ) && sleep 0' 15330 1726882293.49289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882293.49309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.49324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882293.49409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.49449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.49472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.49486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.49565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.51417: stdout chunk (state=3): >>>ansible-tmp-1726882293.486915-17209-83433500997400=/root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400 <<< 15330 1726882293.51547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.51584: stderr chunk (state=3): >>><<< 15330 1726882293.51642: stdout chunk (state=3): >>><<< 15330 1726882293.51646: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882293.486915-17209-83433500997400=/root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.51671: variable 'ansible_module_compression' from source: unknown 15330 1726882293.51726: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15330 1726882293.51774: variable 'ansible_facts' from source: unknown 15330 1726882293.51865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py 15330 1726882293.52100: Sending initial data 15330 1726882293.52103: Sent initial data (154 bytes) 15330 1726882293.52633: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882293.52651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.52698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882293.52711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.52780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.52806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.52881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.54391: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882293.54463: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882293.54540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp8_7x8er4 /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py <<< 15330 1726882293.54544: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py" <<< 15330 1726882293.54583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp8_7x8er4" to remote "/root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py" <<< 15330 1726882293.55365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.55399: stderr chunk (state=3): >>><<< 15330 1726882293.55409: stdout chunk (state=3): >>><<< 15330 1726882293.55559: done transferring module to remote 15330 1726882293.55562: _low_level_execute_command(): starting 15330 1726882293.55565: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/ /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py && sleep 0' 15330 1726882293.56208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.56247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.56260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.56280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.56365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.58107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.58123: stdout chunk (state=3): >>><<< 15330 1726882293.58135: stderr chunk (state=3): >>><<< 15330 1726882293.58155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.58164: _low_level_execute_command(): starting 15330 1726882293.58174: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/AnsiballZ_command.py && sleep 0' 15330 1726882293.58912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882293.58938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.59066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882293.59098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882293.59223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.59254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.59268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.59292: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.59425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.75299: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3034sec preferred_lft 3034sec\n inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:31:33.742714", "end": "2024-09-20 21:31:33.751116", "delta": "0:00:00.008402", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15330 1726882293.76696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 15330 1726882293.76738: stderr chunk (state=3): >>><<< 15330 1726882293.76741: stdout chunk (state=3): >>><<< 15330 1726882293.76753: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3034sec preferred_lft 3034sec\n inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:31:33.742714", "end": "2024-09-20 21:31:33.751116", "delta": "0:00:00.008402", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP 
ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
15330 1726882293.76786: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882293.76801: _low_level_execute_command(): starting 15330 1726882293.76804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882293.486915-17209-83433500997400/ > /dev/null 2>&1 && sleep 0' 15330 1726882293.77378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882293.77381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.77383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882293.77385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.77428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.77449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.77483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.77533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.79498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.79501: stdout chunk (state=3): >>><<< 15330 1726882293.79503: stderr chunk (state=3): >>><<< 15330 1726882293.79507: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.79512: handler run complete 15330 1726882293.79514: Evaluated conditional (False): False 15330 1726882293.79516: attempt loop complete, returning result 15330 1726882293.79517: _execute() done 15330 1726882293.79519: dumping result to json 15330 1726882293.79520: done dumping result, returning 15330 1726882293.79522: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [12673a56-9f93-e4fe-1358-00000000050b] 15330 1726882293.79524: sending task result for task 12673a56-9f93-e4fe-1358-00000000050b 15330 1726882293.79591: done sending task result for task 12673a56-9f93-e4fe-1358-00000000050b 15330 1726882293.79596: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008402", "end": "2024-09-20 21:31:33.751116", "rc": 0, "start": "2024-09-20 21:31:33.742714" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3034sec preferred_lft 3034sec inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute valid_lft forever 
preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15330 1726882293.79737: no more pending results, returning what we have 15330 1726882293.79741: results queue empty 15330 1726882293.79742: checking for any_errors_fatal 15330 1726882293.79743: done checking for any_errors_fatal 15330 1726882293.79744: checking for max_fail_percentage 15330 1726882293.79746: done checking for max_fail_percentage 15330 1726882293.79747: checking to see if all hosts have failed and the running result is not ok 15330 1726882293.79748: done checking to see if all hosts have failed 15330 1726882293.79748: getting the remaining hosts for this loop 15330 1726882293.79750: done getting the remaining hosts for this loop 15330 1726882293.79753: getting the next task for host managed_node3 15330 1726882293.79758: done getting next task for host managed_node3 15330 1726882293.79761: ^ task is: TASK: Verify DNS and network connectivity 15330 1726882293.79764: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
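The `cmd` field in the task result above is JSON-escaped, which makes the script hard to read in the log. A quick decode (the `logged_cmd` literal below is that field copied verbatim) recovers the script as it actually ran on the host:

```python
import json

# "cmd" field from the "Check routes and DNS" result above, copied verbatim
# (JSON-escaped), then decoded so the script reads as it ran on the host.
logged_cmd = (
    r'"set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\n'
    r'echo IP -6 ROUTE\nip -6 route\necho RESOLV\n'
    r'if [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n'
    r' echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n"'
)
script = json.loads(logged_cmd)
print(script)
```

The decoded script simply dumps addresses (`ip a`), IPv4/IPv6 routes, and `/etc/resolv.conf`, which is exactly what appears in the STDOUT section of the result.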
False 15330 1726882293.79767: getting variables 15330 1726882293.79768: in VariableManager get_vars() 15330 1726882293.79799: Calling all_inventory to load vars for managed_node3 15330 1726882293.79805: Calling groups_inventory to load vars for managed_node3 15330 1726882293.79809: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882293.79820: Calling all_plugins_play to load vars for managed_node3 15330 1726882293.79847: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882293.79851: Calling groups_plugins_play to load vars for managed_node3 15330 1726882293.81210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882293.83152: done with get_vars() 15330 1726882293.83171: done getting variables 15330 1726882293.83220: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:31:33 -0400 (0:00:00.395) 0:00:43.038 ****** 15330 1726882293.83243: entering _queue_task() for managed_node3/shell 15330 1726882293.83496: worker is 1 (out of 1 available) 15330 1726882293.83508: exiting _queue_task() for managed_node3/shell 15330 1726882293.83521: done queuing things up, now waiting for results queue to drain 15330 1726882293.83522: waiting for pending results... 
15330 1726882293.83699: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 15330 1726882293.83763: in run() - task 12673a56-9f93-e4fe-1358-00000000050c 15330 1726882293.83775: variable 'ansible_search_path' from source: unknown 15330 1726882293.83779: variable 'ansible_search_path' from source: unknown 15330 1726882293.83811: calling self._execute() 15330 1726882293.83880: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.83890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.83899: variable 'omit' from source: magic vars 15330 1726882293.84169: variable 'ansible_distribution_major_version' from source: facts 15330 1726882293.84178: Evaluated conditional (ansible_distribution_major_version != '6'): True 15330 1726882293.84276: variable 'ansible_facts' from source: unknown 15330 1726882293.84898: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15330 1726882293.84901: variable 'omit' from source: magic vars 15330 1726882293.84907: variable 'omit' from source: magic vars 15330 1726882293.84909: variable 'omit' from source: magic vars 15330 1726882293.84947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15330 1726882293.85017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15330 1726882293.85045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15330 1726882293.85153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882293.85201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15330 1726882293.85260: variable 'inventory_hostname' from source: host vars for 'managed_node3' 15330 1726882293.85269: variable 
'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.85311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.85432: Set connection var ansible_pipelining to False 15330 1726882293.85461: Set connection var ansible_timeout to 10 15330 1726882293.85470: Set connection var ansible_connection to ssh 15330 1726882293.85697: Set connection var ansible_shell_type to sh 15330 1726882293.85700: Set connection var ansible_shell_executable to /bin/sh 15330 1726882293.85702: Set connection var ansible_module_compression to ZIP_DEFLATED 15330 1726882293.85705: variable 'ansible_shell_executable' from source: unknown 15330 1726882293.85706: variable 'ansible_connection' from source: unknown 15330 1726882293.85709: variable 'ansible_module_compression' from source: unknown 15330 1726882293.85710: variable 'ansible_shell_type' from source: unknown 15330 1726882293.85712: variable 'ansible_shell_executable' from source: unknown 15330 1726882293.85714: variable 'ansible_host' from source: host vars for 'managed_node3' 15330 1726882293.85716: variable 'ansible_pipelining' from source: unknown 15330 1726882293.85718: variable 'ansible_timeout' from source: unknown 15330 1726882293.85720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 15330 1726882293.85731: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882293.85747: variable 'omit' from source: magic vars 15330 1726882293.85755: starting attempt loop 15330 1726882293.85761: running the handler 15330 1726882293.85773: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15330 1726882293.85800: _low_level_execute_command(): starting 15330 1726882293.85812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15330 1726882293.86656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882293.86733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.86809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.86843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.86891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.86945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.88522: stdout chunk (state=3): >>>/root <<< 15330 1726882293.88639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.88684: stderr chunk 
(state=3): >>><<< 15330 1726882293.88687: stdout chunk (state=3): >>><<< 15330 1726882293.88819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.88823: _low_level_execute_command(): starting 15330 1726882293.88827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516 `" && echo ansible-tmp-1726882293.8872147-17234-238717878558516="` echo /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516 `" ) && sleep 0' 15330 1726882293.89455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882293.89490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.89627: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.89654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.89730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.91591: stdout chunk (state=3): >>>ansible-tmp-1726882293.8872147-17234-238717878558516=/root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516 <<< 15330 1726882293.91715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.91741: stderr chunk (state=3): >>><<< 15330 1726882293.91744: stdout chunk (state=3): >>><<< 15330 1726882293.91807: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882293.8872147-17234-238717878558516=/root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.91827: variable 'ansible_module_compression' from source: unknown 15330 1726882293.91868: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-153308gpzszpk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15330 1726882293.91915: variable 'ansible_facts' from source: unknown 15330 1726882293.91965: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py 15330 1726882293.92081: Sending initial data 15330 1726882293.92084: Sent initial data (156 bytes) 15330 1726882293.92933: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.92982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.93006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.93051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.93172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.94685: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15330 1726882293.94736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15330 1726882293.94796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-153308gpzszpk/tmp0lt_1g9s /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py <<< 15330 1726882293.94803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py" <<< 15330 1726882293.94860: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-153308gpzszpk/tmp0lt_1g9s" to remote "/root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py" <<< 15330 1726882293.95422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.95547: stderr chunk (state=3): >>><<< 15330 1726882293.95550: stdout chunk (state=3): >>><<< 15330 1726882293.95560: done transferring module to remote 15330 1726882293.95563: _low_level_execute_command(): starting 15330 1726882293.95566: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/ /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py && sleep 0' 15330 1726882293.95970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882293.95973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 15330 1726882293.95978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 
1726882293.95980: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882293.95988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.96036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.96042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882293.96086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882293.97847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882293.97908: stderr chunk (state=3): >>><<< 15330 1726882293.97912: stdout chunk (state=3): >>><<< 15330 1726882293.97945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882293.97957: _low_level_execute_command(): starting 15330 1726882293.97969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/AnsiballZ_command.py && sleep 0' 15330 1726882293.98731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882293.98749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882293.98769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882293.98791: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15330 1726882293.98871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882294.21819: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14746 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7890 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in 
mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:31:34.137512", "end": "2024-09-20 21:31:34.216309", "delta": "0:00:00.078797", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15330 1726882294.23467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
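As with the earlier task, the `cmd` field of this result is easier to read once unescaped. Decoding the field (copied verbatim into `logged_cmd` below) shows the DNS-and-connectivity loop the task ran:

```python
import json

# "cmd" field from the "Verify DNS and network connectivity" result above,
# copied verbatim (JSON-escaped) and decoded into the script that ran.
logged_cmd = (
    r'"set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\n'
    r'for host in mirrors.fedoraproject.org mirrors.centos.org; do\n'
    r' if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n'
    r' if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\n'
    r'done\n"'
)
script = json.loads(logged_cmd)
print(script)
```

Each mirror host must both resolve (`getent hosts`) and answer over HTTPS (`curl`); either failure exits 1, which would fail the task. Here `rc` was 0, and the STDOUT above lists the resolved AAAA records for both names.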
<<< 15330 1726882294.23473: stdout chunk (state=3): >>><<< 15330 1726882294.23475: stderr chunk (state=3): >>><<< 15330 1726882294.23508: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14746 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7890 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:31:34.137512", "end": "2024-09-20 21:31:34.216309", "delta": "0:00:00.078797", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 15330 1726882294.23599: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15330 1726882294.23602: _low_level_execute_command(): starting 15330 1726882294.23605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882293.8872147-17234-238717878558516/ > /dev/null 2>&1 && sleep 0' 15330 1726882294.24222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15330 1726882294.24226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15330 1726882294.24228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882294.24314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15330 1726882294.24318: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882294.24329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15330 1726882294.24335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15330 1726882294.24337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 15330 1726882294.24339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15330 1726882294.24372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15330 1726882294.24420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15330 1726882294.26219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15330 1726882294.26236: stderr chunk (state=3): >>><<< 15330 1726882294.26239: stdout chunk (state=3): >>><<< 15330 1726882294.26254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15330 1726882294.26260: handler run complete 15330 1726882294.26276: Evaluated conditional (False): False 15330 1726882294.26291: attempt loop complete, returning result 15330 1726882294.26297: _execute() done 15330 1726882294.26300: dumping result to json 15330 1726882294.26305: done dumping result, returning 15330 1726882294.26312: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [12673a56-9f93-e4fe-1358-00000000050c] 15330 1726882294.26317: sending task result for task 12673a56-9f93-e4fe-1358-00000000050c 15330 1726882294.26423: done sending task result for task 12673a56-9f93-e4fe-1358-00000000050c 15330 1726882294.26426: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.078797",
    "end": "2024-09-20 21:31:34.216309",
    "rc": 0,
    "start": "2024-09-20 21:31:34.137512"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0  14746      0 --:--:-- --:--:-- --:--:-- 15250
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   7890      0 --:--:-- --:--:-- --:--:--  8083

15330 1726882294.26491: no more pending results, returning what we have 15330 1726882294.26496: results queue empty 15330 1726882294.26497:
checking for any_errors_fatal 15330 1726882294.26505: done checking for any_errors_fatal 15330 1726882294.26506: checking for max_fail_percentage 15330 1726882294.26507: done checking for max_fail_percentage 15330 1726882294.26508: checking to see if all hosts have failed and the running result is not ok 15330 1726882294.26509: done checking to see if all hosts have failed 15330 1726882294.26513: getting the remaining hosts for this loop 15330 1726882294.26515: done getting the remaining hosts for this loop 15330 1726882294.26518: getting the next task for host managed_node3 15330 1726882294.26526: done getting next task for host managed_node3 15330 1726882294.26528: ^ task is: TASK: meta (flush_handlers) 15330 1726882294.26530: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882294.26536: getting variables 15330 1726882294.26538: in VariableManager get_vars() 15330 1726882294.26566: Calling all_inventory to load vars for managed_node3 15330 1726882294.26569: Calling groups_inventory to load vars for managed_node3 15330 1726882294.26572: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882294.26582: Calling all_plugins_play to load vars for managed_node3 15330 1726882294.26585: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882294.26588: Calling groups_plugins_play to load vars for managed_node3 15330 1726882294.27538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882294.28906: done with get_vars() 15330 1726882294.28926: done getting variables 15330 1726882294.28991: in VariableManager get_vars() 15330 1726882294.28999: Calling all_inventory to load vars for managed_node3 15330 1726882294.29000: Calling groups_inventory to load vars for managed_node3 15330 1726882294.29005: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882294.29008: Calling all_plugins_play to load vars for managed_node3 15330 1726882294.29010: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882294.29011: Calling groups_plugins_play to load vars for managed_node3 15330 1726882294.29998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882294.30985: done with get_vars() 15330 1726882294.31008: done queuing things up, now waiting for results queue to drain 15330 1726882294.31009: results queue empty 15330 1726882294.31010: checking for any_errors_fatal 15330 1726882294.31012: done checking for any_errors_fatal 15330 1726882294.31013: checking for max_fail_percentage 15330 1726882294.31013: done checking for max_fail_percentage 15330 1726882294.31014: checking to see if all hosts have failed and the running result is not 
ok 15330 1726882294.31014: done checking to see if all hosts have failed 15330 1726882294.31015: getting the remaining hosts for this loop 15330 1726882294.31015: done getting the remaining hosts for this loop 15330 1726882294.31018: getting the next task for host managed_node3 15330 1726882294.31020: done getting next task for host managed_node3 15330 1726882294.31021: ^ task is: TASK: meta (flush_handlers) 15330 1726882294.31022: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15330 1726882294.31024: getting variables 15330 1726882294.31025: in VariableManager get_vars() 15330 1726882294.31031: Calling all_inventory to load vars for managed_node3 15330 1726882294.31033: Calling groups_inventory to load vars for managed_node3 15330 1726882294.31034: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882294.31038: Calling all_plugins_play to load vars for managed_node3 15330 1726882294.31039: Calling groups_plugins_inventory to load vars for managed_node3 15330 1726882294.31041: Calling groups_plugins_play to load vars for managed_node3 15330 1726882294.31712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882294.32981: done with get_vars() 15330 1726882294.33003: done getting variables 15330 1726882294.33037: in VariableManager get_vars() 15330 1726882294.33043: Calling all_inventory to load vars for managed_node3 15330 1726882294.33044: Calling groups_inventory to load vars for managed_node3 15330 1726882294.33045: Calling all_plugins_inventory to load vars for managed_node3 15330 1726882294.33048: Calling all_plugins_play to load vars for managed_node3 15330 1726882294.33050: Calling groups_plugins_inventory to load vars for 
managed_node3 15330 1726882294.33053: Calling groups_plugins_play to load vars for managed_node3 15330 1726882294.33680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15330 1726882294.34528: done with get_vars() 15330 1726882294.34544: done queuing things up, now waiting for results queue to drain 15330 1726882294.34546: results queue empty 15330 1726882294.34546: checking for any_errors_fatal 15330 1726882294.34547: done checking for any_errors_fatal 15330 1726882294.34548: checking for max_fail_percentage 15330 1726882294.34548: done checking for max_fail_percentage 15330 1726882294.34549: checking to see if all hosts have failed and the running result is not ok 15330 1726882294.34549: done checking to see if all hosts have failed 15330 1726882294.34550: getting the remaining hosts for this loop 15330 1726882294.34550: done getting the remaining hosts for this loop 15330 1726882294.34552: getting the next task for host managed_node3 15330 1726882294.34555: done getting next task for host managed_node3 15330 1726882294.34555: ^ task is: None 15330 1726882294.34556: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15330 1726882294.34557: done queuing things up, now waiting for results queue to drain 15330 1726882294.34557: results queue empty 15330 1726882294.34558: checking for any_errors_fatal 15330 1726882294.34558: done checking for any_errors_fatal 15330 1726882294.34559: checking for max_fail_percentage 15330 1726882294.34559: done checking for max_fail_percentage 15330 1726882294.34559: checking to see if all hosts have failed and the running result is not ok 15330 1726882294.34560: done checking to see if all hosts have failed 15330 1726882294.34561: getting the next task for host managed_node3 15330 1726882294.34562: done getting next task for host managed_node3 15330 1726882294.34562: ^ task is: None 15330 1726882294.34563: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node3              : ok=82   changed=3    unreachable=0    failed=0    skipped=71   rescued=0    ignored=2

Friday 20 September 2024  21:31:34 -0400 (0:00:00.513)       0:00:43.552 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.11s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.79s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.52s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.10s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.06s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 0.93s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.91s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
fedora.linux_system_roles.network : Check which packages are installed --- 0.91s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.89s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.83s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159

15330 1726882294.34649: RUNNING CLEANUP
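For readability, the shell payload that the `Verify DNS and network connectivity` task passes to `ansible.legacy.command` via `_raw_params` (shown above with escaped `\n` sequences) unescapes to the script below. The function wrapper and the bash shebang are additions for illustration so the file can be sourced without immediately making network calls; the original runs the body directly under `/bin/sh`, and actually invoking the check requires outbound HTTPS access to the two mirror hosts.

```shell
#!/usr/bin/env bash
# Reconstructed from the task's _raw_params field in the log above.
# The function wrapper is illustrative; the original executes the body directly.
check_dns_and_connectivity() {
  set -euo pipefail
  echo CHECK DNS AND CONNECTIVITY
  for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # getent resolves through NSS, exercising the same lookup path the package
    # manager uses; the log's STDOUT shows the AAAA records it returned.
    if ! getent hosts "$host"; then
      echo FAILED to lookup host "$host"
      exit 1
    fi
    # curl confirms the mirror answers over HTTPS; the body is discarded, and
    # the progress meter on stderr is what appears in the log's STDERR section.
    if ! curl -o /dev/null https://"$host"; then
      echo FAILED to contact host "$host"
      exit 1
    fi
  done
}
```

Because `set -euo pipefail` sits inside the function, any unset variable or failing pipeline aborts the check as soon as it is called, which is why the task reports `rc: 0` only when both lookups and both HTTPS probes succeed.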