[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
18285 1726853393.31186: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
18285 1726853393.31664: Added group all to inventory
18285 1726853393.31666: Added group ungrouped to inventory
18285 1726853393.31670: Group all now contains ungrouped
18285 1726853393.31675: Examining possible inventory source: /tmp/network-iHm/inventory.yml
18285 1726853393.52560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
18285 1726853393.52653: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
18285 1726853393.52678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
18285 1726853393.52852: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
18285 1726853393.52984: Loaded config def from plugin (inventory/script)
18285 1726853393.52986: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
18285 1726853393.53031: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
18285 1726853393.53248: Loaded config def from plugin
(inventory/yaml) 18285 1726853393.53364: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 18285 1726853393.53447: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 18285 1726853393.54395: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 18285 1726853393.54398: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 18285 1726853393.54401: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 18285 1726853393.54407: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 18285 1726853393.54411: Loading data from /tmp/network-iHm/inventory.yml 18285 1726853393.54595: /tmp/network-iHm/inventory.yml was not parsable by auto 18285 1726853393.54776: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 18285 1726853393.54815: Loading data from /tmp/network-iHm/inventory.yml 18285 1726853393.55011: group all already in inventory 18285 1726853393.55018: set inventory_file for managed_node1 18285 1726853393.55022: set inventory_dir for managed_node1 18285 1726853393.55023: Added host managed_node1 to inventory 18285 1726853393.55027: Added host managed_node1 to group all 18285 1726853393.55028: set ansible_host for managed_node1 18285 1726853393.55028: set ansible_ssh_extra_args for managed_node1 18285 1726853393.55032: set inventory_file for managed_node2 18285 1726853393.55034: set inventory_dir for managed_node2 18285 1726853393.55035: Added host managed_node2 to inventory 18285 1726853393.55037: Added host managed_node2 to group all 18285 1726853393.55038: set ansible_host for managed_node2 18285 1726853393.55038: set ansible_ssh_extra_args for managed_node2 18285 
1726853393.55041: set inventory_file for managed_node3 18285 1726853393.55044: set inventory_dir for managed_node3 18285 1726853393.55045: Added host managed_node3 to inventory 18285 1726853393.55046: Added host managed_node3 to group all 18285 1726853393.55047: set ansible_host for managed_node3 18285 1726853393.55047: set ansible_ssh_extra_args for managed_node3 18285 1726853393.55050: Reconcile groups and hosts in inventory. 18285 1726853393.55054: Group ungrouped now contains managed_node1 18285 1726853393.55055: Group ungrouped now contains managed_node2 18285 1726853393.55057: Group ungrouped now contains managed_node3 18285 1726853393.55249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 18285 1726853393.55411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 18285 1726853393.55575: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 18285 1726853393.55603: Loaded config def from plugin (vars/host_group_vars) 18285 1726853393.55605: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 18285 1726853393.55612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 18285 1726853393.55619: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 18285 1726853393.55815: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 18285 1726853393.56525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853393.56649: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 18285 1726853393.56690: Loaded config def from plugin (connection/local) 18285 1726853393.56693: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 18285 1726853393.58150: Loaded config def from plugin (connection/paramiko_ssh) 18285 1726853393.58154: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 18285 1726853393.60199: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18285 1726853393.60241: Loaded config def from plugin (connection/psrp) 18285 1726853393.60244: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 18285 1726853393.62461: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18285 1726853393.62517: Loaded config def from plugin (connection/ssh) 18285 1726853393.62520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 18285 1726853393.67412: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18285 1726853393.67451: Loaded config def from plugin (connection/winrm) 18285 1726853393.67455: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 18285 1726853393.67602: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 18285 1726853393.67666: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 18285 1726853393.67841: Loaded config def from plugin (shell/cmd) 18285 1726853393.67843: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 18285 1726853393.67869: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 18285 1726853393.68074: Loaded config def from plugin (shell/powershell) 18285 1726853393.68076: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 18285 1726853393.68244: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 18285 1726853393.68597: Loaded config def from plugin (shell/sh) 18285 1726853393.68599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 18285 1726853393.68634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 18285 1726853393.68864: Loaded config def from plugin (become/runas) 18285 1726853393.68867: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 18285 1726853393.69382: Loaded config def from plugin (become/su) 18285 1726853393.69385: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 18285 1726853393.70103: Loaded config def from plugin (become/sudo) 18285 
1726853393.70106: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 18285 1726853393.70140: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml 18285 1726853393.71137: in VariableManager get_vars() 18285 1726853393.71161: done with get_vars() 18285 1726853393.71753: trying /usr/local/lib/python3.12/site-packages/ansible/modules 18285 1726853393.78414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 18285 1726853393.78765: in VariableManager get_vars() 18285 1726853393.78772: done with get_vars() 18285 1726853393.78776: variable 'playbook_dir' from source: magic vars 18285 1726853393.78776: variable 'ansible_playbook_python' from source: magic vars 18285 1726853393.78777: variable 'ansible_config_file' from source: magic vars 18285 1726853393.78778: variable 'groups' from source: magic vars 18285 1726853393.78779: variable 'omit' from source: magic vars 18285 1726853393.78779: variable 'ansible_version' from source: magic vars 18285 1726853393.78780: variable 'ansible_check_mode' from source: magic vars 18285 1726853393.78781: variable 'ansible_diff_mode' from source: magic vars 18285 1726853393.78781: variable 'ansible_forks' from source: magic vars 18285 1726853393.78782: variable 'ansible_inventory_sources' from source: magic vars 18285 1726853393.78783: variable 'ansible_skip_tags' from source: magic vars 18285 1726853393.78783: variable 'ansible_limit' from source: magic vars 18285 1726853393.78784: variable 'ansible_run_tags' from source: magic vars 18285 1726853393.78785: variable 'ansible_verbosity' from source: magic vars 18285 1726853393.78822: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 18285 1726853393.80393: in 
VariableManager get_vars() 18285 1726853393.80411: done with get_vars() 18285 1726853393.80448: in VariableManager get_vars() 18285 1726853393.80470: done with get_vars() 18285 1726853393.80621: in VariableManager get_vars() 18285 1726853393.80635: done with get_vars() 18285 1726853393.80665: in VariableManager get_vars() 18285 1726853393.80710: done with get_vars() 18285 1726853393.80786: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18285 1726853393.81280: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18285 1726853393.81532: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18285 1726853393.83110: in VariableManager get_vars() 18285 1726853393.83133: done with get_vars() 18285 1726853393.84195: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 18285 1726853393.84445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18285 1726853393.87046: in VariableManager get_vars() 18285 1726853393.87065: done with get_vars() 18285 1726853393.87435: in VariableManager get_vars() 18285 1726853393.87440: done with get_vars() 18285 1726853393.87443: variable 'playbook_dir' from source: magic vars 18285 1726853393.87443: variable 'ansible_playbook_python' from source: magic vars 18285 1726853393.87444: variable 'ansible_config_file' from source: magic vars 18285 1726853393.87445: variable 'groups' from source: magic vars 18285 1726853393.87446: variable 'omit' from source: magic vars 18285 1726853393.87446: variable 'ansible_version' from source: magic vars 18285 1726853393.87447: variable 'ansible_check_mode' from source: magic vars 18285 1726853393.87448: variable 'ansible_diff_mode' from source: magic vars 18285 1726853393.87449: variable 'ansible_forks' 
from source: magic vars 18285 1726853393.87449: variable 'ansible_inventory_sources' from source: magic vars 18285 1726853393.87450: variable 'ansible_skip_tags' from source: magic vars 18285 1726853393.87451: variable 'ansible_limit' from source: magic vars 18285 1726853393.87451: variable 'ansible_run_tags' from source: magic vars 18285 1726853393.87452: variable 'ansible_verbosity' from source: magic vars 18285 1726853393.87486: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 18285 1726853393.87795: in VariableManager get_vars() 18285 1726853393.87799: done with get_vars() 18285 1726853393.87801: variable 'playbook_dir' from source: magic vars 18285 1726853393.87801: variable 'ansible_playbook_python' from source: magic vars 18285 1726853393.87802: variable 'ansible_config_file' from source: magic vars 18285 1726853393.87803: variable 'groups' from source: magic vars 18285 1726853393.87804: variable 'omit' from source: magic vars 18285 1726853393.87804: variable 'ansible_version' from source: magic vars 18285 1726853393.87805: variable 'ansible_check_mode' from source: magic vars 18285 1726853393.87806: variable 'ansible_diff_mode' from source: magic vars 18285 1726853393.87807: variable 'ansible_forks' from source: magic vars 18285 1726853393.87807: variable 'ansible_inventory_sources' from source: magic vars 18285 1726853393.87808: variable 'ansible_skip_tags' from source: magic vars 18285 1726853393.87809: variable 'ansible_limit' from source: magic vars 18285 1726853393.87809: variable 'ansible_run_tags' from source: magic vars 18285 1726853393.87810: variable 'ansible_verbosity' from source: magic vars 18285 1726853393.87841: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 18285 1726853393.87928: in VariableManager get_vars() 18285 1726853393.87941: done with get_vars() 18285 
1726853393.88100: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18285 1726853393.88321: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18285 1726853393.88399: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18285 1726853393.89490: in VariableManager get_vars() 18285 1726853393.89514: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18285 1726853393.92743: in VariableManager get_vars() 18285 1726853393.92764: done with get_vars() 18285 1726853393.92918: in VariableManager get_vars() 18285 1726853393.92922: done with get_vars() 18285 1726853393.92924: variable 'playbook_dir' from source: magic vars 18285 1726853393.92925: variable 'ansible_playbook_python' from source: magic vars 18285 1726853393.92926: variable 'ansible_config_file' from source: magic vars 18285 1726853393.92926: variable 'groups' from source: magic vars 18285 1726853393.92927: variable 'omit' from source: magic vars 18285 1726853393.92928: variable 'ansible_version' from source: magic vars 18285 1726853393.92929: variable 'ansible_check_mode' from source: magic vars 18285 1726853393.92929: variable 'ansible_diff_mode' from source: magic vars 18285 1726853393.92930: variable 'ansible_forks' from source: magic vars 18285 1726853393.92931: variable 'ansible_inventory_sources' from source: magic vars 18285 1726853393.92931: variable 'ansible_skip_tags' from source: magic vars 18285 1726853393.92932: variable 'ansible_limit' from source: magic vars 18285 1726853393.92933: variable 'ansible_run_tags' from source: magic vars 18285 1726853393.92933: variable 'ansible_verbosity' from source: magic vars 18285 1726853393.92965: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 18285 
1726853393.93142: in VariableManager get_vars() 18285 1726853393.93153: done with get_vars() 18285 1726853393.93195: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18285 1726853393.97203: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18285 1726853393.97397: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18285 1726853393.98227: in VariableManager get_vars() 18285 1726853393.98248: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18285 1726853394.01466: in VariableManager get_vars() 18285 1726853394.01585: done with get_vars() 18285 1726853394.01627: in VariableManager get_vars() 18285 1726853394.01639: done with get_vars() 18285 1726853394.01703: in VariableManager get_vars() 18285 1726853394.01832: done with get_vars() 18285 1726853394.01985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 18285 1726853394.01999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 18285 1726853394.02469: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 18285 1726853394.02851: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 18285 1726853394.02854: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 18285 1726853394.02887: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 18285 1726853394.02911: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 18285 1726853394.03369: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 18285 1726853394.03431: Loaded config def from plugin (callback/default) 18285 1726853394.03434: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18285 1726853394.05901: Loaded config def from plugin (callback/junit) 18285 1726853394.05904: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18285 1726853394.05947: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 18285 1726853394.06012: Loaded config def from plugin (callback/minimal) 18285 1726853394.06015: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18285 1726853394.06053: Loading CallbackModule 'oneline' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
18285 1726853394.06316: Loaded config def from plugin (callback/tree)
18285 1726853394.06319: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
18285 1726853394.06446: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
18285 1726853394.06449: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
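Before the play output begins, note that the inventory events logged earlier (hosts managed_node1 through managed_node3 added to groups all and ungrouped, with ansible_host and ansible_ssh_extra_args set per host) are consistent with a /tmp/network-iHm/inventory.yml of roughly the following shape. The hostnames come from the log; the addresses and SSH arguments are placeholders, not values recoverable from this output:

```yaml
# Hypothetical reconstruction of inventory.yml; only the host names are from the log.
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11                            # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder args
    managed_node2:
      ansible_host: 203.0.113.12                            # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder args
    managed_node3:
      ansible_host: 203.0.113.13                            # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder args
```

Hosts declared directly under `all` with no intermediate group land in `ungrouped`, which matches the "Group ungrouped now contains managed_node1/2/3" entries above.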
PLAYBOOK: tests_ethernet_initscripts.yml ***************************************
10 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml
18285 1726853394.06477: in VariableManager get_vars()
18285 1726853394.06492: done with get_vars()
18285 1726853394.06498: in VariableManager get_vars()
18285 1726853394.06507: done with get_vars()
18285 1726853394.06512: variable 'omit' from source: magic vars
18285 1726853394.06550: in VariableManager get_vars()
18285 1726853394.06565: done with get_vars()
18285 1726853394.06788: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with initscripts as provider] ***
18285 1726853394.07719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
18285 1726853394.07994: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
18285 1726853394.08090: getting the remaining hosts for this loop
18285 1726853394.08092: done getting the remaining hosts for this loop
18285 1726853394.08095: getting the next task for host managed_node1
18285 1726853394.08099: done getting next task for host managed_node1
18285 1726853394.08101: ^ task is: TASK: Gathering Facts
18285 1726853394.08102: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853394.08110: getting variables
18285 1726853394.08111: in VariableManager get_vars()
18285 1726853394.08121: Calling all_inventory to load vars for managed_node1
18285 1726853394.08123: Calling groups_inventory to load vars for managed_node1
18285 1726853394.08126: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853394.08138: Calling all_plugins_play to load vars for managed_node1
18285 1726853394.08149: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853394.08153: Calling groups_plugins_play to load vars for managed_node1
18285 1726853394.08188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853394.08242: done with get_vars()
18285 1726853394.08249: done getting variables
18285 1726853394.08520: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:5
Friday 20 September 2024 13:29:54 -0400 (0:00:00.021) 0:00:00.021 ******
18285 1726853394.08540: entering _queue_task() for managed_node1/gather_facts
18285 1726853394.08542: Creating lock for gather_facts
18285 1726853394.09420: worker is 1 (out of 1 available)
18285 1726853394.09432: exiting _queue_task() for managed_node1/gather_facts
18285 1726853394.09444: done queuing things up, now waiting for results queue to drain
18285 1726853394.09446: waiting for pending results...
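The `_queue_task()` / "waiting for pending results" lines above reflect a standard put-work-then-drain-results handoff between the main process and a pool of workers. A minimal sketch of that pattern, using plain Python queues and a thread rather than Ansible's actual WorkerProcess machinery:

```python
import queue
import threading

def worker(tasks: "queue.Queue", results: "queue.Queue") -> None:
    # Drain the task queue, pushing one result per task until a sentinel arrives.
    while True:
        task = tasks.get()
        if task is None:  # sentinel: no more work is coming
            break
        results.put((task, f"ran {task}"))

tasks: "queue.Queue" = queue.Queue()
results: "queue.Queue" = queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()

tasks.put("managed_node1/gather_facts")  # queue a task for the one available worker
tasks.put(None)                          # done queuing things up
t.join()                                 # wait for the worker to finish

out = results.get()                      # drain the results queue
print(out)
```

Ansible's real dispatcher is multiprocess and per-host, but the sequencing in the log (queue, release worker, then block on the results queue) follows this shape.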
18285 1726853394.09858: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853394.10181: in run() - task 02083763-bbaf-9200-7ca6-00000000007c 18285 1726853394.10196: variable 'ansible_search_path' from source: unknown 18285 1726853394.10458: calling self._execute() 18285 1726853394.10517: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853394.10546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853394.10549: variable 'omit' from source: magic vars 18285 1726853394.10863: variable 'omit' from source: magic vars 18285 1726853394.11124: variable 'omit' from source: magic vars 18285 1726853394.11177: variable 'omit' from source: magic vars 18285 1726853394.11207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18285 1726853394.11414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18285 1726853394.11548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18285 1726853394.11568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18285 1726853394.11583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18285 1726853394.11618: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18285 1726853394.11621: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853394.11624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853394.11951: Set connection var ansible_timeout to 10 18285 1726853394.11961: Set connection var ansible_shell_executable to /bin/sh 18285 1726853394.11975: Set connection var ansible_pipelining to False 18285 1726853394.12197: Set connection var ansible_shell_type to sh 18285 
1726853394.12200: Set connection var ansible_module_compression to ZIP_DEFLATED 18285 1726853394.12203: Set connection var ansible_connection to ssh 18285 1726853394.12376: variable 'ansible_shell_executable' from source: unknown 18285 1726853394.12380: variable 'ansible_connection' from source: unknown 18285 1726853394.12383: variable 'ansible_module_compression' from source: unknown 18285 1726853394.12385: variable 'ansible_shell_type' from source: unknown 18285 1726853394.12388: variable 'ansible_shell_executable' from source: unknown 18285 1726853394.12390: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853394.12392: variable 'ansible_pipelining' from source: unknown 18285 1726853394.12394: variable 'ansible_timeout' from source: unknown 18285 1726853394.12395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853394.13182: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 18285 1726853394.13380: variable 'omit' from source: magic vars 18285 1726853394.13386: starting attempt loop 18285 1726853394.13389: running the handler 18285 1726853394.13404: variable 'ansible_facts' from source: unknown 18285 1726853394.13495: _low_level_execute_command(): starting 18285 1726853394.13502: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18285 1726853394.15689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18285 1726853394.15693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18285 1726853394.15877: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853394.15904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.16087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.17695: stdout chunk (state=3): >>>/root <<< 18285 1726853394.17795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853394.17843: stderr chunk (state=3): >>><<< 18285 1726853394.17852: stdout chunk (state=3): >>><<< 18285 1726853394.17875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18285 1726853394.18051: _low_level_execute_command(): starting 18285 1726853394.18056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446 `" && echo ansible-tmp-1726853394.179619-18313-231748256414446="` echo /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446 `" ) && sleep 0' 18285 1726853394.19129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18285 1726853394.19144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853394.19166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18285 1726853394.19187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18285 1726853394.19277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853394.19307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853394.19333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.19501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.21438: stdout chunk (state=3): >>>ansible-tmp-1726853394.179619-18313-231748256414446=/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446 <<< 18285 1726853394.21599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853394.21602: stdout chunk (state=3): >>><<< 18285 1726853394.21605: stderr chunk (state=3): >>><<< 18285 1726853394.21622: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853394.179619-18313-231748256414446=/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18285 1726853394.21878: variable 'ansible_module_compression' from source: unknown 18285 1726853394.21881: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18285 1726853394.21884: ANSIBALLZ: Acquiring lock 18285 1726853394.21886: ANSIBALLZ: Lock acquired: 140256816318320 18285 1726853394.21889: ANSIBALLZ: Creating module 18285 1726853394.66613: ANSIBALLZ: Writing module into payload 18285 1726853394.66769: ANSIBALLZ: Writing module 18285 1726853394.66800: ANSIBALLZ: Renaming module 18285 1726853394.66811: ANSIBALLZ: Done creating module 18285 1726853394.66847: variable 'ansible_facts' from source: unknown 18285 1726853394.66862: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18285 1726853394.66880: _low_level_execute_command(): starting 18285 1726853394.66893: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18285 1726853394.67553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853394.67645: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18285 1726853394.67662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853394.67714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.67775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.69456: stdout chunk (state=3): >>>PLATFORM <<< 18285 1726853394.69551: stdout chunk (state=3): >>>Linux <<< 18285 1726853394.69564: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18285 1726853394.69727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853394.69910: stdout chunk (state=3): >>><<< 18285 1726853394.69914: stderr chunk (state=3): >>><<< 18285 1726853394.69917: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18285 1726853394.69923 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 18285 1726853394.69926: _low_level_execute_command(): starting 18285 1726853394.69928: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 18285 1726853394.70176: Sending initial data 18285 1726853394.70180: Sent initial data (1181 bytes) 18285 1726853394.71079: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853394.71082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18285 1726853394.71085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18285 1726853394.71087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18285 
1726853394.71090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853394.71137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853394.71373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.71437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.74882: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18285 1726853394.75284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853394.75298: stderr chunk (state=3): >>><<< 18285 1726853394.75308: stdout chunk (state=3): >>><<< 18285 1726853394.75329: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18285 1726853394.75424: variable 'ansible_facts' from source: unknown 18285 1726853394.75434: variable 'ansible_facts' from source: unknown 18285 1726853394.75452: variable 'ansible_module_compression' from source: unknown 18285 1726853394.75498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18285ef0wk9dz/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18285 1726853394.75531: variable 'ansible_facts' from source: unknown 18285 
1726853394.75823: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py 18285 1726853394.75959: Sending initial data 18285 1726853394.76185: Sent initial data (153 bytes) 18285 1726853394.76483: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18285 1726853394.76498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853394.76512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18285 1726853394.76534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18285 1726853394.76588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853394.76652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853394.76670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853394.76695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.76759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.78408: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18285 1726853394.78443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18285 1726853394.78483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18285ef0wk9dz/tmpjns7sm3h /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py <<< 18285 1726853394.78497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py" <<< 18285 1726853394.78557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18285ef0wk9dz/tmpjns7sm3h" to remote "/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py" <<< 18285 1726853394.81180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853394.81221: stdout chunk (state=3): >>><<< 18285 1726853394.81233: stderr chunk (state=3): >>><<< 18285 1726853394.81379: done transferring module to remote 18285 1726853394.81383: _low_level_execute_command(): starting 18285 1726853394.81386: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/ /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py && sleep 0' 18285 1726853394.82858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853394.82973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.83039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.84814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853394.84934: stderr chunk (state=3): >>><<< 18285 1726853394.84942: stdout chunk (state=3): >>><<< 18285 1726853394.84968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18285 1726853394.84979: _low_level_execute_command(): starting 18285 1726853394.85026: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/AnsiballZ_setup.py && sleep 0' 18285 1726853394.86122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18285 1726853394.86190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853394.86206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18285 1726853394.86221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18285 1726853394.86243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18285 1726853394.86260: stderr chunk (state=3): >>>debug2: match not found <<< 18285 1726853394.86405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853394.86432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853394.86473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853394.86521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853394.86601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853394.88759: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18285 1726853394.88797: stdout chunk (state=3): >>>import _imp # builtin <<< 18285 1726853394.88822: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 18285 1726853394.88890: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 18285 1726853394.88926: stdout chunk (state=3): >>>import 'posix' # <<< 18285 1726853394.89002: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 18285 1726853394.89021: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 18285 1726853394.89053: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 18285 1726853394.89173: stdout chunk (state=3): >>>import 'codecs' # <<< 18285 
1726853394.89202: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991e184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991de7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991e1aa50> import '_signal' # <<< 18285 1726853394.89235: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 18285 1726853394.89246: stdout chunk (state=3): >>>import 'io' # <<< 18285 1726853394.89280: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 18285 1726853394.89359: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18285 1726853394.89433: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # <<< 18285 1726853394.89516: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 18285 1726853394.89520: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c2d130> <<< 18285 1726853394.89565: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 18285 1726853394.89715: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c2dfa0> import 'site' # <<< 18285 1726853394.89727: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 18285 1726853394.90016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18285 1726853394.90048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18285 1726853394.90073: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18285 1726853394.90187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18285 1726853394.90191: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c6be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18285 1726853394.90213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 18285 1726853394.90239: 
stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c6bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18285 1726853394.90419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991ca37d0> <<< 18285 1726853394.90423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991ca3e60> <<< 18285 1726853394.90441: stdout chunk (state=3): >>>import '_collections' # <<< 18285 1726853394.90487: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c83ad0> <<< 18285 1726853394.90510: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c811f0> <<< 18285 1726853394.90606: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c68fb0> <<< 18285 1726853394.90724: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 18285 1726853394.90747: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18285 1726853394.90769: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cc3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cc2390> <<< 18285 1726853394.90805: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cc0bc0> <<< 18285 1726853394.90861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c68230> <<< 18285 1726853394.90921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed 
from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991cf8cb0> <<< 18285 1726853394.91121: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf8b60> <<< 18285 1726853394.91125: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991cf8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c66d50> <<< 18285 1726853394.91137: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cfa480> import 'importlib.util' # import 'runpy' # <<< 18285 1726853394.91158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18285 1726853394.91197: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18285 1726853394.91240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d106b0> <<< 18285 1726853394.91290: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991d11d90> <<< 18285 1726853394.91361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d12c30> <<< 18285 1726853394.91412: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991d13290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d12180> <<< 18285 1726853394.91572: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' 
<<< 18285 1726853394.91601: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991d13d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d13440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cfa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18285 1726853394.91637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18285 1726853394.91641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18285 1726853394.91677: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a1bbc0> <<< 18285 1726853394.91729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18285 1726853394.91733: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f9991a446b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a44410> <<< 18285 1726853394.91785: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a446e0> <<< 18285 1726853394.91982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18285 1726853394.92008: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a45010> <<< 18285 1726853394.92125: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a45a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a448c0> <<< 18285 1726853394.92154: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a19d60> <<< 18285 1726853394.92170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18285 1726853394.92194: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18285 1726853394.92226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a46e10> <<< 18285 1726853394.92261: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a45b50> <<< 18285 1726853394.92276: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cfabd0> <<< 18285 1726853394.92299: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18285 1726853394.92362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853394.92387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18285 1726853394.92415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18285 1726853394.92446: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a73140> <<< 18285 1726853394.92517: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18285 1726853394.92521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853394.92566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18285 1726853394.92569: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18285 1726853394.92594: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a93500> <<< 18285 1726853394.92617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18285 1726853394.92667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18285 1726853394.92719: stdout chunk (state=3): >>>import 'ntpath' # <<< 18285 1726853394.92761: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991af4260> <<< 18285 1726853394.92767: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18285 1726853394.92796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18285 1726853394.92877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18285 1726853394.92899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18285 1726853394.92951: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991af69c0> <<< 18285 1726853394.93022: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991af4380> <<< 18285 1726853394.93066: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991abd280> 
<<< 18285 1726853394.93131: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99918f5370> <<< 18285 1726853394.93135: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a92300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a47d40> <<< 18285 1726853394.93339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18285 1726853394.93345: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9991a92420> <<< 18285 1726853394.93586: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xgyskx43/ansible_ansible.legacy.setup_payload.zip' <<< 18285 1726853394.93604: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853394.93876: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853394.93900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999195aff0> import '_typing' # <<< 18285 1726853394.94068: stdout 
chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991939ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991939040> # zipimport: zlib available <<< 18285 1726853394.94095: stdout chunk (state=3): >>>import 'ansible' # <<< 18285 1726853394.94129: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18285 1726853394.94157: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 18285 1726853394.95567: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853394.96720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991958ec0> <<< 18285 1726853394.96806: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853394.96832: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999198e900> <<< 
18285 1726853394.97002: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198e690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198dfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198e3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999195bc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999198f6b0> <<< 18285 1726853394.97058: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999198f8f0> <<< 18285 1726853394.97061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18285 1726853394.97096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18285 1726853394.97111: stdout chunk (state=3): >>>import '_locale' # <<< 18285 1726853394.97278: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198fe30> import 'pwd' # <<< 18285 1726853394.97411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # 
code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991325bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99913277d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991328170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991329310> <<< 18285 1726853394.97496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18285 1726853394.97499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 18285 1726853394.97502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18285 1726853394.97585: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999132bda0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import 
'_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999193b0e0> <<< 18285 1726853394.97611: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999132a060> <<< 18285 1726853394.97782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18285 1726853394.97833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 18285 1726853394.97943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991333d40> <<< 18285 1726853394.97947: stdout chunk (state=3): >>>import '_tokenize' # <<< 18285 1726853394.97961: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991332810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991332570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18285 1726853394.98052: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991332ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999132a570> <<< 18285 1726853394.98170: stdout 
chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991377f80> <<< 18285 1726853394.98176: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913780e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18285 1726853394.98260: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991379bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991379970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18285 1726853394.98754: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # 
extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999137c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999137a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999137f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999137c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991380b90> <<< 18285 1726853394.98758: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991380710> <<< 18285 1726853394.98760: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' 
import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991380230> <<< 18285 1726853394.98763: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913782f0> <<< 18285 1726853394.98765: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18285 1726853394.98841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 18285 1726853394.98844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18285 1726853394.98847: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18285 1726853394.98851: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999120c110> <<< 18285 1726853394.99067: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999120d040> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913828a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991383c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991382540> <<< 18285 1726853394.99072: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 18285 1726853394.99088: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853394.99198: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853394.99392: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18285 1726853394.99475: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853394.99700: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.00105: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.00645: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18285 1726853395.00673: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 18285 1726853395.00698: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853395.00902: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991211220> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991211fd0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999120d1f0> import 'ansible.module_utils.compat.selinux' # <<< 18285 1726853395.00934: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.00962: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18285 1726853395.01122: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.01341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991212210> # zipimport: zlib available <<< 18285 1726853395.01791: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.02185: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.02260: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.02330: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18285 1726853395.02385: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.02416: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18285 1726853395.02554: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.02618: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 18285 1726853395.02621: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18285 1726853395.02624: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.02769: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 18285 1726853395.02775: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.02938: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.03314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18285 1726853395.03386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912132c0> # zipimport: zlib available <<< 18285 1726853395.03403: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.03462: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18285 1726853395.03532: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 18285 1726853395.03577: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.03597: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18285 1726853395.03638: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.03957: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.03960: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853395.03993: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999121de80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999121bf50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 18285 1726853395.04013: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04067: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04129: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04157: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04209: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853395.04222: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18285 1726853395.04250: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18285 1726853395.04329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18285 1726853395.04603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18285 1726853395.04616: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991306630> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913fe300> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991212f60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991380d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18285 1726853395.04662: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 18285 1726853395.04694: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 18285 1726853395.04717: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04785: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04836: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04861: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04876: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04913: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.04952: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.05053: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.05138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.05179: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.05199: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.05244: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 18285 1726853395.05262: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.05816: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b1a90> <<< 18285 1726853395.05823: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 18285 1726853395.05905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 18285 1726853395.05926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9990e43c80> <<< 18285 1726853395.05969: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 18285 1726853395.05982: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990e43fe0> <<< 18285 1726853395.06022: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999129a660> <<< 18285 1726853395.06078: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b25d0> <<< 18285 1726853395.06100: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b0200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b3b90> <<< 18285 1726853395.06165: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18285 1726853395.06238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 18285 1726853395.06302: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990e5ef30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5e7e0> <<< 18285 1726853395.06305: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990e5e9c0> <<< 18285 1726853395.06537: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5dc10> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5f0e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 18285 1726853395.06544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 18285 1726853395.06547: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990eb5bb0> <<< 18285 1726853395.06677: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5fbc0> <<< 18285 
1726853395.06680: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b3920> import 'ansible.module_utils.facts.timeout' # <<< 18285 1726853395.06732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 18285 1726853395.06735: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.07081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 18285 1726853395.07211: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 18285 1726853395.07276: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.07334: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.07392: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.07452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 18285 1726853395.07467: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.07938: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.08392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 18285 1726853395.08434: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18285 1726853395.08597: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.08685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 18285 1726853395.08732: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.09065: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 18285 1726853395.09069: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.09073: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 18285 1726853395.09076: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.09226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990eb7860> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18285 1726853395.09276: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990eb6780> <<< 18285 1726853395.09279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 18285 1726853395.09281: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.09595: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 18285 1726853395.09613: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.09833: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.09856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18285 1726853395.09890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18285 1726853395.09959: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18285 1726853395.10015: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990ee5fa0> <<< 18285 1726853395.10199: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990ed6de0> import 'ansible.module_utils.facts.system.python' # <<< 18285 1726853395.10217: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10277: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 18285 1726853395.10341: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10415: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10539: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10613: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10754: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 18285 1726853395.10779: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10806: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10847: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 18285 1726853395.10866: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10896: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.10947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 18285 1726853395.10989: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 18285 1726853395.11014: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990ef9be0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990ed6f90> import 'ansible.module_utils.facts.system.user' # <<< 18285 1726853395.11050: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.11104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 18285 1726853395.11117: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 18285 1726853395.11269: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11309: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11459: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 18285 1726853395.11569: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11592: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11693: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11717: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11759: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 18285 1726853395.11818: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.11829: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.11959: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.12102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 18285 1726853395.12151: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.12244: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.12377: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 18285 1726853395.12380: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.12432: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.12454: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.13012: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.13512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 18285 1726853395.13516: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.13670: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.13724: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.netbsd' # <<< 18285 1726853395.13900: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.13904: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.13934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18285 1726853395.13948: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14092: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 18285 1726853395.14279: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 18285 1726853395.14298: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14339: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 18285 1726853395.14399: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14489: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14587: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14784: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.14995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 18285 1726853395.15010: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.15108: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.15129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.15148: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 18285 
1726853395.15279: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.15301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 18285 1726853395.15604: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.15607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 18285 1726853395.15621: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 18285 1726853395.15861: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16117: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 18285 1726853395.16138: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16184: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 18285 1726853395.16279: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16294: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.16399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 18285 1726853395.16417: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16444: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 18285 1726853395.16572: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 
1726853395.16665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 18285 1726853395.16703: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 18285 1726853395.16706: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16779: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 18285 1726853395.16905: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16939: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.16942: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853395.17012: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.17078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 18285 1726853395.17137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 18285 1726853395.17157: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.17253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18285 1726853395.17464: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.17600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 18285 1726853395.17682: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.17709: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 18285 1726853395.17757: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.17808: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 18285 1726853395.17821: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.17899: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.18021: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 18285 1726853395.18024: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.18101: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.18170: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18285 1726853395.18255: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853395.18482: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18285 1726853395.18485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 18285 1726853395.18537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 18285 1726853395.18552: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990c93200> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990c93140> <<< 18285 1726853395.18658: stdout chunk (state=3): >>>import 'encodings.idna' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9990c914c0> <<< 18285 1726853395.31315: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cd8e90> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cd8f20> <<< 18285 1726853395.31379: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853395.31401: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cdb290> <<< 18285 1726853395.31404: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cd9e50> <<< 18285 1726853395.31666: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: 
thread still has a frame <<< 18285 1726853395.55788: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "<<< 18285 1726853395.55845: stdout chunk (state=3): >>>ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "55", "epoch": "1726853395", "epoch_int": "1726853395", "date": "2024-09-20", "time": "13:29:55", "iso8601_micro": "2024-09-20T17:29:55.191992Z", "iso8601": "2024-09-20T17:29:55Z", "iso8601_basic": "20240920T132955191992", "iso8601_basic_short": "20240920T132955", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": 
"NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794922496, "block_size": 4096, "block_total": 65519099, "block_available": 63914776, "block_used": 1604323, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.51123046875, "5m": 0.36865234375, "15m": 0.17041015625}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_fips": false, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], 
"ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18285 1726853395.56505: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] 
removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 18285 1726853395.56615: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path 
# cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale <<< 18285 1726853395.56712: stdout chunk (state=3): >>># cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux 
# cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing 
ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl<<< 18285 1726853395.56985: stdout chunk (state=3): >>> # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips <<< 18285 1726853395.57077: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] 
removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 18285 1726853395.57138: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 18285 1726853395.57159: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18285 1726853395.57185: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 18285 1726853395.57209: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 18285 1726853395.57231: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 18285 1726853395.57257: stdout chunk (state=3): >>># destroy ntpath <<< 18285 1726853395.57307: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 18285 1726853395.57311: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 18285 1726853395.57341: stdout chunk (state=3): >>># destroy _locale <<< 18285 1726853395.57348: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 18285 1726853395.57415: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 18285 1726853395.57418: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 18285 1726853395.57421: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 18285 1726853395.57463: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy 
# destroy multiprocessing.pool # destroy signal <<< 18285 1726853395.57493: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle <<< 18285 1726853395.57513: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 18285 1726853395.57551: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 18285 1726853395.57630: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 18285 1726853395.57635: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 18285 1726853395.57675: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 18285 1726853395.57733: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 18285 1726853395.57872: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy 
textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 18285 1726853395.57982: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 18285 1726853395.58004: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # 
destroy systemd._journal # destroy _datetime <<< 18285 1726853395.58144: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 18285 1726853395.58147: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 18285 1726853395.58249: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18285 1726853395.58252: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 18285 1726853395.58294: stdout chunk (state=3): >>> <<< 18285 1726853395.58351: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 18285 1726853395.58470: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18285 1726853395.58893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.45.153 closed. <<< 18285 1726853395.58896: stdout chunk (state=3): >>><<< 18285 1726853395.58914: stderr chunk (state=3): >>><<< 18285 1726853395.59264: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991e184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991de7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991e1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c6be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c6bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # 
code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991ca37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991ca3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c83ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c811f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c68fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cc3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cc2390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cc0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c68230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991cf8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991cf8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991c66d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cf9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cfa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991d11d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d12c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991d13290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d12180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991d13d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991d13440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cfa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a1bbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a446b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a44410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a446e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a45010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991a45a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a448c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a19d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a46e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a45b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991cfabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a73140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a93500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991af4260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991af69c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991af4380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991abd280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99918f5370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a92300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991a47d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9991a92420> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xgyskx43/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f999195aff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991939ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991939040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991958ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999198e900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198e690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198dfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198e3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999195bc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999198f6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999198f8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999198fe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991325bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99913277d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9991328170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991329310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999132bda0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999193b0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999132a060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f9991333d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991332810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991332570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991332ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999132a570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991377f80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913780e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991379bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991379970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999137c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999137a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999137f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999137c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991380b90> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991380710> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991380230> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913782f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999120c110> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999120d040> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913828a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991383c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991382540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9991211220> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991211fd0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999120d1f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991212210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912132c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f999121de80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999121bf50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991306630> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99913fe300> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991212f60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9991380d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b1a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e43c80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990e43fe0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f999129a660> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b25d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b0200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b3b90> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990e5ef30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5e7e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990e5e9c0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5dc10> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5f0e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990eb5bb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990e5fbc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99912b3920> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990eb7860> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990eb6780> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990ee5fa0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990ed6de0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990ef9be0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990ed6f90> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9990c93200> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990c93140> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990c914c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cd8e90> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cd8f20> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cdb290> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9990cd9e50> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "55", "epoch": "1726853395", "epoch_int": "1726853395", "date": "2024-09-20", "time": "13:29:55", "iso8601_micro": "2024-09-20T17:29:55.191992Z", "iso8601": "2024-09-20T17:29:55Z", "iso8601_basic": 
"20240920T132955191992", "iso8601_basic_short": "20240920T132955", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 
512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794922496, "block_size": 4096, "block_total": 65519099, "block_available": 63914776, "block_used": 1604323, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, 
"ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.51123046875, "5m": 0.36865234375, "15m": 0.17041015625}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_fips": false, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", 
"tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing 
_thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # 
destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # 
cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing 
multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # 
cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings 
# cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ … (interpreter shutdown messages; verbatim repeat of the dump above)
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy 
ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy 
importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # 
cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping 
_frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
18285 1726853395.61715: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18285 1726853395.61742: _low_level_execute_command(): starting 18285 1726853395.61751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853394.179619-18313-231748256414446/ > /dev/null 2>&1 && sleep 0' 18285 1726853395.62614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18285 1726853395.62677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853395.62737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853395.62753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853395.62780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853395.62858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18285 1726853395.65016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853395.65019: stdout chunk (state=3): >>><<< 18285 1726853395.65022: stderr chunk (state=3): >>><<< 18285 1726853395.65176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18285 1726853395.65179: handler run complete 18285 
1726853395.65209: variable 'ansible_facts' from source: unknown 18285 1726853395.65333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.65703: variable 'ansible_facts' from source: unknown 18285 1726853395.65793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.65935: attempt loop complete, returning result 18285 1726853395.65951: _execute() done 18285 1726853395.65959: dumping result to json 18285 1726853395.65998: done dumping result, returning 18285 1726853395.66011: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-00000000007c] 18285 1726853395.66020: sending task result for task 02083763-bbaf-9200-7ca6-00000000007c 18285 1726853395.66839: done sending task result for task 02083763-bbaf-9200-7ca6-00000000007c 18285 1726853395.66843: WORKER PROCESS EXITING ok: [managed_node1] 18285 1726853395.67142: no more pending results, returning what we have 18285 1726853395.67145: results queue empty 18285 1726853395.67146: checking for any_errors_fatal 18285 1726853395.67147: done checking for any_errors_fatal 18285 1726853395.67148: checking for max_fail_percentage 18285 1726853395.67152: done checking for max_fail_percentage 18285 1726853395.67153: checking to see if all hosts have failed and the running result is not ok 18285 1726853395.67154: done checking to see if all hosts have failed 18285 1726853395.67155: getting the remaining hosts for this loop 18285 1726853395.67156: done getting the remaining hosts for this loop 18285 1726853395.67160: getting the next task for host managed_node1 18285 1726853395.67166: done getting next task for host managed_node1 18285 1726853395.67167: ^ task is: TASK: meta (flush_handlers) 18285 1726853395.67169: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853395.67175: getting variables 18285 1726853395.67176: in VariableManager get_vars() 18285 1726853395.67196: Calling all_inventory to load vars for managed_node1 18285 1726853395.67199: Calling groups_inventory to load vars for managed_node1 18285 1726853395.67202: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853395.67212: Calling all_plugins_play to load vars for managed_node1 18285 1726853395.67215: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853395.67218: Calling groups_plugins_play to load vars for managed_node1 18285 1726853395.67410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.67606: done with get_vars() 18285 1726853395.67616: done getting variables 18285 1726853395.67680: in VariableManager get_vars() 18285 1726853395.67688: Calling all_inventory to load vars for managed_node1 18285 1726853395.67690: Calling groups_inventory to load vars for managed_node1 18285 1726853395.67693: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853395.67697: Calling all_plugins_play to load vars for managed_node1 18285 1726853395.67699: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853395.67701: Calling groups_plugins_play to load vars for managed_node1 18285 1726853395.67839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.68031: done with get_vars() 18285 1726853395.68043: done queuing things up, now waiting for results queue to drain 18285 1726853395.68044: results queue empty 18285 1726853395.68045: checking for any_errors_fatal 18285 1726853395.68047: done checking for any_errors_fatal 18285 1726853395.68048: checking for 
max_fail_percentage 18285 1726853395.68051: done checking for max_fail_percentage 18285 1726853395.68052: checking to see if all hosts have failed and the running result is not ok 18285 1726853395.68058: done checking to see if all hosts have failed 18285 1726853395.68059: getting the remaining hosts for this loop 18285 1726853395.68060: done getting the remaining hosts for this loop 18285 1726853395.68063: getting the next task for host managed_node1 18285 1726853395.68067: done getting next task for host managed_node1 18285 1726853395.68069: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18285 1726853395.68073: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853395.68075: getting variables 18285 1726853395.68076: in VariableManager get_vars() 18285 1726853395.68084: Calling all_inventory to load vars for managed_node1 18285 1726853395.68086: Calling groups_inventory to load vars for managed_node1 18285 1726853395.68089: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853395.68094: Calling all_plugins_play to load vars for managed_node1 18285 1726853395.68097: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853395.68099: Calling groups_plugins_play to load vars for managed_node1 18285 1726853395.68256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.68473: done with get_vars() 18285 1726853395.68481: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:10 Friday 20 September 2024 13:29:55 -0400 (0:00:01.600) 
0:00:01.621 ****** 18285 1726853395.68558: entering _queue_task() for managed_node1/include_tasks 18285 1726853395.68559: Creating lock for include_tasks 18285 1726853395.68857: worker is 1 (out of 1 available) 18285 1726853395.68979: exiting _queue_task() for managed_node1/include_tasks 18285 1726853395.68988: done queuing things up, now waiting for results queue to drain 18285 1726853395.68989: waiting for pending results... 18285 1726853395.69204: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 18285 1726853395.69304: in run() - task 02083763-bbaf-9200-7ca6-000000000006 18285 1726853395.69309: variable 'ansible_search_path' from source: unknown 18285 1726853395.69345: calling self._execute() 18285 1726853395.69656: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853395.69660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853395.69663: variable 'omit' from source: magic vars 18285 1726853395.69992: _execute() done 18285 1726853395.70197: dumping result to json 18285 1726853395.70201: done dumping result, returning 18285 1726853395.70204: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-9200-7ca6-000000000006] 18285 1726853395.70206: sending task result for task 02083763-bbaf-9200-7ca6-000000000006 18285 1726853395.70282: done sending task result for task 02083763-bbaf-9200-7ca6-000000000006 18285 1726853395.70285: WORKER PROCESS EXITING 18285 1726853395.70340: no more pending results, returning what we have 18285 1726853395.70346: in VariableManager get_vars() 18285 1726853395.70382: Calling all_inventory to load vars for managed_node1 18285 1726853395.70385: Calling groups_inventory to load vars for managed_node1 18285 1726853395.70389: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853395.70403: Calling all_plugins_play to load vars for managed_node1 18285 1726853395.70407: 
Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853395.70410: Calling groups_plugins_play to load vars for managed_node1 18285 1726853395.70782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.71256: done with get_vars() 18285 1726853395.71264: variable 'ansible_search_path' from source: unknown 18285 1726853395.71278: we have included files to process 18285 1726853395.71279: generating all_blocks data 18285 1726853395.71280: done generating all_blocks data 18285 1726853395.71281: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18285 1726853395.71282: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18285 1726853395.71284: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18285 1726853395.71992: in VariableManager get_vars() 18285 1726853395.72007: done with get_vars() 18285 1726853395.72018: done processing included file 18285 1726853395.72019: iterating over new_blocks loaded from include file 18285 1726853395.72021: in VariableManager get_vars() 18285 1726853395.72029: done with get_vars() 18285 1726853395.72031: filtering new block on tags 18285 1726853395.72044: done filtering new block on tags 18285 1726853395.72047: in VariableManager get_vars() 18285 1726853395.72058: done with get_vars() 18285 1726853395.72060: filtering new block on tags 18285 1726853395.72076: done filtering new block on tags 18285 1726853395.72079: in VariableManager get_vars() 18285 1726853395.72088: done with get_vars() 18285 1726853395.72089: filtering new block on tags 18285 1726853395.72101: done filtering new block on tags 18285 1726853395.72103: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 18285 1726853395.72108: extending task lists for all hosts with included blocks 18285 1726853395.72155: done extending task lists 18285 1726853395.72156: done processing included files 18285 1726853395.72157: results queue empty 18285 1726853395.72158: checking for any_errors_fatal 18285 1726853395.72159: done checking for any_errors_fatal 18285 1726853395.72159: checking for max_fail_percentage 18285 1726853395.72160: done checking for max_fail_percentage 18285 1726853395.72161: checking to see if all hosts have failed and the running result is not ok 18285 1726853395.72162: done checking to see if all hosts have failed 18285 1726853395.72162: getting the remaining hosts for this loop 18285 1726853395.72163: done getting the remaining hosts for this loop 18285 1726853395.72166: getting the next task for host managed_node1 18285 1726853395.72169: done getting next task for host managed_node1 18285 1726853395.72173: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18285 1726853395.72175: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853395.72177: getting variables 18285 1726853395.72178: in VariableManager get_vars() 18285 1726853395.72186: Calling all_inventory to load vars for managed_node1 18285 1726853395.72188: Calling groups_inventory to load vars for managed_node1 18285 1726853395.72190: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853395.72195: Calling all_plugins_play to load vars for managed_node1 18285 1726853395.72197: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853395.72200: Calling groups_plugins_play to load vars for managed_node1 18285 1726853395.72359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.72545: done with get_vars() 18285 1726853395.72555: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:29:55 -0400 (0:00:00.040) 0:00:01.662 ****** 18285 1726853395.72618: entering _queue_task() for managed_node1/setup 18285 1726853395.72875: worker is 1 (out of 1 available) 18285 1726853395.72887: exiting _queue_task() for managed_node1/setup 18285 1726853395.72899: done queuing things up, now waiting for results queue to drain 18285 1726853395.72900: waiting for pending results... 
18285 1726853395.73303: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 18285 1726853395.73345: in run() - task 02083763-bbaf-9200-7ca6-00000000008d 18285 1726853395.73578: variable 'ansible_search_path' from source: unknown 18285 1726853395.73581: variable 'ansible_search_path' from source: unknown 18285 1726853395.73584: calling self._execute() 18285 1726853395.73692: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853395.73705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853395.73723: variable 'omit' from source: magic vars 18285 1726853395.74519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853395.77340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853395.77408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853395.77446: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853395.77501: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853395.77533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853395.77622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853395.77692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853395.77696: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853395.77738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853395.77762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853395.78244: variable 'ansible_facts' from source: unknown 18285 1726853395.78394: variable 'network_test_required_facts' from source: task vars 18285 1726853395.78435: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 18285 1726853395.78485: when evaluation is False, skipping this task 18285 1726853395.78494: _execute() done 18285 1726853395.78502: dumping result to json 18285 1726853395.78777: done dumping result, returning 18285 1726853395.78781: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-9200-7ca6-00000000008d] 18285 1726853395.78784: sending task result for task 02083763-bbaf-9200-7ca6-00000000008d 18285 1726853395.78854: done sending task result for task 02083763-bbaf-9200-7ca6-00000000008d 18285 1726853395.78857: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 18285 1726853395.78946: no more pending results, returning what we have 18285 1726853395.78950: results queue empty 18285 1726853395.78951: checking for any_errors_fatal 18285 1726853395.78952: 
done checking for any_errors_fatal 18285 1726853395.78953: checking for max_fail_percentage 18285 1726853395.78955: done checking for max_fail_percentage 18285 1726853395.78955: checking to see if all hosts have failed and the running result is not ok 18285 1726853395.78956: done checking to see if all hosts have failed 18285 1726853395.78957: getting the remaining hosts for this loop 18285 1726853395.78958: done getting the remaining hosts for this loop 18285 1726853395.78963: getting the next task for host managed_node1 18285 1726853395.78974: done getting next task for host managed_node1 18285 1726853395.78977: ^ task is: TASK: Check if system is ostree 18285 1726853395.78980: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853395.78984: getting variables 18285 1726853395.78986: in VariableManager get_vars() 18285 1726853395.79013: Calling all_inventory to load vars for managed_node1 18285 1726853395.79015: Calling groups_inventory to load vars for managed_node1 18285 1726853395.79020: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853395.79033: Calling all_plugins_play to load vars for managed_node1 18285 1726853395.79037: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853395.79041: Calling groups_plugins_play to load vars for managed_node1 18285 1726853395.79591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853395.79828: done with get_vars() 18285 1726853395.79837: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:29:55 -0400 (0:00:00.073) 0:00:01.735 ****** 18285 1726853395.79935: entering _queue_task() for managed_node1/stat 18285 1726853395.80161: worker is 1 (out of 1 available) 18285 1726853395.80376: exiting _queue_task() for managed_node1/stat 18285 1726853395.80385: done queuing things up, now waiting for results queue to drain 18285 1726853395.80387: waiting for pending results... 
18285 1726853395.80421: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 18285 1726853395.80511: in run() - task 02083763-bbaf-9200-7ca6-00000000008f 18285 1726853395.80529: variable 'ansible_search_path' from source: unknown 18285 1726853395.80535: variable 'ansible_search_path' from source: unknown 18285 1726853395.80569: calling self._execute() 18285 1726853395.80641: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853395.80719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853395.80723: variable 'omit' from source: magic vars 18285 1726853395.81109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18285 1726853395.81344: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18285 1726853395.81398: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18285 1726853395.81435: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18285 1726853395.81473: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18285 1726853395.81554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18285 1726853395.81588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18285 1726853395.81618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853395.81647: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18285 1726853395.81761: Evaluated conditional (not __network_is_ostree is defined): True 18285 1726853395.81805: variable 'omit' from source: magic vars 18285 1726853395.81817: variable 'omit' from source: magic vars 18285 1726853395.81857: variable 'omit' from source: magic vars 18285 1726853395.81886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18285 1726853395.81922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18285 1726853395.81963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18285 1726853395.82022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18285 1726853395.82026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18285 1726853395.82030: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18285 1726853395.82037: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853395.82045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853395.82148: Set connection var ansible_timeout to 10 18285 1726853395.82160: Set connection var ansible_shell_executable to /bin/sh 18285 1726853395.82169: Set connection var ansible_pipelining to False 18285 1726853395.82240: Set connection var ansible_shell_type to sh 18285 1726853395.82243: Set connection var ansible_module_compression to ZIP_DEFLATED 18285 1726853395.82246: Set connection var ansible_connection to ssh 18285 1726853395.82248: variable 'ansible_shell_executable' from source: unknown 18285 1726853395.82250: variable 'ansible_connection' from 
source: unknown 18285 1726853395.82252: variable 'ansible_module_compression' from source: unknown 18285 1726853395.82254: variable 'ansible_shell_type' from source: unknown 18285 1726853395.82256: variable 'ansible_shell_executable' from source: unknown 18285 1726853395.82258: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853395.82259: variable 'ansible_pipelining' from source: unknown 18285 1726853395.82261: variable 'ansible_timeout' from source: unknown 18285 1726853395.82263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853395.82396: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 18285 1726853395.82412: variable 'omit' from source: magic vars 18285 1726853395.82421: starting attempt loop 18285 1726853395.82427: running the handler 18285 1726853395.82444: _low_level_execute_command(): starting 18285 1726853395.82459: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18285 1726853395.83221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18285 1726853395.83287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853395.83339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853395.83362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853395.83381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853395.83459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18285 1726853395.85812: stdout chunk (state=3): >>>/root <<< 18285 1726853395.85933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853395.85974: stdout chunk (state=3): >>><<< 18285 1726853395.85981: stderr chunk (state=3): >>><<< 18285 1726853395.86000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18285 1726853395.86067: _low_level_execute_command(): starting 18285 1726853395.86072: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632 `" && echo ansible-tmp-1726853395.8602123-18366-122270721351632="` echo /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632 `" ) && sleep 0' 18285 1726853395.86653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18285 1726853395.86668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853395.86686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18285 1726853395.86704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18285 1726853395.86729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 18285 1726853395.86741: stderr chunk (state=3): >>>debug2: match not found <<< 18285 1726853395.86840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 18285 1726853395.86861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853395.86898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853395.86976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18285 1726853395.89681: stdout chunk (state=3): >>>ansible-tmp-1726853395.8602123-18366-122270721351632=/root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632 <<< 18285 1726853395.89821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853395.90176: stdout chunk (state=3): >>><<< 18285 1726853395.90180: stderr chunk (state=3): >>><<< 18285 1726853395.90183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853395.8602123-18366-122270721351632=/root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18285 1726853395.90185: variable 'ansible_module_compression' from source: unknown 18285 1726853395.90187: ANSIBALLZ: Using lock for stat 18285 1726853395.90189: ANSIBALLZ: Acquiring lock 18285 1726853395.90191: ANSIBALLZ: Lock acquired: 140256816814400 18285 1726853395.90193: ANSIBALLZ: Creating module 18285 1726853396.20284: ANSIBALLZ: Writing module into payload 18285 1726853396.20375: ANSIBALLZ: Writing module 18285 1726853396.20833: ANSIBALLZ: Renaming module 18285 1726853396.20839: ANSIBALLZ: Done creating module 18285 1726853396.20859: variable 'ansible_facts' from source: unknown 18285 1726853396.21145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py 18285 1726853396.21593: Sending initial data 18285 1726853396.21596: Sent initial data (153 bytes) 18285 1726853396.22765: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853396.22798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a2da574bb2' <<< 18285 1726853396.22884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18285 1726853396.22890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18285 1726853396.22959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853396.24732: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py" <<< 18285 1726853396.24742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18285ef0wk9dz/tmpiulvcx3j /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py <<< 18285 1726853396.24763: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18285ef0wk9dz/tmpiulvcx3j" to remote "/root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py" <<< 18285 1726853396.26510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853396.26535: stderr chunk (state=3): >>><<< 18285 1726853396.26539: stdout chunk (state=3): >>><<< 18285 1726853396.26572: done transferring module to remote 18285 1726853396.26586: _low_level_execute_command(): starting 18285 1726853396.26591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/ /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py && sleep 0' 18285 1726853396.27959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853396.27963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 18285 1726853396.27994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853396.28085: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853396.28089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853396.28415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853396.30134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18285 1726853396.30142: stderr chunk (state=3): >>><<< 18285 1726853396.30145: stdout chunk (state=3): >>><<< 18285 1726853396.30169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18285 1726853396.30174: _low_level_execute_command(): starting 18285 1726853396.30180: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/AnsiballZ_stat.py && sleep 0' 18285 1726853396.32063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18285 1726853396.32067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18285 1726853396.32157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18285 1726853396.34353: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18285 1726853396.34356: stdout 
chunk (state=3): >>>import _imp # builtin <<< 18285 1726853396.34389: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 18285 1726853396.34563: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 18285 1726853396.34567: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 18285 1726853396.34631: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853396.34653: stdout chunk (state=3): >>>import '_codecs' # <<< 18285 1726853396.34674: stdout chunk (state=3): >>>import 'codecs' # <<< 18285 1726853396.34710: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18285 1726853396.34890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0638bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 18285 1726853396.34895: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 18285 1726853396.35019: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 18285 1726853396.35044: stdout chunk 
(state=3): >>>import 'os' # <<< 18285 1726853396.35105: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 18285 1726853396.35108: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 18285 1726853396.35291: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063cdfa0> import 'site' # <<< 18285 1726853396.35325: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 18285 1726853396.35526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18285 1726853396.35530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18285 1726853396.35564: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853396.35653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18285 1726853396.35780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061abe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18285 1726853396.35796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061abf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18285 1726853396.35839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18285 1726853396.35889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853396.35980: stdout chunk (state=3): 
>>>import 'itertools' # <<< 18285 1726853396.36088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061e3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061e3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061c3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061c1280> <<< 18285 1726853396.36158: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061a9040> <<< 18285 1726853396.36208: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18285 1726853396.36316: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18285 1726853396.36370: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06203800> <<< 18285 1726853396.36386: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06202420> 
<<< 18285 1726853396.36393: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061c2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06200b60> <<< 18285 1726853396.36563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18285 1726853396.36567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06238860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061a82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18285 1726853396.36570: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06238d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06238bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06238f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061a6de0> <<< 18285 
1726853396.37003: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 18285 1726853396.37113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06239610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d062392e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0623a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06250710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06251df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06252c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d062532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d062521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06253d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d062534a0> <<< 18285 1726853396.37224: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0623a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18285 1726853396.37240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18285 1726853396.37290: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # 
extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05fcfc50> <<< 18285 1726853396.37301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 18285 1726853396.37315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18285 1726853396.37586: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff87a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ff8500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18285 1726853396.37594: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18285 1726853396.37686: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff90d0> <<< 18285 1726853396.37768: stdout chunk (state=3): >>># extension module '_blake2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff9ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ff8980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05fcddf0> <<< 18285 1726853396.37780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18285 1726853396.37803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18285 1726853396.37814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18285 1726853396.37837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ffaea0> <<< 18285 1726853396.37907: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ff9be0> <<< 18285 1726853396.37910: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0623ac30> <<< 18285 1726853396.38015: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18285 1726853396.38162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18285 1726853396.38168: stdout chunk 
(state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060231d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18285 1726853396.38201: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06047590> <<< 18285 1726853396.38225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18285 1726853396.38341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18285 1726853396.38344: stdout chunk (state=3): >>>import 'ntpath' # <<< 18285 1726853396.38359: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 18285 1726853396.38372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060a82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18285 1726853396.38403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18285 1726853396.38475: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18285 1726853396.38568: stdout chunk (state=3): >>>import 'ipaddress' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060aaa20> <<< 18285 1726853396.38694: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060a83e0> <<< 18285 1726853396.38906: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0606d310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059253a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06046390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ffbda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18285 1726853396.38985: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6d06046990> <<< 18285 1726853396.39344: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_4jam7pzq/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available <<< 18285 1726853396.39361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 18285 1726853396.39378: stdout chunk (state=3): >>> <<< 18285 1726853396.39461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18285 1726853396.39592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18285 1726853396.39644: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/abc.py <<< 18285 1726853396.39665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0597b080><<< 18285 1726853396.39683: stdout chunk (state=3): >>> <<< 18285 1726853396.39789: stdout chunk (state=3): >>>import '_typing' # <<< 18285 1726853396.40006: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05959f70> <<< 18285 1726853396.40019: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05959100> <<< 18285 1726853396.40114: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 18285 1726853396.40134: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.40190: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853396.40241: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 18285 1726853396.42266: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.43289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05978f20> <<< 18285 1726853396.43305: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853396.43347: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 18285 1726853396.43353: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18285 1726853396.43368: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 18285 1726853396.43405: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d059a29c0> <<< 18285 1726853396.43504: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a2780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a20f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18285 1726853396.43594: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a2ae0> <<< 18285 1726853396.43597: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0597bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d059a3710> <<< 18285 1726853396.43653: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d059a3950> <<< 18285 1726853396.43664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18285 1726853396.43762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18285 1726853396.43769: stdout chunk (state=3): >>>import '_locale' # <<< 18285 1726853396.43782: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18285 1726853396.43802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18285 1726853396.43840: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0580dbe0> <<< 18285 1726853396.43882: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0580f7d0> <<< 18285 1726853396.43899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 18285 1726853396.43995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058141d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18285 1726853396.44013: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18285 1726853396.44016: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05815340> <<< 18285 1726853396.44041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18285 1726853396.44090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 18285 1726853396.44308: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18285 1726853396.44311: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05817e00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05959070> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058160c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 18285 1726853396.44318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 18285 1726853396.44321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18285 1726853396.44342: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 18285 1726853396.44359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 18285 1726853396.44373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 18285 1726853396.44391: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581bce0> <<< 18285 1726853396.44402: stdout chunk (state=3): >>>import '_tokenize' # <<< 18285 1726853396.44538: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581a510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18285 1726853396.44641: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581aa80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058165d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05863fb0> <<< 18285 1726853396.44677: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05864110> <<< 18285 1726853396.44757: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18285 1726853396.44848: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05865bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05865970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18285 1726853396.44937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18285 1726853396.44993: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058680e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05866270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18285 1726853396.45311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586b830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05868200> <<< 18285 1726853396.45520: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586c5f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586c860> <<< 18285 1726853396.45728: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586cbf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058642f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' 
# extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058f82f0> <<< 18285 1726853396.45975: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18285 1726853396.45987: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058f9910> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586ea80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586fe30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586e6c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18285 1726853396.46100: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.46244: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 18285 1726853396.46280: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853396.46304: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 18285 1726853396.46354: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.46495: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.46678: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.47807: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.48216: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18285 1726853396.48246: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 18285 1726853396.48261: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18285 1726853396.48280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853396.48388: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058fdbb0> <<< 18285 1726853396.48406: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 18285 1726853396.48444: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058fe840> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058f9a60> <<< 18285 1726853396.48487: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 18285 1726853396.48554: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 
18285 1726853396.48792: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.48901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058fe540> # zipimport: zlib available <<< 18285 1726853396.49338: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.49972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18285 1726853396.50100: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 18285 1726853396.50190: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 18285 1726853396.50303: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.50316: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.50443: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18285 1726853396.50448: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.50463: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18285 1726853396.50494: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.50543: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.50596: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18285 1726853396.50640: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.51063: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.51384: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18285 1726853396.51503: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 18285 1726853396.51546: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058ffa70> <<< 18285 1726853396.51586: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.51722: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.51842: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 18285 1726853396.51984: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18285 1726853396.52017: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.52088: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.52311: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18285 1726853396.52353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18285 1726853396.52585: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0570a450> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05705ca0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' 
# # zipimport: zlib available <<< 18285 1726853396.52667: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.52768: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.52813: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.52884: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 18285 1726853396.52953: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18285 1726853396.52970: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18285 1726853396.53004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18285 1726853396.53090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18285 1726853396.53174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18285 1726853396.53248: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059dec90> <<< 18285 1726853396.53316: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059ee960> <<< 18285 1726853396.53712: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0570a420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586d970> # destroy 
ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18285 1726853396.53839: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.54147: stdout chunk (state=3): >>># zipimport: zlib available <<< 18285 1726853396.54290: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 18285 1726853396.54324: stdout chunk (state=3): >>># destroy __main__ <<< 18285 1726853396.54804: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 18285 1726853396.54808: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 18285 1726853396.54903: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing 
_signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc <<< 18285 1726853396.55076: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch <<< 18285 1726853396.55079: stdout chunk (state=3): >>># cleanup[2] removing errno # cleanup[2] removing zlib <<< 18285 1726853396.55098: stdout chunk (state=3): >>># cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # 
cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 18285 1726853396.55207: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # 
destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] 
removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18285 1726853396.55510: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18285 1726853396.55513: stdout chunk (state=3): >>># destroy _bz2 <<< 18285 1726853396.55540: stdout chunk (state=3): >>># destroy _compression # destroy _lzma <<< 18285 1726853396.55611: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy 
zipfile._path <<< 18285 1726853396.55669: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 18285 1726853396.55703: stdout chunk (state=3): >>># destroy ntpath <<< 18285 1726853396.55722: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 18285 1726853396.55817: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 18285 1726853396.55821: stdout chunk (state=3): >>># destroy json.scanner <<< 18285 1726853396.55824: stdout chunk (state=3): >>># destroy _json <<< 18285 1726853396.55900: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select <<< 18285 1726853396.55904: stdout chunk (state=3): >>># destroy _signal <<< 18285 1726853396.55923: stdout chunk (state=3): >>># destroy _posixsubprocess <<< 18285 1726853396.55926: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 18285 1726853396.55996: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 18285 1726853396.56012: stdout chunk (state=3): >>># destroy selinux <<< 18285 1726853396.56283: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # 
cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre <<< 18285 1726853396.56323: stdout chunk (state=3): >>># cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18285 1726853396.56399: stdout chunk (state=3): >>># destroy sys.monitoring <<< 18285 1726853396.56402: stdout chunk (state=3): >>># destroy _socket <<< 18285 1726853396.56432: stdout chunk (state=3): >>># destroy _collections <<< 18285 1726853396.56487: stdout chunk (state=3): >>># destroy platform <<< 18285 1726853396.56490: stdout chunk (state=3): >>># destroy _uuid <<< 18285 1726853396.56493: stdout chunk (state=3): >>># destroy stat # destroy genericpath <<< 18285 1726853396.56547: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 18285 1726853396.56553: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 18285 1726853396.56637: stdout chunk (state=3): >>># destroy _typing <<< 18285 1726853396.56640: stdout chunk (state=3): >>># destroy _tokenize <<< 18285 1726853396.56759: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18285 1726853396.56793: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 18285 1726853396.56885: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy 
_random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools <<< 18285 1726853396.56981: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18285 1726853396.57473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 18285 1726853396.57532: stderr chunk (state=3): >>><<< 18285 1726853396.57541: stdout chunk (state=3): >>><<< 18285 1726853396.57840: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0638bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' 
# import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d063cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061abe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061abf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061e3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d061e3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061c3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061c1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061a9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06203800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06202420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061c2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06200b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06238860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061a82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06238d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06238bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06238f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d061a6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06239610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d062392e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0623a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06250710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06251df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06252c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d062532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d062521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d06253d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d062534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0623a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05fcfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff87a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ff8500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff90d0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05ff9ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ff8980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05fcddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ffaea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ff9be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0623ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d060231d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06047590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060a82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060aaa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d060a83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0606d310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d059253a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d06046390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05ffbda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6d06046990> # zipimport: found 30 names in '/tmp/ansible_stat_payload_4jam7pzq/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0597b080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05959f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05959100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05978f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d059a29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a2780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a20f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a2ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0597bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d059a3710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d059a3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059a3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0580dbe0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0580f7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058141d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05815340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05817e00> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05959070> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058160c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581bce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581a510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0581aa80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058165d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05863fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05864110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d05865bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05865970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058680e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05866270> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586b830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05868200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586c5f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586c860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586cbf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058642f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058f82f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058f9910> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586ea80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0586fe30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586e6c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d058fdbb0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058fe840> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058f9a60> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058fe540> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d058ffa70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d0570a450> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d05705ca0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059dec90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d059ee960> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0570a420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d0586d970> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed.
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ [... identical interpreter cleanup/destroy trace as above, elided ...] # clear sys.audit hooks
18285 1726853396.58963: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'],
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
18285 1726853396.58966: _low_level_execute_command(): starting
18285 1726853396.58968: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853395.8602123-18366-122270721351632/ > /dev/null 2>&1 && sleep 0'
18285 1726853396.59545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
18285 1726853396.59551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<<
18285 1726853396.59554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18285 1726853396.59557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18285 1726853396.59559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18285 1726853396.59785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
18285 1726853396.59789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18285 1726853396.59791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18285 1726853396.59892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18285 1726853396.62596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18285 1726853396.62600: stdout chunk (state=3): >>><<<
18285 1726853396.62606: stderr chunk (state=3): >>><<<
18285 1726853396.62623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18285 1726853396.62631: handler run complete
18285 1726853396.62659: attempt loop complete, returning result
18285 1726853396.62662: _execute() done
18285 1726853396.62665: dumping result to json
18285 1726853396.62667: done dumping result, returning
18285 1726853396.62705: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [02083763-bbaf-9200-7ca6-00000000008f]
18285 1726853396.62708: sending task result for task 02083763-bbaf-9200-7ca6-00000000008f
18285 1726853396.62862: done sending task result for task 02083763-bbaf-9200-7ca6-00000000008f
18285 1726853396.62864: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
18285 1726853396.62963: no more pending results, returning what we have
18285 1726853396.62966: results queue empty
18285 1726853396.62967: checking for any_errors_fatal
18285 1726853396.62974: done checking for any_errors_fatal
18285 1726853396.62975: checking for max_fail_percentage
18285 1726853396.62976: done checking for max_fail_percentage
18285 1726853396.62977: checking to see if all hosts have failed and the running result is not ok
18285 1726853396.62977: done checking to see if all hosts have failed
18285 1726853396.62978: getting the remaining hosts for this loop
18285 1726853396.62979: done getting the remaining hosts for this loop
18285 1726853396.62984: getting the next task for host managed_node1
18285 1726853396.62988: done getting next task for host managed_node1
18285 1726853396.62991: ^ task is: TASK: Set flag to indicate system is ostree
18285 1726853396.62993: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853396.62996: getting variables
18285 1726853396.62997: in VariableManager get_vars()
18285 1726853396.63021: Calling all_inventory to load vars for managed_node1
18285 1726853396.63023: Calling groups_inventory to load vars for managed_node1
18285 1726853396.63027: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853396.63036: Calling all_plugins_play to load vars for managed_node1
18285 1726853396.63038: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853396.63041: Calling groups_plugins_play to load vars for managed_node1
18285 1726853396.63220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853396.63417: done with get_vars()
18285 1726853396.63428: done getting variables
18285 1726853396.63525: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Friday 20 September 2024  13:29:56 -0400 (0:00:00.836)       0:00:02.571 ******
18285 1726853396.63553: entering _queue_task() for managed_node1/set_fact
18285 1726853396.63555: Creating lock for set_fact
18285 1726853396.63831: worker is 1 (out of 1 available)
18285 1726853396.63843: exiting _queue_task() for managed_node1/set_fact
18285 1726853396.63855: done queuing things up, now waiting for results queue to drain
18285 1726853396.63857: waiting for pending results...
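For reference, the pair of tasks this log is running from el_repo_setup.yml (the `stat` of /run/ostree-booted registered above, and the `set_fact` queued here at line 22) can be reconstructed from the module arguments, registered variable, and evaluated conditional visible in the trace. The actual task YAML is not in the log, so this is only a sketch; the names `__ostree_booted_stat` and `__network_is_ostree` are taken verbatim from the trace:

```yaml
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # module arg shown in _execute_module above
  register: __ostree_booted_stat    # variable name shown as "from source: set_fact"

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # sketch: derivation not shown in log
  when: not __network_is_ostree is defined  # conditional evaluated True in the trace
```

Since `/run/ostree-booted` does not exist on managed_node1 (`"exists": false`), the fact is set to `"__network_is_ostree": false` in the task result that follows.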
18285 1726853396.64203: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree
18285 1726853396.64220: in run() - task 02083763-bbaf-9200-7ca6-000000000090
18285 1726853396.64276: variable 'ansible_search_path' from source: unknown
18285 1726853396.64300: variable 'ansible_search_path' from source: unknown
18285 1726853396.64327: calling self._execute()
18285 1726853396.64515: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853396.64520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853396.64523: variable 'omit' from source: magic vars
18285 1726853396.65260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18285 1726853396.65943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18285 1726853396.66033: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18285 1726853396.66081: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18285 1726853396.66118: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18285 1726853396.66222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18285 1726853396.66256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18285 1726853396.66287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853396.66316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18285 1726853396.66417: Evaluated conditional (not __network_is_ostree is defined): True
18285 1726853396.66422: variable 'omit' from source: magic vars
18285 1726853396.66458: variable 'omit' from source: magic vars
18285 1726853396.66540: variable '__ostree_booted_stat' from source: set_fact
18285 1726853396.66585: variable 'omit' from source: magic vars
18285 1726853396.66606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18285 1726853396.66629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18285 1726853396.66644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18285 1726853396.66656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18285 1726853396.66666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18285 1726853396.66691: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18285 1726853396.66694: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853396.66697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853396.66763: Set connection var ansible_timeout to 10
18285 1726853396.66767: Set connection var ansible_shell_executable to /bin/sh
18285 1726853396.66774: Set connection var ansible_pipelining to False
18285 1726853396.66785: Set connection var ansible_shell_type to sh
18285 1726853396.66788: Set connection var ansible_module_compression to ZIP_DEFLATED
18285 1726853396.66790: Set connection var ansible_connection to ssh
18285 1726853396.66804: variable 'ansible_shell_executable' from source: unknown
18285 1726853396.66807: variable 'ansible_connection' from source: unknown
18285 1726853396.66810: variable 'ansible_module_compression' from source: unknown
18285 1726853396.66812: variable 'ansible_shell_type' from source: unknown
18285 1726853396.66814: variable 'ansible_shell_executable' from source: unknown
18285 1726853396.66817: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853396.66820: variable 'ansible_pipelining' from source: unknown
18285 1726853396.66822: variable 'ansible_timeout' from source: unknown
18285 1726853396.66826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853396.66897: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
18285 1726853396.66905: variable 'omit' from source: magic vars
18285 1726853396.66911: starting attempt loop
18285 1726853396.66913: running the handler
18285 1726853396.66922: handler run complete
18285 1726853396.66929: attempt loop complete, returning result
18285 1726853396.66932: _execute() done
18285 1726853396.66934: dumping result to json
18285 1726853396.66938: done dumping result, returning
18285 1726853396.66946: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-9200-7ca6-000000000090]
18285 1726853396.66951: sending task result for task 02083763-bbaf-9200-7ca6-000000000090
18285 1726853396.67024: done sending task result for task 02083763-bbaf-9200-7ca6-000000000090
18285 1726853396.67028: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
18285 1726853396.67099: no more pending results, returning what we have
18285 1726853396.67102: results queue empty
18285 1726853396.67103: checking for any_errors_fatal
18285 1726853396.67108: done checking for any_errors_fatal
18285 1726853396.67109: checking for max_fail_percentage
18285 1726853396.67111: done checking for max_fail_percentage
18285 1726853396.67111: checking to see if all hosts have failed and the running result is not ok
18285 1726853396.67112: done checking to see if all hosts have failed
18285 1726853396.67112: getting the remaining hosts for this loop
18285 1726853396.67114: done getting the remaining hosts for this loop
18285 1726853396.67118: getting the next task for host managed_node1
18285 1726853396.67126: done getting next task for host managed_node1
18285 1726853396.67128: ^ task is: TASK: Fix CentOS6 Base repo
18285 1726853396.67130: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853396.67133: getting variables
18285 1726853396.67135: in VariableManager get_vars()
18285 1726853396.67163: Calling all_inventory to load vars for managed_node1
18285 1726853396.67166: Calling groups_inventory to load vars for managed_node1
18285 1726853396.67169: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853396.67179: Calling all_plugins_play to load vars for managed_node1
18285 1726853396.67182: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853396.67190: Calling groups_plugins_play to load vars for managed_node1
18285 1726853396.67336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853396.67449: done with get_vars()
18285 1726853396.67457: done getting variables
18285 1726853396.67539: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024  13:29:56 -0400 (0:00:00.040)       0:00:02.612 ******
18285 1726853396.67559: entering _queue_task() for managed_node1/copy
18285 1726853396.67741: worker is 1 (out of 1 available)
18285 1726853396.67753: exiting _queue_task() for managed_node1/copy
18285 1726853396.67763: done queuing things up, now waiting for results queue to drain
18285 1726853396.67765: waiting for pending results...
18285 1726853396.67916: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 18285 1726853396.67991: in run() - task 02083763-bbaf-9200-7ca6-000000000092 18285 1726853396.68008: variable 'ansible_search_path' from source: unknown 18285 1726853396.68011: variable 'ansible_search_path' from source: unknown 18285 1726853396.68035: calling self._execute() 18285 1726853396.68338: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.68342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.68344: variable 'omit' from source: magic vars 18285 1726853396.68815: variable 'ansible_distribution' from source: facts 18285 1726853396.68854: Evaluated conditional (ansible_distribution == 'CentOS'): True 18285 1726853396.68989: variable 'ansible_distribution_major_version' from source: facts 18285 1726853396.69001: Evaluated conditional (ansible_distribution_major_version == '6'): False 18285 1726853396.69008: when evaluation is False, skipping this task 18285 1726853396.69015: _execute() done 18285 1726853396.69021: dumping result to json 18285 1726853396.69030: done dumping result, returning 18285 1726853396.69039: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-9200-7ca6-000000000092] 18285 1726853396.69051: sending task result for task 02083763-bbaf-9200-7ca6-000000000092 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18285 1726853396.69218: no more pending results, returning what we have 18285 1726853396.69221: results queue empty 18285 1726853396.69222: checking for any_errors_fatal 18285 1726853396.69227: done checking for any_errors_fatal 18285 1726853396.69228: checking for max_fail_percentage 18285 1726853396.69229: done checking for max_fail_percentage 18285 1726853396.69230: checking to see if all hosts have failed and the 
running result is not ok 18285 1726853396.69231: done checking to see if all hosts have failed 18285 1726853396.69231: getting the remaining hosts for this loop 18285 1726853396.69233: done getting the remaining hosts for this loop 18285 1726853396.69237: getting the next task for host managed_node1 18285 1726853396.69243: done getting next task for host managed_node1 18285 1726853396.69246: ^ task is: TASK: Include the task 'enable_epel.yml' 18285 1726853396.69252: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853396.69256: getting variables 18285 1726853396.69258: in VariableManager get_vars() 18285 1726853396.69398: Calling all_inventory to load vars for managed_node1 18285 1726853396.69403: Calling groups_inventory to load vars for managed_node1 18285 1726853396.69407: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.69422: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.69426: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.69429: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.69758: done sending task result for task 02083763-bbaf-9200-7ca6-000000000092 18285 1726853396.69762: WORKER PROCESS EXITING 18285 1726853396.69786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.70083: done with get_vars() 18285 1726853396.70094: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:29:56 -0400 (0:00:00.026) 0:00:02.638 ****** 18285 1726853396.70205: entering _queue_task() for managed_node1/include_tasks 18285 1726853396.70602: worker is 1 (out of 1 available) 18285 1726853396.70612: exiting _queue_task() for managed_node1/include_tasks 18285 1726853396.70621: done queuing things up, now waiting for results queue to drain 18285 1726853396.70622: waiting for pending results... 
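The "Fix CentOS6 Base repo" trace above shows Ansible evaluating the task's `when` conditions in order: `ansible_distribution == 'CentOS'` passes, `ansible_distribution_major_version == '6'` fails, and the task is skipped. A minimal sketch of that short-circuit gating (not Ansible's actual implementation; the fact values are taken from this run, where the node reports CentOS with major version 10):

```python
def should_run(facts: dict, conditions) -> bool:
    """Return True only if every `when`-style predicate passes, in order."""
    for cond in conditions:
        if not cond(facts):
            # mirrors "when evaluation is False, skipping this task"
            return False
    return True

# Facts as gathered from managed_node1 in this run (CentOS, major version 10).
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",
}

conditions = [
    lambda f: f["ansible_distribution"] == "CentOS",           # True
    lambda f: f["ansible_distribution_major_version"] == "6",  # False -> skip
]

print(should_run(facts, conditions))  # False: the task is skipped
```

The skip result in the log (`"false_condition": "ansible_distribution_major_version == '6'"`) names exactly the first predicate that evaluated False.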
18285 1726853396.70809: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 18285 1726853396.70928: in run() - task 02083763-bbaf-9200-7ca6-000000000093 18285 1726853396.70945: variable 'ansible_search_path' from source: unknown 18285 1726853396.70962: variable 'ansible_search_path' from source: unknown 18285 1726853396.71002: calling self._execute() 18285 1726853396.71095: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.71146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.71153: variable 'omit' from source: magic vars 18285 1726853396.71745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853396.74235: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853396.74336: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853396.74411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853396.74436: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853396.74577: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853396.74581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853396.74610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853396.74642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853396.74698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853396.74720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853396.74852: variable '__network_is_ostree' from source: set_fact 18285 1726853396.74877: Evaluated conditional (not __network_is_ostree | d(false)): True 18285 1726853396.74921: _execute() done 18285 1726853396.74925: dumping result to json 18285 1726853396.74927: done dumping result, returning 18285 1726853396.74930: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-9200-7ca6-000000000093] 18285 1726853396.74932: sending task result for task 02083763-bbaf-9200-7ca6-000000000093 18285 1726853396.75099: done sending task result for task 02083763-bbaf-9200-7ca6-000000000093 18285 1726853396.75102: WORKER PROCESS EXITING 18285 1726853396.75139: no more pending results, returning what we have 18285 1726853396.75145: in VariableManager get_vars() 18285 1726853396.75183: Calling all_inventory to load vars for managed_node1 18285 1726853396.75186: Calling groups_inventory to load vars for managed_node1 18285 1726853396.75189: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.75202: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.75205: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.75208: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.75718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 18285 1726853396.75924: done with get_vars() 18285 1726853396.75932: variable 'ansible_search_path' from source: unknown 18285 1726853396.75933: variable 'ansible_search_path' from source: unknown 18285 1726853396.75972: we have included files to process 18285 1726853396.75973: generating all_blocks data 18285 1726853396.75975: done generating all_blocks data 18285 1726853396.75980: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18285 1726853396.75981: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18285 1726853396.75983: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18285 1726853396.77680: done processing included file 18285 1726853396.77683: iterating over new_blocks loaded from include file 18285 1726853396.77685: in VariableManager get_vars() 18285 1726853396.77699: done with get_vars() 18285 1726853396.77701: filtering new block on tags 18285 1726853396.77727: done filtering new block on tags 18285 1726853396.77730: in VariableManager get_vars() 18285 1726853396.77742: done with get_vars() 18285 1726853396.77743: filtering new block on tags 18285 1726853396.77762: done filtering new block on tags 18285 1726853396.77765: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 18285 1726853396.77772: extending task lists for all hosts with included blocks 18285 1726853396.77876: done extending task lists 18285 1726853396.77878: done processing included files 18285 1726853396.77879: results queue empty 18285 1726853396.77879: checking for any_errors_fatal 18285 1726853396.77883: done checking for any_errors_fatal 18285 1726853396.77883: checking for max_fail_percentage 18285 1726853396.77884: done 
checking for max_fail_percentage 18285 1726853396.77885: checking to see if all hosts have failed and the running result is not ok 18285 1726853396.77886: done checking to see if all hosts have failed 18285 1726853396.77887: getting the remaining hosts for this loop 18285 1726853396.77888: done getting the remaining hosts for this loop 18285 1726853396.77890: getting the next task for host managed_node1 18285 1726853396.77894: done getting next task for host managed_node1 18285 1726853396.77896: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18285 1726853396.77899: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853396.77901: getting variables 18285 1726853396.77902: in VariableManager get_vars() 18285 1726853396.77909: Calling all_inventory to load vars for managed_node1 18285 1726853396.77912: Calling groups_inventory to load vars for managed_node1 18285 1726853396.77914: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.77919: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.77927: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.77929: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.78103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.78296: done with get_vars() 18285 1726853396.78314: done getting variables 18285 1726853396.78380: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 18285 1726853396.78589: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:29:56 -0400 (0:00:00.084) 0:00:02.722 ****** 18285 1726853396.78649: entering _queue_task() for managed_node1/command 18285 1726853396.78651: Creating lock for command 18285 1726853396.79094: worker is 1 (out of 1 available) 18285 1726853396.79105: exiting _queue_task() for managed_node1/command 18285 1726853396.79116: done queuing things up, now waiting for results queue to drain 18285 1726853396.79118: waiting for pending results... 
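The include above ran because the conditional `not __network_is_ostree | d(false)` evaluated True: the earlier `set_fact` task had set `__network_is_ostree` to `false`, and the `d` (alias of `default`) filter would have supplied `false` even if the fact were unset. A hedged sketch of that filter's semantics (simplified; Jinja2's real filter also has a boolean second parameter for treating empty values as undefined):

```python
class Undefined:
    """Stand-in for a Jinja2 undefined variable."""
    pass

def default(value, fallback):
    """Return the fallback when the value is undefined, else the value itself."""
    return fallback if isinstance(value, Undefined) else value

# In this run set_fact produced __network_is_ostree = False:
print(not default(False, False))        # True -> enable_epel.yml is included

# Had the fact never been set, the filter would supply the fallback:
print(not default(Undefined(), False))  # still True
```

Either way the conditional passes, which is why the log shows `Evaluated conditional (not __network_is_ostree | d(false)): True` followed by the include of `enable_epel.yml`.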
18285 1726853396.79401: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 18285 1726853396.79409: in run() - task 02083763-bbaf-9200-7ca6-0000000000ad 18285 1726853396.79428: variable 'ansible_search_path' from source: unknown 18285 1726853396.79492: variable 'ansible_search_path' from source: unknown 18285 1726853396.79495: calling self._execute() 18285 1726853396.79562: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.79578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.79596: variable 'omit' from source: magic vars 18285 1726853396.79993: variable 'ansible_distribution' from source: facts 18285 1726853396.80008: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18285 1726853396.80363: variable 'ansible_distribution_major_version' from source: facts 18285 1726853396.80368: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18285 1726853396.80372: when evaluation is False, skipping this task 18285 1726853396.80375: _execute() done 18285 1726853396.80377: dumping result to json 18285 1726853396.80379: done dumping result, returning 18285 1726853396.80382: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-9200-7ca6-0000000000ad] 18285 1726853396.80384: sending task result for task 02083763-bbaf-9200-7ca6-0000000000ad 18285 1726853396.80452: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000ad 18285 1726853396.80455: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18285 1726853396.80514: no more pending results, returning what we have 18285 1726853396.80517: results queue empty 18285 1726853396.80518: checking for any_errors_fatal 18285 1726853396.80519: done checking for any_errors_fatal 18285 1726853396.80520: checking for 
max_fail_percentage 18285 1726853396.80522: done checking for max_fail_percentage 18285 1726853396.80522: checking to see if all hosts have failed and the running result is not ok 18285 1726853396.80523: done checking to see if all hosts have failed 18285 1726853396.80524: getting the remaining hosts for this loop 18285 1726853396.80525: done getting the remaining hosts for this loop 18285 1726853396.80528: getting the next task for host managed_node1 18285 1726853396.80535: done getting next task for host managed_node1 18285 1726853396.80537: ^ task is: TASK: Install yum-utils package 18285 1726853396.80541: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853396.80545: getting variables 18285 1726853396.80547: in VariableManager get_vars() 18285 1726853396.80693: Calling all_inventory to load vars for managed_node1 18285 1726853396.80696: Calling groups_inventory to load vars for managed_node1 18285 1726853396.80700: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.80714: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.80718: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.80721: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.81300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.81728: done with get_vars() 18285 1726853396.81737: done getting variables 18285 1726853396.81954: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:29:56 -0400 (0:00:00.033) 0:00:02.756 ****** 18285 1726853396.82024: entering _queue_task() for managed_node1/package 18285 1726853396.82026: Creating lock for package 18285 1726853396.82442: worker is 1 (out of 1 available) 18285 1726853396.82453: exiting _queue_task() for managed_node1/package 18285 1726853396.82463: done queuing things up, now waiting for results queue to drain 18285 1726853396.82464: waiting for pending results... 
18285 1726853396.82759: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 18285 1726853396.82765: in run() - task 02083763-bbaf-9200-7ca6-0000000000ae 18285 1726853396.82768: variable 'ansible_search_path' from source: unknown 18285 1726853396.82770: variable 'ansible_search_path' from source: unknown 18285 1726853396.82805: calling self._execute() 18285 1726853396.82968: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.82974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.82978: variable 'omit' from source: magic vars 18285 1726853396.83306: variable 'ansible_distribution' from source: facts 18285 1726853396.83324: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18285 1726853396.83460: variable 'ansible_distribution_major_version' from source: facts 18285 1726853396.83473: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18285 1726853396.83481: when evaluation is False, skipping this task 18285 1726853396.83488: _execute() done 18285 1726853396.83495: dumping result to json 18285 1726853396.83516: done dumping result, returning 18285 1726853396.83527: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-9200-7ca6-0000000000ae] 18285 1726853396.83537: sending task result for task 02083763-bbaf-9200-7ca6-0000000000ae skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18285 1726853396.83772: no more pending results, returning what we have 18285 1726853396.83776: results queue empty 18285 1726853396.83777: checking for any_errors_fatal 18285 1726853396.83973: done checking for any_errors_fatal 18285 1726853396.83975: checking for max_fail_percentage 18285 1726853396.83977: done checking for max_fail_percentage 18285 1726853396.83977: checking to see if 
all hosts have failed and the running result is not ok 18285 1726853396.83978: done checking to see if all hosts have failed 18285 1726853396.83979: getting the remaining hosts for this loop 18285 1726853396.83980: done getting the remaining hosts for this loop 18285 1726853396.83984: getting the next task for host managed_node1 18285 1726853396.83989: done getting next task for host managed_node1 18285 1726853396.83991: ^ task is: TASK: Enable EPEL 7 18285 1726853396.83995: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853396.83997: getting variables 18285 1726853396.83999: in VariableManager get_vars() 18285 1726853396.84022: Calling all_inventory to load vars for managed_node1 18285 1726853396.84024: Calling groups_inventory to load vars for managed_node1 18285 1726853396.84027: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.84036: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.84039: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.84042: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.84231: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000ae 18285 1726853396.84234: WORKER PROCESS EXITING 18285 1726853396.84256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.84590: done with get_vars() 18285 1726853396.84599: done getting variables 18285 1726853396.84769: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:29:56 -0400 (0:00:00.028) 0:00:02.785 ****** 18285 1726853396.84862: entering _queue_task() for managed_node1/command 18285 1726853396.85418: worker is 1 (out of 1 available) 18285 1726853396.85431: exiting _queue_task() for managed_node1/command 18285 1726853396.85504: done queuing things up, now waiting for results queue to drain 18285 1726853396.85506: waiting for pending results... 
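Every EPEL setup task in this section ("Create EPEL 10", "Install yum-utils package", "Enable EPEL 7", "Enable EPEL 8") skips for the same reason: the distribution check `ansible_distribution in ['RedHat', 'CentOS']` passes, but the membership test `ansible_distribution_major_version in ['7', '8']` fails on this major-version-10 node. A minimal sketch of that shared gate (an illustration, not the role's code):

```python
def epel_task_runs(facts: dict, supported_versions: list) -> bool:
    """Gate used by the EPEL tasks above: right distro AND a supported major version."""
    return (
        facts["ansible_distribution"] in ["RedHat", "CentOS"]
        and facts["ansible_distribution_major_version"] in supported_versions
    )

facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",
}

print(epel_task_runs(facts, ["7", "8"]))  # False: each EPEL task is skipped
```

Note the membership test compares strings, so `"10" in ['7', '8']` is False even though the version check would need updating, not the comparison, to cover newer releases.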
18285 1726853396.86087: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 18285 1726853396.86094: in run() - task 02083763-bbaf-9200-7ca6-0000000000af 18285 1726853396.86097: variable 'ansible_search_path' from source: unknown 18285 1726853396.86100: variable 'ansible_search_path' from source: unknown 18285 1726853396.86102: calling self._execute() 18285 1726853396.86284: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.86336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.86388: variable 'omit' from source: magic vars 18285 1726853396.87193: variable 'ansible_distribution' from source: facts 18285 1726853396.87290: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18285 1726853396.87757: variable 'ansible_distribution_major_version' from source: facts 18285 1726853396.87760: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18285 1726853396.87762: when evaluation is False, skipping this task 18285 1726853396.87764: _execute() done 18285 1726853396.87766: dumping result to json 18285 1726853396.87768: done dumping result, returning 18285 1726853396.87772: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-9200-7ca6-0000000000af] 18285 1726853396.87775: sending task result for task 02083763-bbaf-9200-7ca6-0000000000af 18285 1726853396.87841: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000af 18285 1726853396.87845: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18285 1726853396.87907: no more pending results, returning what we have 18285 1726853396.87910: results queue empty 18285 1726853396.87911: checking for any_errors_fatal 18285 1726853396.87916: done checking for any_errors_fatal 18285 1726853396.87917: checking for 
max_fail_percentage 18285 1726853396.87919: done checking for max_fail_percentage 18285 1726853396.87919: checking to see if all hosts have failed and the running result is not ok 18285 1726853396.87920: done checking to see if all hosts have failed 18285 1726853396.87921: getting the remaining hosts for this loop 18285 1726853396.87923: done getting the remaining hosts for this loop 18285 1726853396.87927: getting the next task for host managed_node1 18285 1726853396.87934: done getting next task for host managed_node1 18285 1726853396.87937: ^ task is: TASK: Enable EPEL 8 18285 1726853396.87941: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853396.87946: getting variables 18285 1726853396.87947: in VariableManager get_vars() 18285 1726853396.87983: Calling all_inventory to load vars for managed_node1 18285 1726853396.87985: Calling groups_inventory to load vars for managed_node1 18285 1726853396.87989: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.88004: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.88007: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.88010: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.88431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.89079: done with get_vars() 18285 1726853396.89090: done getting variables 18285 1726853396.89147: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:29:56 -0400 (0:00:00.043) 0:00:02.828 ****** 18285 1726853396.89183: entering _queue_task() for managed_node1/command 18285 1726853396.89869: worker is 1 (out of 1 available) 18285 1726853396.90080: exiting _queue_task() for managed_node1/command 18285 1726853396.90090: done queuing things up, now waiting for results queue to drain 18285 1726853396.90091: waiting for pending results... 
18285 1726853396.90339: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 18285 1726853396.90613: in run() - task 02083763-bbaf-9200-7ca6-0000000000b0 18285 1726853396.90714: variable 'ansible_search_path' from source: unknown 18285 1726853396.90717: variable 'ansible_search_path' from source: unknown 18285 1726853396.90744: calling self._execute() 18285 1726853396.91077: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.91081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.91083: variable 'omit' from source: magic vars 18285 1726853396.91740: variable 'ansible_distribution' from source: facts 18285 1726853396.91768: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18285 1726853396.92007: variable 'ansible_distribution_major_version' from source: facts 18285 1726853396.92018: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18285 1726853396.92024: when evaluation is False, skipping this task 18285 1726853396.92030: _execute() done 18285 1726853396.92036: dumping result to json 18285 1726853396.92042: done dumping result, returning 18285 1726853396.92058: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-9200-7ca6-0000000000b0] 18285 1726853396.92068: sending task result for task 02083763-bbaf-9200-7ca6-0000000000b0 18285 1726853396.92335: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000b0 18285 1726853396.92338: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18285 1726853396.92418: no more pending results, returning what we have 18285 1726853396.92421: results queue empty 18285 1726853396.92422: checking for any_errors_fatal 18285 1726853396.92427: done checking for any_errors_fatal 18285 1726853396.92428: checking for 
max_fail_percentage 18285 1726853396.92430: done checking for max_fail_percentage 18285 1726853396.92431: checking to see if all hosts have failed and the running result is not ok 18285 1726853396.92431: done checking to see if all hosts have failed 18285 1726853396.92432: getting the remaining hosts for this loop 18285 1726853396.92433: done getting the remaining hosts for this loop 18285 1726853396.92437: getting the next task for host managed_node1 18285 1726853396.92446: done getting next task for host managed_node1 18285 1726853396.92448: ^ task is: TASK: Enable EPEL 6 18285 1726853396.92454: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853396.92457: getting variables 18285 1726853396.92459: in VariableManager get_vars() 18285 1726853396.92490: Calling all_inventory to load vars for managed_node1 18285 1726853396.92493: Calling groups_inventory to load vars for managed_node1 18285 1726853396.92496: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.92511: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.92514: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.92517: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.92900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.93378: done with get_vars() 18285 1726853396.93388: done getting variables 18285 1726853396.93444: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:29:56 -0400 (0:00:00.042) 0:00:02.873 ****** 18285 1726853396.93680: entering _queue_task() for managed_node1/copy 18285 1726853396.94145: worker is 1 (out of 1 available) 18285 1726853396.94159: exiting _queue_task() for managed_node1/copy 18285 1726853396.94169: done queuing things up, now waiting for results queue to drain 18285 1726853396.94374: waiting for pending results... 
18285 1726853396.94615: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 18285 1726853396.94903: in run() - task 02083763-bbaf-9200-7ca6-0000000000b2 18285 1726853396.94909: variable 'ansible_search_path' from source: unknown 18285 1726853396.94912: variable 'ansible_search_path' from source: unknown 18285 1726853396.95011: calling self._execute() 18285 1726853396.95070: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.95128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.95378: variable 'omit' from source: magic vars 18285 1726853396.95741: variable 'ansible_distribution' from source: facts 18285 1726853396.95762: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18285 1726853396.95885: variable 'ansible_distribution_major_version' from source: facts 18285 1726853396.95896: Evaluated conditional (ansible_distribution_major_version == '6'): False 18285 1726853396.95904: when evaluation is False, skipping this task 18285 1726853396.95911: _execute() done 18285 1726853396.95924: dumping result to json 18285 1726853396.95932: done dumping result, returning 18285 1726853396.95942: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-9200-7ca6-0000000000b2] 18285 1726853396.95956: sending task result for task 02083763-bbaf-9200-7ca6-0000000000b2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18285 1726853396.96111: no more pending results, returning what we have 18285 1726853396.96116: results queue empty 18285 1726853396.96116: checking for any_errors_fatal 18285 1726853396.96122: done checking for any_errors_fatal 18285 1726853396.96123: checking for max_fail_percentage 18285 1726853396.96124: done checking for max_fail_percentage 18285 1726853396.96125: checking to see if all hosts have failed and the running 
result is not ok 18285 1726853396.96126: done checking to see if all hosts have failed 18285 1726853396.96127: getting the remaining hosts for this loop 18285 1726853396.96128: done getting the remaining hosts for this loop 18285 1726853396.96132: getting the next task for host managed_node1 18285 1726853396.96141: done getting next task for host managed_node1 18285 1726853396.96145: ^ task is: TASK: Set network provider to 'initscripts' 18285 1726853396.96148: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853396.96155: getting variables 18285 1726853396.96157: in VariableManager get_vars() 18285 1726853396.96188: Calling all_inventory to load vars for managed_node1 18285 1726853396.96191: Calling groups_inventory to load vars for managed_node1 18285 1726853396.96195: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.96210: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.96213: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.96216: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.96618: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000b2 18285 1726853396.96621: WORKER PROCESS EXITING 18285 1726853396.96644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.96897: done with get_vars() 18285 1726853396.96906: done getting variables 18285 1726853396.96966: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'initscripts'] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:12 Friday 20 September 2024 13:29:56 -0400 (0:00:00.033) 0:00:02.906 ****** 18285 1726853396.96994: entering _queue_task() for managed_node1/set_fact 18285 1726853396.97233: worker is 1 (out of 1 available) 18285 1726853396.97245: exiting _queue_task() for managed_node1/set_fact 18285 1726853396.97258: done queuing things up, now waiting for results queue to drain 18285 1726853396.97260: waiting for pending results... 18285 1726853396.97507: running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' 18285 1726853396.97595: in run() - task 02083763-bbaf-9200-7ca6-000000000007 18285 1726853396.97614: variable 'ansible_search_path' from source: unknown 18285 1726853396.97654: calling self._execute() 18285 1726853396.97736: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.97752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.97766: variable 'omit' from source: magic vars 18285 1726853396.97877: variable 'omit' from source: magic vars 18285 1726853396.97918: variable 'omit' from source: magic vars 18285 1726853396.97963: variable 'omit' from source: magic vars 18285 1726853396.98011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18285 1726853396.98061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18285 1726853396.98089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18285 1726853396.98110: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18285 1726853396.98132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18285 1726853396.98168: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18285 1726853396.98180: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.98189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.98308: Set connection var ansible_timeout to 10 18285 1726853396.98320: Set connection var ansible_shell_executable to /bin/sh 18285 1726853396.98330: Set connection var ansible_pipelining to False 18285 1726853396.98345: Set connection var ansible_shell_type to sh 18285 1726853396.98361: Set connection var ansible_module_compression to ZIP_DEFLATED 18285 1726853396.98368: Set connection var ansible_connection to ssh 18285 1726853396.98394: variable 'ansible_shell_executable' from source: unknown 18285 1726853396.98402: variable 'ansible_connection' from source: unknown 18285 1726853396.98409: variable 'ansible_module_compression' from source: unknown 18285 1726853396.98415: variable 'ansible_shell_type' from source: unknown 18285 1726853396.98421: variable 'ansible_shell_executable' from source: unknown 18285 1726853396.98428: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853396.98434: variable 'ansible_pipelining' from source: unknown 18285 1726853396.98440: variable 'ansible_timeout' from source: unknown 18285 1726853396.98447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853396.98777: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 18285 1726853396.98781: variable 'omit' from source: magic vars 18285 1726853396.98783: starting attempt loop 18285 1726853396.98786: running the handler 18285 1726853396.98788: handler run complete 18285 1726853396.98790: attempt loop complete, returning result 18285 1726853396.98792: _execute() done 18285 1726853396.98795: dumping result to json 18285 1726853396.98796: done dumping result, returning 18285 1726853396.98798: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' [02083763-bbaf-9200-7ca6-000000000007] 18285 1726853396.98800: sending task result for task 02083763-bbaf-9200-7ca6-000000000007 18285 1726853396.98863: done sending task result for task 02083763-bbaf-9200-7ca6-000000000007 18285 1726853396.98866: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "initscripts" }, "changed": false } 18285 1726853396.98921: no more pending results, returning what we have 18285 1726853396.98924: results queue empty 18285 1726853396.98925: checking for any_errors_fatal 18285 1726853396.98932: done checking for any_errors_fatal 18285 1726853396.98932: checking for max_fail_percentage 18285 1726853396.98934: done checking for max_fail_percentage 18285 1726853396.98935: checking to see if all hosts have failed and the running result is not ok 18285 1726853396.98936: done checking to see if all hosts have failed 18285 1726853396.98936: getting the remaining hosts for this loop 18285 1726853396.98938: done getting the remaining hosts for this loop 18285 1726853396.98941: getting the next task for host managed_node1 18285 1726853396.98948: done getting next task for host managed_node1 18285 1726853396.98953: ^ task is: TASK: meta (flush_handlers) 18285 1726853396.98955: ^ state is: HOST STATE: block=3, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853396.98959: getting variables 18285 1726853396.98960: in VariableManager get_vars() 18285 1726853396.98989: Calling all_inventory to load vars for managed_node1 18285 1726853396.98991: Calling groups_inventory to load vars for managed_node1 18285 1726853396.98995: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853396.99006: Calling all_plugins_play to load vars for managed_node1 18285 1726853396.99010: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853396.99013: Calling groups_plugins_play to load vars for managed_node1 18285 1726853396.99306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853396.99999: done with get_vars() 18285 1726853397.00007: done getting variables 18285 1726853397.00067: in VariableManager get_vars() 18285 1726853397.00378: Calling all_inventory to load vars for managed_node1 18285 1726853397.00381: Calling groups_inventory to load vars for managed_node1 18285 1726853397.00383: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.00388: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.00390: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.00393: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.00523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.00807: done with get_vars() 18285 1726853397.00820: done queuing things up, now waiting for results queue to drain 18285 1726853397.00822: results queue empty 18285 1726853397.00822: checking for any_errors_fatal 18285 1726853397.00824: done 
checking for any_errors_fatal 18285 1726853397.00825: checking for max_fail_percentage 18285 1726853397.00826: done checking for max_fail_percentage 18285 1726853397.00827: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.00827: done checking to see if all hosts have failed 18285 1726853397.00828: getting the remaining hosts for this loop 18285 1726853397.00829: done getting the remaining hosts for this loop 18285 1726853397.00831: getting the next task for host managed_node1 18285 1726853397.00835: done getting next task for host managed_node1 18285 1726853397.00836: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.00837: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.00845: getting variables 18285 1726853397.00846: in VariableManager get_vars() 18285 1726853397.00856: Calling all_inventory to load vars for managed_node1 18285 1726853397.00858: Calling groups_inventory to load vars for managed_node1 18285 1726853397.00860: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.00864: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.00867: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.00869: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.01196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.01706: done with get_vars() 18285 1726853397.01713: done getting variables 18285 1726853397.01760: in VariableManager get_vars() 18285 1726853397.01768: Calling all_inventory to load vars for managed_node1 18285 1726853397.01770: Calling groups_inventory to load vars for managed_node1 
18285 1726853397.01976: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.01980: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.01983: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.01985: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.02109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.02501: done with get_vars() 18285 1726853397.02512: done queuing things up, now waiting for results queue to drain 18285 1726853397.02514: results queue empty 18285 1726853397.02515: checking for any_errors_fatal 18285 1726853397.02516: done checking for any_errors_fatal 18285 1726853397.02517: checking for max_fail_percentage 18285 1726853397.02518: done checking for max_fail_percentage 18285 1726853397.02518: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.02519: done checking to see if all hosts have failed 18285 1726853397.02520: getting the remaining hosts for this loop 18285 1726853397.02521: done getting the remaining hosts for this loop 18285 1726853397.02523: getting the next task for host managed_node1 18285 1726853397.02525: done getting next task for host managed_node1 18285 1726853397.02526: ^ task is: None 18285 1726853397.02528: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.02529: done queuing things up, now waiting for results queue to drain 18285 1726853397.02530: results queue empty 18285 1726853397.02530: checking for any_errors_fatal 18285 1726853397.02531: done checking for any_errors_fatal 18285 1726853397.02532: checking for max_fail_percentage 18285 1726853397.02532: done checking for max_fail_percentage 18285 1726853397.02533: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.02534: done checking to see if all hosts have failed 18285 1726853397.02536: getting the next task for host managed_node1 18285 1726853397.02538: done getting next task for host managed_node1 18285 1726853397.02538: ^ task is: None 18285 1726853397.02540: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.02805: in VariableManager get_vars() 18285 1726853397.02818: done with get_vars() 18285 1726853397.02824: in VariableManager get_vars() 18285 1726853397.02832: done with get_vars() 18285 1726853397.02836: variable 'omit' from source: magic vars 18285 1726853397.02869: in VariableManager get_vars() 18285 1726853397.02880: done with get_vars() 18285 1726853397.02899: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 18285 1726853397.03274: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853397.03303: getting the remaining hosts for this loop 18285 1726853397.03304: done getting the remaining hosts for this loop 18285 1726853397.03307: getting the next task for host managed_node1 18285 1726853397.03310: done getting next task for host managed_node1 18285 1726853397.03312: ^ task is: TASK: Gathering Facts 18285 1726853397.03314: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.03316: getting variables 18285 1726853397.03317: in VariableManager get_vars() 18285 1726853397.03325: Calling all_inventory to load vars for managed_node1 18285 1726853397.03327: Calling groups_inventory to load vars for managed_node1 18285 1726853397.03329: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.03334: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.03347: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.03354: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.03729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.04109: done with get_vars() 18285 1726853397.04116: done getting variables 18285 1726853397.04160: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 13:29:57 -0400 (0:00:00.071) 0:00:02.978 ****** 18285 1726853397.04186: entering _queue_task() for managed_node1/gather_facts 18285 1726853397.04913: worker is 1 (out of 1 available) 18285 1726853397.04924: exiting _queue_task() for managed_node1/gather_facts 18285 1726853397.04934: done queuing things up, now waiting for results queue to drain 18285 1726853397.04935: waiting for pending results... 
18285 1726853397.05476: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853397.05787: in run() - task 02083763-bbaf-9200-7ca6-0000000000d8 18285 1726853397.05799: variable 'ansible_search_path' from source: unknown 18285 1726853397.05838: calling self._execute() 18285 1726853397.06052: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.06056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.06218: variable 'omit' from source: magic vars 18285 1726853397.07474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.13772: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.13839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.14094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.14098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.14101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.14253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.14344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.14448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853397.14498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.14545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.14832: variable 'ansible_distribution' from source: facts 18285 1726853397.14846: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.14874: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.15076: when evaluation is False, skipping this task 18285 1726853397.15079: _execute() done 18285 1726853397.15081: dumping result to json 18285 1726853397.15083: done dumping result, returning 18285 1726853397.15085: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-0000000000d8] 18285 1726853397.15087: sending task result for task 02083763-bbaf-9200-7ca6-0000000000d8 18285 1726853397.15156: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000d8 18285 1726853397.15159: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.15209: no more pending results, returning what we have 18285 1726853397.15212: results queue empty 18285 1726853397.15213: checking for any_errors_fatal 18285 1726853397.15215: done checking for any_errors_fatal 18285 1726853397.15215: checking for max_fail_percentage 18285 1726853397.15217: done checking for max_fail_percentage 18285 1726853397.15217: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853397.15218: done checking to see if all hosts have failed 18285 1726853397.15219: getting the remaining hosts for this loop 18285 1726853397.15220: done getting the remaining hosts for this loop 18285 1726853397.15224: getting the next task for host managed_node1 18285 1726853397.15231: done getting next task for host managed_node1 18285 1726853397.15232: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.15234: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.15239: getting variables 18285 1726853397.15240: in VariableManager get_vars() 18285 1726853397.15274: Calling all_inventory to load vars for managed_node1 18285 1726853397.15277: Calling groups_inventory to load vars for managed_node1 18285 1726853397.15281: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.15294: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.15297: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.15300: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.15879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.16336: done with get_vars() 18285 1726853397.16347: done getting variables 18285 1726853397.16416: in VariableManager get_vars() 18285 1726853397.16425: Calling all_inventory to load vars for managed_node1 18285 1726853397.16427: Calling groups_inventory to load vars for managed_node1 18285 1726853397.16430: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.16434: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.16436: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853397.16438: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.16807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.17290: done with get_vars() 18285 1726853397.17303: done queuing things up, now waiting for results queue to drain 18285 1726853397.17305: results queue empty 18285 1726853397.17306: checking for any_errors_fatal 18285 1726853397.17308: done checking for any_errors_fatal 18285 1726853397.17309: checking for max_fail_percentage 18285 1726853397.17310: done checking for max_fail_percentage 18285 1726853397.17310: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.17311: done checking to see if all hosts have failed 18285 1726853397.17312: getting the remaining hosts for this loop 18285 1726853397.17313: done getting the remaining hosts for this loop 18285 1726853397.17315: getting the next task for host managed_node1 18285 1726853397.17318: done getting next task for host managed_node1 18285 1726853397.17321: ^ task is: TASK: Show inside ethernet tests 18285 1726853397.17322: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.17324: getting variables 18285 1726853397.17325: in VariableManager get_vars() 18285 1726853397.17332: Calling all_inventory to load vars for managed_node1 18285 1726853397.17334: Calling groups_inventory to load vars for managed_node1 18285 1726853397.17336: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.17344: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.17346: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.17352: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.17655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.17941: done with get_vars() 18285 1726853397.17951: done getting variables 18285 1726853397.18021: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 13:29:57 -0400 (0:00:00.138) 0:00:03.116 ****** 18285 1726853397.18046: entering _queue_task() for managed_node1/debug 18285 1726853397.18048: Creating lock for debug 18285 1726853397.18734: worker is 1 (out of 1 available) 18285 1726853397.18746: exiting _queue_task() for managed_node1/debug 18285 1726853397.18760: done queuing things up, now waiting for results queue to drain 18285 1726853397.18762: waiting for pending results... 
18285 1726853397.19127: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 18285 1726853397.19205: in run() - task 02083763-bbaf-9200-7ca6-00000000000b 18285 1726853397.19210: variable 'ansible_search_path' from source: unknown 18285 1726853397.19224: calling self._execute() 18285 1726853397.19325: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.19336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.19423: variable 'omit' from source: magic vars 18285 1726853397.20451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.24774: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.24858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.24910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.24957: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.24994: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.25174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.25179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.25181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853397.25183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.25204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.25350: variable 'ansible_distribution' from source: facts 18285 1726853397.25362: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.25392: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.25400: when evaluation is False, skipping this task 18285 1726853397.25407: _execute() done 18285 1726853397.25413: dumping result to json 18285 1726853397.25420: done dumping result, returning 18285 1726853397.25432: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [02083763-bbaf-9200-7ca6-00000000000b] 18285 1726853397.25442: sending task result for task 02083763-bbaf-9200-7ca6-00000000000b 18285 1726853397.25681: done sending task result for task 02083763-bbaf-9200-7ca6-00000000000b 18285 1726853397.25688: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853397.25738: no more pending results, returning what we have 18285 1726853397.25742: results queue empty 18285 1726853397.25743: checking for any_errors_fatal 18285 1726853397.25745: done checking for any_errors_fatal 18285 1726853397.25745: checking for max_fail_percentage 18285 1726853397.25752: done checking for max_fail_percentage 18285 1726853397.25752: checking to see if all hosts have failed and the running result is not ok 18285 
1726853397.25754: done checking to see if all hosts have failed 18285 1726853397.25754: getting the remaining hosts for this loop 18285 1726853397.25756: done getting the remaining hosts for this loop 18285 1726853397.25760: getting the next task for host managed_node1 18285 1726853397.25766: done getting next task for host managed_node1 18285 1726853397.25769: ^ task is: TASK: Show network_provider 18285 1726853397.25774: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.25778: getting variables 18285 1726853397.25780: in VariableManager get_vars() 18285 1726853397.25812: Calling all_inventory to load vars for managed_node1 18285 1726853397.25815: Calling groups_inventory to load vars for managed_node1 18285 1726853397.25825: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.25839: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.25842: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.25845: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.26593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.27144: done with get_vars() 18285 1726853397.27156: done getting variables 18285 1726853397.27215: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 13:29:57 -0400 (0:00:00.091) 0:00:03.208 ****** 18285 1726853397.27245: entering _queue_task() for managed_node1/debug 18285 1726853397.28190: worker is 1 (out of 1 available) 18285 1726853397.28198: exiting _queue_task() for managed_node1/debug 18285 1726853397.28207: done queuing things up, now waiting for results queue to drain 18285 1726853397.28209: waiting for pending results... 18285 1726853397.28289: running TaskExecutor() for managed_node1/TASK: Show network_provider 18285 1726853397.28380: in run() - task 02083763-bbaf-9200-7ca6-00000000000c 18285 1726853397.28385: variable 'ansible_search_path' from source: unknown 18285 1726853397.28415: calling self._execute() 18285 1726853397.28496: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.28544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.28549: variable 'omit' from source: magic vars 18285 1726853397.29065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.31631: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.31714: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.31777: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.31797: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.31889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.31931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.31966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.32005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853397.32053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.32075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.32217: variable 'ansible_distribution' from source: facts 18285 1726853397.32230: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.32257: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.32265: when evaluation is False, skipping this task 18285 1726853397.32273: _execute() done 18285 1726853397.32352: dumping result to json 18285 1726853397.32355: done dumping result, returning 18285 1726853397.32357: done running TaskExecutor() for managed_node1/TASK: Show network_provider [02083763-bbaf-9200-7ca6-00000000000c] 18285 1726853397.32359: sending task result for task 02083763-bbaf-9200-7ca6-00000000000c 18285 1726853397.32424: done sending task result for task 02083763-bbaf-9200-7ca6-00000000000c 18285 1726853397.32427: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853397.32476: no more pending results, returning what we have 18285 1726853397.32481: results queue empty 18285 1726853397.32482: checking for any_errors_fatal 18285 1726853397.32487: done checking for any_errors_fatal 18285 1726853397.32488: checking for max_fail_percentage 18285 1726853397.32490: done checking for max_fail_percentage 18285 1726853397.32490: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.32491: done checking to see if all hosts have failed 18285 1726853397.32492: getting the remaining hosts for this loop 18285 1726853397.32494: done getting the remaining hosts for this loop 18285 1726853397.32498: getting the next task for host managed_node1 18285 1726853397.32504: done getting next task for host managed_node1 18285 1726853397.32506: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.32508: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.32512: getting variables 18285 1726853397.32514: in VariableManager get_vars() 18285 1726853397.32542: Calling all_inventory to load vars for managed_node1 18285 1726853397.32544: Calling groups_inventory to load vars for managed_node1 18285 1726853397.32548: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.32560: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.32563: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.32566: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.33058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.33251: done with get_vars() 18285 1726853397.33261: done getting variables 18285 1726853397.33376: in VariableManager get_vars() 18285 1726853397.33385: Calling all_inventory to load vars for managed_node1 18285 1726853397.33387: Calling groups_inventory to load vars for managed_node1 18285 1726853397.33389: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.33394: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.33396: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.33399: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.33529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.33715: done with get_vars() 18285 1726853397.33727: done queuing things up, now waiting for results queue to drain 18285 1726853397.33729: results queue empty 18285 1726853397.33730: checking for any_errors_fatal 18285 1726853397.33732: done checking for any_errors_fatal 18285 1726853397.33733: checking for max_fail_percentage 18285 1726853397.33734: done checking for max_fail_percentage 18285 1726853397.33734: checking to see if all hosts have failed and the running result is not 
ok 18285 1726853397.33735: done checking to see if all hosts have failed 18285 1726853397.33736: getting the remaining hosts for this loop 18285 1726853397.33737: done getting the remaining hosts for this loop 18285 1726853397.33739: getting the next task for host managed_node1 18285 1726853397.33742: done getting next task for host managed_node1 18285 1726853397.33744: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.33746: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.33748: getting variables 18285 1726853397.33749: in VariableManager get_vars() 18285 1726853397.33756: Calling all_inventory to load vars for managed_node1 18285 1726853397.33758: Calling groups_inventory to load vars for managed_node1 18285 1726853397.33761: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.33765: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.33774: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.33777: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.33925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.34098: done with get_vars() 18285 1726853397.34105: done getting variables 18285 1726853397.34146: in VariableManager get_vars() 18285 1726853397.34153: Calling all_inventory to load vars for managed_node1 18285 1726853397.34155: Calling groups_inventory to load vars for managed_node1 18285 1726853397.34158: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.34161: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.34164: Calling groups_plugins_inventory to load vars for 
managed_node1 18285 1726853397.34166: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.34297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.34475: done with get_vars() 18285 1726853397.34485: done queuing things up, now waiting for results queue to drain 18285 1726853397.34487: results queue empty 18285 1726853397.34487: checking for any_errors_fatal 18285 1726853397.34488: done checking for any_errors_fatal 18285 1726853397.34489: checking for max_fail_percentage 18285 1726853397.34490: done checking for max_fail_percentage 18285 1726853397.34491: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.34491: done checking to see if all hosts have failed 18285 1726853397.34492: getting the remaining hosts for this loop 18285 1726853397.34493: done getting the remaining hosts for this loop 18285 1726853397.34495: getting the next task for host managed_node1 18285 1726853397.34498: done getting next task for host managed_node1 18285 1726853397.34498: ^ task is: None 18285 1726853397.34500: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.34501: done queuing things up, now waiting for results queue to drain 18285 1726853397.34502: results queue empty 18285 1726853397.34502: checking for any_errors_fatal 18285 1726853397.34503: done checking for any_errors_fatal 18285 1726853397.34504: checking for max_fail_percentage 18285 1726853397.34505: done checking for max_fail_percentage 18285 1726853397.34505: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.34506: done checking to see if all hosts have failed 18285 1726853397.34507: getting the next task for host managed_node1 18285 1726853397.34509: done getting next task for host managed_node1 18285 1726853397.34510: ^ task is: None 18285 1726853397.34511: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.34544: in VariableManager get_vars() 18285 1726853397.34557: done with get_vars() 18285 1726853397.34562: in VariableManager get_vars() 18285 1726853397.34570: done with get_vars() 18285 1726853397.34576: variable 'omit' from source: magic vars 18285 1726853397.34602: in VariableManager get_vars() 18285 1726853397.34612: done with get_vars() 18285 1726853397.34631: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18285 1726853397.34802: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853397.34823: getting the remaining hosts for this loop 18285 1726853397.34824: done getting the remaining hosts for this loop 18285 1726853397.34826: getting the next task for host managed_node1 18285 1726853397.34829: done getting next task for host managed_node1 18285 1726853397.34830: ^ task is: TASK: Gathering Facts 18285 1726853397.34831: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.34833: getting variables 18285 1726853397.34834: in VariableManager get_vars() 18285 1726853397.34841: Calling all_inventory to load vars for managed_node1 18285 1726853397.34842: Calling groups_inventory to load vars for managed_node1 18285 1726853397.34844: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.34848: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.34851: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.34853: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.35011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.35182: done with get_vars() 18285 1726853397.35189: done getting variables 18285 1726853397.35223: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 13:29:57 -0400 (0:00:00.079) 0:00:03.288 ****** 18285 1726853397.35245: entering _queue_task() for managed_node1/gather_facts 18285 1726853397.35699: worker is 1 (out of 1 available) 18285 1726853397.35708: exiting _queue_task() for managed_node1/gather_facts 18285 1726853397.35718: done queuing things up, now waiting for results queue to drain 18285 1726853397.35720: waiting for pending results... 
18285 1726853397.35846: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853397.35876: in run() - task 02083763-bbaf-9200-7ca6-0000000000f0 18285 1726853397.35896: variable 'ansible_search_path' from source: unknown 18285 1726853397.35939: calling self._execute() 18285 1726853397.36020: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.36032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.36046: variable 'omit' from source: magic vars 18285 1726853397.36459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.38613: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.38748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.38752: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.38766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.38795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.38887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.38923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.38952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853397.39003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.39024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.39165: variable 'ansible_distribution' from source: facts 18285 1726853397.39182: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.39206: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.39288: when evaluation is False, skipping this task 18285 1726853397.39291: _execute() done 18285 1726853397.39293: dumping result to json 18285 1726853397.39295: done dumping result, returning 18285 1726853397.39297: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-0000000000f0] 18285 1726853397.39299: sending task result for task 02083763-bbaf-9200-7ca6-0000000000f0 18285 1726853397.39359: done sending task result for task 02083763-bbaf-9200-7ca6-0000000000f0 18285 1726853397.39362: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.39439: no more pending results, returning what we have 18285 1726853397.39443: results queue empty 18285 1726853397.39444: checking for any_errors_fatal 18285 1726853397.39445: done checking for any_errors_fatal 18285 1726853397.39446: checking for max_fail_percentage 18285 1726853397.39448: done checking for max_fail_percentage 18285 1726853397.39448: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853397.39449: done checking to see if all hosts have failed 18285 1726853397.39450: getting the remaining hosts for this loop 18285 1726853397.39451: done getting the remaining hosts for this loop 18285 1726853397.39456: getting the next task for host managed_node1 18285 1726853397.39461: done getting next task for host managed_node1 18285 1726853397.39463: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.39465: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.39469: getting variables 18285 1726853397.39470: in VariableManager get_vars() 18285 1726853397.39500: Calling all_inventory to load vars for managed_node1 18285 1726853397.39502: Calling groups_inventory to load vars for managed_node1 18285 1726853397.39506: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.39518: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.39521: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.39524: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.39910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.40101: done with get_vars() 18285 1726853397.40108: done getting variables 18285 1726853397.40157: in VariableManager get_vars() 18285 1726853397.40163: Calling all_inventory to load vars for managed_node1 18285 1726853397.40164: Calling groups_inventory to load vars for managed_node1 18285 1726853397.40166: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.40169: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.40172: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853397.40174: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.40272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.40377: done with get_vars() 18285 1726853397.40385: done queuing things up, now waiting for results queue to drain 18285 1726853397.40386: results queue empty 18285 1726853397.40387: checking for any_errors_fatal 18285 1726853397.40388: done checking for any_errors_fatal 18285 1726853397.40389: checking for max_fail_percentage 18285 1726853397.40389: done checking for max_fail_percentage 18285 1726853397.40390: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.40390: done checking to see if all hosts have failed 18285 1726853397.40390: getting the remaining hosts for this loop 18285 1726853397.40391: done getting the remaining hosts for this loop 18285 1726853397.40393: getting the next task for host managed_node1 18285 1726853397.40395: done getting next task for host managed_node1 18285 1726853397.40396: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18285 1726853397.40397: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.40399: getting variables 18285 1726853397.40399: in VariableManager get_vars() 18285 1726853397.40404: Calling all_inventory to load vars for managed_node1 18285 1726853397.40405: Calling groups_inventory to load vars for managed_node1 18285 1726853397.40407: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.40415: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.40416: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.40418: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.40496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.40613: done with get_vars() 18285 1726853397.40619: done getting variables 18285 1726853397.40647: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18285 1726853397.40743: variable 'type' from source: play vars 18285 1726853397.40747: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 13:29:57 -0400 (0:00:00.055) 0:00:03.344 ****** 18285 1726853397.40776: entering _queue_task() for managed_node1/set_fact 18285 1726853397.40964: worker is 1 (out of 1 available) 18285 1726853397.40978: exiting _queue_task() for managed_node1/set_fact 18285 1726853397.40990: done queuing things up, now waiting for results queue to drain 18285 1726853397.40991: waiting for pending results... 
18285 1726853397.41141: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 18285 1726853397.41202: in run() - task 02083763-bbaf-9200-7ca6-00000000000f 18285 1726853397.41214: variable 'ansible_search_path' from source: unknown 18285 1726853397.41245: calling self._execute() 18285 1726853397.41300: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.41305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.41313: variable 'omit' from source: magic vars 18285 1726853397.41613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.43603: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.43651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.43876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.43879: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.43882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.43885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.43888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.43890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853397.43926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.43945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.44081: variable 'ansible_distribution' from source: facts 18285 1726853397.44094: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.44117: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.44125: when evaluation is False, skipping this task 18285 1726853397.44132: _execute() done 18285 1726853397.44139: dumping result to json 18285 1726853397.44148: done dumping result, returning 18285 1726853397.44161: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [02083763-bbaf-9200-7ca6-00000000000f] 18285 1726853397.44169: sending task result for task 02083763-bbaf-9200-7ca6-00000000000f 18285 1726853397.44269: done sending task result for task 02083763-bbaf-9200-7ca6-00000000000f 18285 1726853397.44291: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.44344: no more pending results, returning what we have 18285 1726853397.44347: results queue empty 18285 1726853397.44348: checking for any_errors_fatal 18285 1726853397.44350: done checking for any_errors_fatal 18285 1726853397.44350: checking for max_fail_percentage 18285 1726853397.44352: done checking for max_fail_percentage 18285 1726853397.44353: checking to see if 
all hosts have failed and the running result is not ok 18285 1726853397.44353: done checking to see if all hosts have failed 18285 1726853397.44354: getting the remaining hosts for this loop 18285 1726853397.44355: done getting the remaining hosts for this loop 18285 1726853397.44359: getting the next task for host managed_node1 18285 1726853397.44364: done getting next task for host managed_node1 18285 1726853397.44367: ^ task is: TASK: Include the task 'show_interfaces.yml' 18285 1726853397.44369: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.44412: getting variables 18285 1726853397.44414: in VariableManager get_vars() 18285 1726853397.44440: Calling all_inventory to load vars for managed_node1 18285 1726853397.44443: Calling groups_inventory to load vars for managed_node1 18285 1726853397.44446: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.44457: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.44460: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.44462: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.44713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.44904: done with get_vars() 18285 1726853397.44914: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 13:29:57 -0400 (0:00:00.042) 0:00:03.386 ****** 18285 1726853397.44991: entering _queue_task() for managed_node1/include_tasks 18285 1726853397.45189: 
worker is 1 (out of 1 available) 18285 1726853397.45202: exiting _queue_task() for managed_node1/include_tasks 18285 1726853397.45214: done queuing things up, now waiting for results queue to drain 18285 1726853397.45215: waiting for pending results... 18285 1726853397.45364: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18285 1726853397.45424: in run() - task 02083763-bbaf-9200-7ca6-000000000010 18285 1726853397.45447: variable 'ansible_search_path' from source: unknown 18285 1726853397.45467: calling self._execute() 18285 1726853397.45521: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.45524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.45532: variable 'omit' from source: magic vars 18285 1726853397.45886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.47579: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.47897: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.47931: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.47960: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.47982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.48041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.48064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.48084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853397.48112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.48122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.48217: variable 'ansible_distribution' from source: facts 18285 1726853397.48221: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.48236: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.48239: when evaluation is False, skipping this task 18285 1726853397.48241: _execute() done 18285 1726853397.48244: dumping result to json 18285 1726853397.48248: done dumping result, returning 18285 1726853397.48258: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-9200-7ca6-000000000010] 18285 1726853397.48260: sending task result for task 02083763-bbaf-9200-7ca6-000000000010 18285 1726853397.48345: done sending task result for task 02083763-bbaf-9200-7ca6-000000000010 18285 1726853397.48347: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.48417: no more pending results, returning what we have 18285 
1726853397.48420: results queue empty 18285 1726853397.48421: checking for any_errors_fatal 18285 1726853397.48426: done checking for any_errors_fatal 18285 1726853397.48427: checking for max_fail_percentage 18285 1726853397.48428: done checking for max_fail_percentage 18285 1726853397.48428: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.48429: done checking to see if all hosts have failed 18285 1726853397.48430: getting the remaining hosts for this loop 18285 1726853397.48431: done getting the remaining hosts for this loop 18285 1726853397.48435: getting the next task for host managed_node1 18285 1726853397.48439: done getting next task for host managed_node1 18285 1726853397.48442: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18285 1726853397.48443: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.48446: getting variables 18285 1726853397.48447: in VariableManager get_vars() 18285 1726853397.48473: Calling all_inventory to load vars for managed_node1 18285 1726853397.48476: Calling groups_inventory to load vars for managed_node1 18285 1726853397.48480: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.48489: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.48491: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.48494: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.48651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.48763: done with get_vars() 18285 1726853397.48769: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 13:29:57 -0400 (0:00:00.038) 0:00:03.424 ****** 18285 1726853397.48831: entering _queue_task() for managed_node1/include_tasks 18285 1726853397.49015: worker is 1 (out of 1 available) 18285 1726853397.49030: exiting _queue_task() for managed_node1/include_tasks 18285 1726853397.49041: done queuing things up, now waiting for results queue to drain 18285 1726853397.49043: waiting for pending results... 
18285 1726853397.49188: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 18285 1726853397.49243: in run() - task 02083763-bbaf-9200-7ca6-000000000011 18285 1726853397.49274: variable 'ansible_search_path' from source: unknown 18285 1726853397.49286: calling self._execute() 18285 1726853397.49339: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.49343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.49353: variable 'omit' from source: magic vars 18285 1726853397.49653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.51275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.51315: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.51342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.51374: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.51393: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.51456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.51475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.51492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853397.51518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.51528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.51623: variable 'ansible_distribution' from source: facts 18285 1726853397.51626: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.51641: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.51644: when evaluation is False, skipping this task 18285 1726853397.51646: _execute() done 18285 1726853397.51652: dumping result to json 18285 1726853397.51654: done dumping result, returning 18285 1726853397.51661: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-9200-7ca6-000000000011] 18285 1726853397.51664: sending task result for task 02083763-bbaf-9200-7ca6-000000000011 18285 1726853397.51743: done sending task result for task 02083763-bbaf-9200-7ca6-000000000011 18285 1726853397.51746: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.51821: no more pending results, returning what we have 18285 1726853397.51825: results queue empty 18285 1726853397.51827: checking for any_errors_fatal 18285 1726853397.51830: done checking for any_errors_fatal 18285 1726853397.51831: checking for max_fail_percentage 18285 
1726853397.51832: done checking for max_fail_percentage 18285 1726853397.51833: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.51834: done checking to see if all hosts have failed 18285 1726853397.51834: getting the remaining hosts for this loop 18285 1726853397.51836: done getting the remaining hosts for this loop 18285 1726853397.51839: getting the next task for host managed_node1 18285 1726853397.51843: done getting next task for host managed_node1 18285 1726853397.51846: ^ task is: TASK: Include the task 'assert_device_present.yml' 18285 1726853397.51848: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.51852: getting variables 18285 1726853397.51854: in VariableManager get_vars() 18285 1726853397.51878: Calling all_inventory to load vars for managed_node1 18285 1726853397.51882: Calling groups_inventory to load vars for managed_node1 18285 1726853397.51885: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.51894: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.51896: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.51898: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.52026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.52146: done with get_vars() 18285 1726853397.52154: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Friday 20 September 2024 13:29:57 -0400 (0:00:00.033) 0:00:03.458 ****** 
18285 1726853397.52226: entering _queue_task() for managed_node1/include_tasks 18285 1726853397.52413: worker is 1 (out of 1 available) 18285 1726853397.52425: exiting _queue_task() for managed_node1/include_tasks 18285 1726853397.52437: done queuing things up, now waiting for results queue to drain 18285 1726853397.52438: waiting for pending results... 18285 1726853397.52588: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 18285 1726853397.52642: in run() - task 02083763-bbaf-9200-7ca6-000000000012 18285 1726853397.52653: variable 'ansible_search_path' from source: unknown 18285 1726853397.52687: calling self._execute() 18285 1726853397.52740: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.52744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.52755: variable 'omit' from source: magic vars 18285 1726853397.53114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.54682: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.54727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.54752: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.54790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.54808: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.54868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.54889: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853397.54906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853397.54933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.54945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.55032: variable 'ansible_distribution' from source: facts 18285 1726853397.55036: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.55053: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.55057: when evaluation is False, skipping this task 18285 1726853397.55060: _execute() done 18285 1726853397.55063: dumping result to json 18285 1726853397.55065: done dumping result, returning 18285 1726853397.55077: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-9200-7ca6-000000000012] 18285 1726853397.55080: sending task result for task 02083763-bbaf-9200-7ca6-000000000012 18285 1726853397.55152: done sending task result for task 02083763-bbaf-9200-7ca6-000000000012 18285 1726853397.55157: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 
1726853397.55214: no more pending results, returning what we have 18285 1726853397.55217: results queue empty 18285 1726853397.55218: checking for any_errors_fatal 18285 1726853397.55221: done checking for any_errors_fatal 18285 1726853397.55222: checking for max_fail_percentage 18285 1726853397.55223: done checking for max_fail_percentage 18285 1726853397.55224: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.55225: done checking to see if all hosts have failed 18285 1726853397.55226: getting the remaining hosts for this loop 18285 1726853397.55227: done getting the remaining hosts for this loop 18285 1726853397.55230: getting the next task for host managed_node1 18285 1726853397.55236: done getting next task for host managed_node1 18285 1726853397.55238: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.55240: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.55242: getting variables 18285 1726853397.55244: in VariableManager get_vars() 18285 1726853397.55269: Calling all_inventory to load vars for managed_node1 18285 1726853397.55273: Calling groups_inventory to load vars for managed_node1 18285 1726853397.55276: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.55286: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.55288: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.55291: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.55444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.55555: done with get_vars() 18285 1726853397.55561: done getting variables 18285 1726853397.55611: in VariableManager get_vars() 18285 1726853397.55617: Calling all_inventory to load vars for managed_node1 18285 1726853397.55618: Calling groups_inventory to load vars for managed_node1 18285 1726853397.55619: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.55622: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.55623: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.55625: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.55706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.55810: done with get_vars() 18285 1726853397.55818: done queuing things up, now waiting for results queue to drain 18285 1726853397.55819: results queue empty 18285 1726853397.55819: checking for any_errors_fatal 18285 1726853397.55820: done checking for any_errors_fatal 18285 1726853397.55821: checking for max_fail_percentage 18285 1726853397.55822: done checking for max_fail_percentage 18285 1726853397.55822: checking to see if all hosts have failed and the running result is not 
ok 18285 1726853397.55823: done checking to see if all hosts have failed 18285 1726853397.55823: getting the remaining hosts for this loop 18285 1726853397.55824: done getting the remaining hosts for this loop 18285 1726853397.55826: getting the next task for host managed_node1 18285 1726853397.55828: done getting next task for host managed_node1 18285 1726853397.55829: ^ task is: TASK: meta (flush_handlers) 18285 1726853397.55830: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853397.55832: getting variables 18285 1726853397.55833: in VariableManager get_vars() 18285 1726853397.55837: Calling all_inventory to load vars for managed_node1 18285 1726853397.55838: Calling groups_inventory to load vars for managed_node1 18285 1726853397.55839: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.55846: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.55848: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.55850: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.55941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.56044: done with get_vars() 18285 1726853397.56049: done getting variables 18285 1726853397.56078: in VariableManager get_vars() 18285 1726853397.56084: Calling all_inventory to load vars for managed_node1 18285 1726853397.56085: Calling groups_inventory to load vars for managed_node1 18285 1726853397.56086: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.56089: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.56090: Calling groups_plugins_inventory to load vars for 
managed_node1 18285 1726853397.56092: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.56170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.56275: done with get_vars() 18285 1726853397.56282: done queuing things up, now waiting for results queue to drain 18285 1726853397.56283: results queue empty 18285 1726853397.56284: checking for any_errors_fatal 18285 1726853397.56284: done checking for any_errors_fatal 18285 1726853397.56285: checking for max_fail_percentage 18285 1726853397.56285: done checking for max_fail_percentage 18285 1726853397.56286: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.56286: done checking to see if all hosts have failed 18285 1726853397.56287: getting the remaining hosts for this loop 18285 1726853397.56287: done getting the remaining hosts for this loop 18285 1726853397.56289: getting the next task for host managed_node1 18285 1726853397.56290: done getting next task for host managed_node1 18285 1726853397.56291: ^ task is: None 18285 1726853397.56292: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.56293: done queuing things up, now waiting for results queue to drain 18285 1726853397.56293: results queue empty 18285 1726853397.56293: checking for any_errors_fatal 18285 1726853397.56294: done checking for any_errors_fatal 18285 1726853397.56294: checking for max_fail_percentage 18285 1726853397.56295: done checking for max_fail_percentage 18285 1726853397.56295: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.56296: done checking to see if all hosts have failed 18285 1726853397.56296: getting the next task for host managed_node1 18285 1726853397.56297: done getting next task for host managed_node1 18285 1726853397.56298: ^ task is: None 18285 1726853397.56299: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.56323: in VariableManager get_vars() 18285 1726853397.56340: done with get_vars() 18285 1726853397.56343: in VariableManager get_vars() 18285 1726853397.56351: done with get_vars() 18285 1726853397.56354: variable 'omit' from source: magic vars 18285 1726853397.56374: in VariableManager get_vars() 18285 1726853397.56382: done with get_vars() 18285 1726853397.56395: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18285 1726853397.56887: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853397.56905: getting the remaining hosts for this loop 18285 1726853397.56906: done getting the remaining hosts for this loop 18285 1726853397.56908: getting the next task for host managed_node1 18285 1726853397.56909: done getting next task for host managed_node1 18285 1726853397.56911: ^ task is: TASK: Gathering Facts 18285 1726853397.56911: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.56913: getting variables 18285 1726853397.56913: in VariableManager get_vars() 18285 1726853397.56920: Calling all_inventory to load vars for managed_node1 18285 1726853397.56921: Calling groups_inventory to load vars for managed_node1 18285 1726853397.56922: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.56925: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.56927: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.56928: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.57009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.57112: done with get_vars() 18285 1726853397.57117: done getting variables 18285 1726853397.57141: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 13:29:57 -0400 (0:00:00.049) 0:00:03.507 ****** 18285 1726853397.57157: entering _queue_task() for managed_node1/gather_facts 18285 1726853397.57342: worker is 1 (out of 1 available) 18285 1726853397.57353: exiting _queue_task() for managed_node1/gather_facts 18285 1726853397.57365: done queuing things up, now waiting for results queue to drain 18285 1726853397.57366: waiting for pending results... 
18285 1726853397.57513: running TaskExecutor() for managed_node1/TASK: Gathering Facts
18285 1726853397.57573: in run() - task 02083763-bbaf-9200-7ca6-00000000010e
18285 1726853397.57585: variable 'ansible_search_path' from source: unknown
18285 1726853397.57616: calling self._execute()
18285 1726853397.57670: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853397.57675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853397.57683: variable 'omit' from source: magic vars
18285 1726853397.57985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853397.59466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853397.59516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853397.59543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853397.59575: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853397.59607: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853397.59675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853397.59697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853397.59714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853397.59739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853397.59749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853397.59856: variable 'ansible_distribution' from source: facts
18285 1726853397.59859: variable 'ansible_distribution_major_version' from source: facts
18285 1726853397.59874: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853397.59878: when evaluation is False, skipping this task
18285 1726853397.59884: _execute() done
18285 1726853397.59887: dumping result to json
18285 1726853397.59890: done dumping result, returning
18285 1726853397.59900: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-00000000010e]
18285 1726853397.59902: sending task result for task 02083763-bbaf-9200-7ca6-00000000010e
18285 1726853397.59978: done sending task result for task 02083763-bbaf-9200-7ca6-00000000010e
18285 1726853397.59981: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853397.60056: no more pending results, returning what we have
18285 1726853397.60060: results queue empty
18285 1726853397.60061: checking for any_errors_fatal
18285 1726853397.60062: done checking for any_errors_fatal
18285 1726853397.60063: checking for max_fail_percentage
18285 1726853397.60065: done checking for max_fail_percentage
18285 1726853397.60065: checking to see if all hosts have failed and the running result is not ok
18285 1726853397.60066: done checking to see if all hosts have failed
18285 1726853397.60067: getting the remaining hosts for this loop
18285 1726853397.60068: done getting the remaining hosts for this loop
18285 1726853397.60073: getting the next task for host managed_node1
18285 1726853397.60078: done getting next task for host managed_node1
18285 1726853397.60080: ^ task is: TASK: meta (flush_handlers)
18285 1726853397.60081: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853397.60085: getting variables
18285 1726853397.60086: in VariableManager get_vars()
18285 1726853397.60121: Calling all_inventory to load vars for managed_node1
18285 1726853397.60123: Calling groups_inventory to load vars for managed_node1
18285 1726853397.60126: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.60135: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.60137: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.60139: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.60302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.60414: done with get_vars()
18285 1726853397.60421: done getting variables
18285 1726853397.60466: in VariableManager get_vars()
18285 1726853397.60475: Calling all_inventory to load vars for managed_node1
18285 1726853397.60477: Calling groups_inventory to load vars for managed_node1
18285 1726853397.60478: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.60481: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.60482: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.60484: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.60565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.60675: done with get_vars()
18285 1726853397.60683: done queuing things up, now waiting for results queue to drain
18285 1726853397.60685: results queue empty
18285 1726853397.60685: checking for any_errors_fatal
18285 1726853397.60686: done checking for any_errors_fatal
18285 1726853397.60687: checking for max_fail_percentage
18285 1726853397.60687: done checking for max_fail_percentage
18285 1726853397.60688: checking to see if all hosts have failed and the running result is not ok
18285 1726853397.60688: done checking to see if all hosts have failed
18285 1726853397.60689: getting the remaining hosts for this loop
18285 1726853397.60689: done getting the remaining hosts for this loop
18285 1726853397.60691: getting the next task for host managed_node1
18285 1726853397.60693: done getting next task for host managed_node1
18285 1726853397.60695: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
18285 1726853397.60696: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853397.60702: getting variables
18285 1726853397.60703: in VariableManager get_vars()
18285 1726853397.60710: Calling all_inventory to load vars for managed_node1
18285 1726853397.60712: Calling groups_inventory to load vars for managed_node1
18285 1726853397.60713: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.60722: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.60724: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.60725: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.60818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.60925: done with get_vars()
18285 1726853397.60931: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:29:57 -0400 (0:00:00.038) 0:00:03.546 ******
18285 1726853397.60981: entering _queue_task() for managed_node1/include_tasks
18285 1726853397.61175: worker is 1 (out of 1 available)
18285 1726853397.61189: exiting _queue_task() for managed_node1/include_tasks
18285 1726853397.61200: done queuing things up, now waiting for results queue to drain
18285 1726853397.61202: waiting for pending results...
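Every skip in this trace reports the same `false_condition`. As an illustration only (not code from this run), the logged conditional, which Ansible actually evaluates as a Jinja2 expression against gathered facts, can be rendered in plain Python to show why each guarded task is skipped on this managed node:

```python
# Illustration only: a plain-Python rendering of the guard the log shows being
# evaluated for each task:
#   (ansible_distribution in ['CentOS','RedHat'] and
#    ansible_distribution_major_version | int < 9)
# Ansible evaluates the real thing as a Jinja2 expression, not via this helper.

def when_guard(facts: dict) -> bool:
    """Return True when the guarded task should run, mirroring the logged conditional."""
    return (
        facts["ansible_distribution"] in ("CentOS", "RedHat")
        and int(facts["ansible_distribution_major_version"]) < 9
    )

# Example fact sets (hypothetical values; the node's real facts are not in this log):
print(when_guard({"ansible_distribution": "CentOS",
                  "ansible_distribution_major_version": "8"}))   # True - task would run
print(when_guard({"ansible_distribution": "Fedora",
                  "ansible_distribution_major_version": "40"}))  # False - task skips
```

Since the log records the evaluation as `False`, the managed node here is evidently not CentOS/RHEL below version 9, so each guarded task ends with `skip_reason: "Conditional result was False"`.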
18285 1726853397.61355: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
18285 1726853397.61414: in run() - task 02083763-bbaf-9200-7ca6-000000000019
18285 1726853397.61438: variable 'ansible_search_path' from source: unknown
18285 1726853397.61441: variable 'ansible_search_path' from source: unknown
18285 1726853397.61463: calling self._execute()
18285 1726853397.61524: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853397.61528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853397.61551: variable 'omit' from source: magic vars
18285 1726853397.62176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853397.63690: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853397.63732: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853397.63761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853397.63790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853397.63810: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853397.63868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853397.63897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853397.63913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853397.63938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853397.63948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853397.64044: variable 'ansible_distribution' from source: facts
18285 1726853397.64047: variable 'ansible_distribution_major_version' from source: facts
18285 1726853397.64064: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853397.64067: when evaluation is False, skipping this task
18285 1726853397.64070: _execute() done
18285 1726853397.64074: dumping result to json
18285 1726853397.64078: done dumping result, returning
18285 1726853397.64085: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-9200-7ca6-000000000019]
18285 1726853397.64089: sending task result for task 02083763-bbaf-9200-7ca6-000000000019
18285 1726853397.64177: done sending task result for task 02083763-bbaf-9200-7ca6-000000000019
18285 1726853397.64180: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853397.64248: no more pending results, returning what we have
18285 1726853397.64251: results queue empty
18285 1726853397.64252: checking for any_errors_fatal
18285 1726853397.64254: done checking for any_errors_fatal
18285 1726853397.64255: checking for max_fail_percentage
18285 1726853397.64256: done checking for max_fail_percentage
18285 1726853397.64257: checking to see if all hosts have failed and the running result is not ok
18285 1726853397.64258: done checking to see if all hosts have failed
18285 1726853397.64258: getting the remaining hosts for this loop
18285 1726853397.64260: done getting the remaining hosts for this loop
18285 1726853397.64263: getting the next task for host managed_node1
18285 1726853397.64268: done getting next task for host managed_node1
18285 1726853397.64272: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
18285 1726853397.64274: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853397.64289: getting variables
18285 1726853397.64291: in VariableManager get_vars()
18285 1726853397.64321: Calling all_inventory to load vars for managed_node1
18285 1726853397.64323: Calling groups_inventory to load vars for managed_node1
18285 1726853397.64325: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.64333: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.64335: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.64338: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.64461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.64638: done with get_vars()
18285 1726853397.64651: done getting variables
18285 1726853397.64757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 13:29:57 -0400 (0:00:00.038) 0:00:03.585 ******
18285 1726853397.64875: entering _queue_task() for managed_node1/debug
18285 1726853397.65299: worker is 1 (out of 1 available)
18285 1726853397.65308: exiting _queue_task() for managed_node1/debug
18285 1726853397.65317: done queuing things up, now waiting for results queue to drain
18285 1726853397.65319: waiting for pending results...
18285 1726853397.65559: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
18285 1726853397.65563: in run() - task 02083763-bbaf-9200-7ca6-00000000001a
18285 1726853397.65567: variable 'ansible_search_path' from source: unknown
18285 1726853397.65569: variable 'ansible_search_path' from source: unknown
18285 1726853397.65574: calling self._execute()
18285 1726853397.65626: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853397.65637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853397.65662: variable 'omit' from source: magic vars
18285 1726853397.66093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853397.68416: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853397.68493: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853397.68545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853397.68595: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853397.68623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853397.68712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853397.68741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853397.68769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853397.68816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853397.68833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853397.68975: variable 'ansible_distribution' from source: facts
18285 1726853397.68988: variable 'ansible_distribution_major_version' from source: facts
18285 1726853397.69011: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853397.69027: when evaluation is False, skipping this task
18285 1726853397.69036: _execute() done
18285 1726853397.69043: dumping result to json
18285 1726853397.69055: done dumping result, returning
18285 1726853397.69067: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-9200-7ca6-00000000001a]
18285 1726853397.69080: sending task result for task 02083763-bbaf-9200-7ca6-00000000001a
18285 1726853397.69284: done sending task result for task 02083763-bbaf-9200-7ca6-00000000001a
18285 1726853397.69287: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
18285 1726853397.69336: no more pending results, returning what we have
18285 1726853397.69339: results queue empty
18285 1726853397.69340: checking for any_errors_fatal
18285 1726853397.69346: done checking for any_errors_fatal
18285 1726853397.69347: checking for max_fail_percentage
18285 1726853397.69352: done checking for max_fail_percentage
18285 1726853397.69353: checking to see if all hosts have failed and the running result is not ok
18285 1726853397.69354: done checking to see if all hosts have failed
18285 1726853397.69354: getting the remaining hosts for this loop
18285 1726853397.69356: done getting the remaining hosts for this loop
18285 1726853397.69360: getting the next task for host managed_node1
18285 1726853397.69366: done getting next task for host managed_node1
18285 1726853397.69370: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
18285 1726853397.69374: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853397.69387: getting variables
18285 1726853397.69389: in VariableManager get_vars()
18285 1726853397.69603: Calling all_inventory to load vars for managed_node1
18285 1726853397.69606: Calling groups_inventory to load vars for managed_node1
18285 1726853397.69608: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.69618: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.69621: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.69624: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.69956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.70168: done with get_vars()
18285 1726853397.70181: done getting variables
18285 1726853397.70276: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 13:29:57 -0400 (0:00:00.054) 0:00:03.639 ******
18285 1726853397.70304: entering _queue_task() for managed_node1/fail
18285 1726853397.70306: Creating lock for fail
18285 1726853397.70690: worker is 1 (out of 1 available)
18285 1726853397.70702: exiting _queue_task() for managed_node1/fail
18285 1726853397.70713: done queuing things up, now waiting for results queue to drain
18285 1726853397.70715: waiting for pending results...
18285 1726853397.70992: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
18285 1726853397.71065: in run() - task 02083763-bbaf-9200-7ca6-00000000001b
18285 1726853397.71094: variable 'ansible_search_path' from source: unknown
18285 1726853397.71102: variable 'ansible_search_path' from source: unknown
18285 1726853397.71141: calling self._execute()
18285 1726853397.71276: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853397.71280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853397.71283: variable 'omit' from source: magic vars
18285 1726853397.71814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853397.74103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853397.74187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853397.74294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853397.74298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853397.74300: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853397.74373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853397.74414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853397.74442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853397.74491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853397.74519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853397.74658: variable 'ansible_distribution' from source: facts
18285 1726853397.74673: variable 'ansible_distribution_major_version' from source: facts
18285 1726853397.74696: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853397.74703: when evaluation is False, skipping this task
18285 1726853397.74712: _execute() done
18285 1726853397.74724: dumping result to json
18285 1726853397.74776: done dumping result, returning
18285 1726853397.74780: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-9200-7ca6-00000000001b]
18285 1726853397.74783: sending task result for task 02083763-bbaf-9200-7ca6-00000000001b
18285 1726853397.75055: done sending task result for task 02083763-bbaf-9200-7ca6-00000000001b
18285 1726853397.75058: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853397.75104: no more pending results, returning what we have
18285 1726853397.75106: results queue empty
18285 1726853397.75107: checking for any_errors_fatal
18285 1726853397.75113: done checking for any_errors_fatal
18285 1726853397.75113: checking for max_fail_percentage
18285 1726853397.75115: done checking for max_fail_percentage
18285 1726853397.75116: checking to see if all hosts have failed and the running result is not ok
18285 1726853397.75116: done checking to see if all hosts have failed
18285 1726853397.75117: getting the remaining hosts for this loop
18285 1726853397.75119: done getting the remaining hosts for this loop
18285 1726853397.75122: getting the next task for host managed_node1
18285 1726853397.75128: done getting next task for host managed_node1
18285 1726853397.75132: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18285 1726853397.75134: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853397.75151: getting variables
18285 1726853397.75153: in VariableManager get_vars()
18285 1726853397.75195: Calling all_inventory to load vars for managed_node1
18285 1726853397.75197: Calling groups_inventory to load vars for managed_node1
18285 1726853397.75200: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.75212: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.75215: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.75218: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.75554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.75803: done with get_vars()
18285 1726853397.75815: done getting variables
18285 1726853397.75884: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:29:57 -0400 (0:00:00.056) 0:00:03.695 ******
18285 1726853397.75915: entering _queue_task() for managed_node1/fail
18285 1726853397.76310: worker is 1 (out of 1 available)
18285 1726853397.76321: exiting _queue_task() for managed_node1/fail
18285 1726853397.76334: done queuing things up, now waiting for results queue to drain
18285 1726853397.76335: waiting for pending results...
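The "Abort applying ..." entries in this trace run the `fail` action module behind a `when:` guard, so on this host they skip rather than abort. A minimal sketch of a task of that shape (the real task at roles/network/tasks/main.yml:18 is not reproduced in this log; the `msg` text is hypothetical, and the `when:` expression is taken verbatim from the logged `false_condition`):

```yaml
# Sketch only - illustrates the guarded-fail pattern seen in the trace.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying the network state configuration is not supported on this host  # hypothetical message
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)
```

When the `when:` expression evaluates to `False`, `fail` never executes and the task result is reported as `skipping:` with `skip_reason: "Conditional result was False"`, exactly as logged above.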
18285 1726853397.76533: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18285 1726853397.76705: in run() - task 02083763-bbaf-9200-7ca6-00000000001c
18285 1726853397.76709: variable 'ansible_search_path' from source: unknown
18285 1726853397.76712: variable 'ansible_search_path' from source: unknown
18285 1726853397.76751: calling self._execute()
18285 1726853397.76881: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853397.76884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853397.76890: variable 'omit' from source: magic vars
18285 1726853397.77385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853397.80601: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853397.80781: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853397.80785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853397.80787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853397.80817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853397.80910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853397.80995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853397.80998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853397.81024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853397.81045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853397.81196: variable 'ansible_distribution' from source: facts
18285 1726853397.81211: variable 'ansible_distribution_major_version' from source: facts
18285 1726853397.81237: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853397.81243: when evaluation is False, skipping this task
18285 1726853397.81253: _execute() done
18285 1726853397.81261: dumping result to json
18285 1726853397.81275: done dumping result, returning
18285 1726853397.81318: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-9200-7ca6-00000000001c]
18285 1726853397.81322: sending task result for task 02083763-bbaf-9200-7ca6-00000000001c
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853397.81595: no more pending results, returning what we have
18285 1726853397.81599: results queue empty
18285 1726853397.81600: checking for any_errors_fatal
18285 1726853397.81607: done checking for any_errors_fatal
18285 1726853397.81608: checking for max_fail_percentage
18285 1726853397.81610: done checking for max_fail_percentage
18285 1726853397.81611: checking to see if all hosts have failed and the running result is not ok
18285 1726853397.81612: done checking to see if all hosts have failed
18285 1726853397.81612: getting the remaining hosts for this loop
18285 1726853397.81614: done getting the remaining hosts for this loop
18285 1726853397.81618: getting the next task for host managed_node1
18285 1726853397.81624: done getting next task for host managed_node1
18285 1726853397.81628: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18285 1726853397.81630: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853397.81643: getting variables
18285 1726853397.81645: in VariableManager get_vars()
18285 1726853397.81686: Calling all_inventory to load vars for managed_node1
18285 1726853397.81689: Calling groups_inventory to load vars for managed_node1
18285 1726853397.81691: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853397.81703: Calling all_plugins_play to load vars for managed_node1
18285 1726853397.81706: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853397.81709: Calling groups_plugins_play to load vars for managed_node1
18285 1726853397.82144: done sending task result for task 02083763-bbaf-9200-7ca6-00000000001c
18285 1726853397.82147: WORKER PROCESS EXITING
18285 1726853397.82173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853397.82381: done with get_vars()
18285 1726853397.82392: done getting variables
18285 1726853397.82452: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:29:57 -0400 (0:00:00.065) 0:00:03.761 ******
18285 1726853397.82483: entering _queue_task() for managed_node1/fail
18285 1726853397.82811: worker is 1 (out of 1 available)
18285 1726853397.82823: exiting _queue_task() for managed_node1/fail
18285 1726853397.82834: done queuing things up, now waiting for results queue to drain
18285 1726853397.82948: waiting for pending results...
18285 1726853397.83064: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18285 1726853397.83187: in run() - task 02083763-bbaf-9200-7ca6-00000000001d 18285 1726853397.83277: variable 'ansible_search_path' from source: unknown 18285 1726853397.83283: variable 'ansible_search_path' from source: unknown 18285 1726853397.83286: calling self._execute() 18285 1726853397.83357: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.83370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.83401: variable 'omit' from source: magic vars 18285 1726853397.84003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.86580: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.86661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.86703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.86752: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.86796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.86890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.86963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 
1726853397.86978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853397.87023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.87072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.87258: variable 'ansible_distribution' from source: facts 18285 1726853397.87261: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.87263: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.87265: when evaluation is False, skipping this task 18285 1726853397.87267: _execute() done 18285 1726853397.87269: dumping result to json 18285 1726853397.87273: done dumping result, returning 18285 1726853397.87292: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-9200-7ca6-00000000001d] 18285 1726853397.87303: sending task result for task 02083763-bbaf-9200-7ca6-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.87456: no more pending results, returning what we have 18285 1726853397.87460: results queue empty 18285 1726853397.87461: checking for any_errors_fatal 18285 1726853397.87466: done checking for any_errors_fatal 18285 1726853397.87467: 
checking for max_fail_percentage 18285 1726853397.87468: done checking for max_fail_percentage 18285 1726853397.87469: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.87470: done checking to see if all hosts have failed 18285 1726853397.87521: getting the remaining hosts for this loop 18285 1726853397.87524: done getting the remaining hosts for this loop 18285 1726853397.87528: getting the next task for host managed_node1 18285 1726853397.87534: done getting next task for host managed_node1 18285 1726853397.87537: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18285 1726853397.87540: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.87555: getting variables 18285 1726853397.87557: in VariableManager get_vars() 18285 1726853397.87597: Calling all_inventory to load vars for managed_node1 18285 1726853397.87600: Calling groups_inventory to load vars for managed_node1 18285 1726853397.87603: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.87615: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.87618: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.87622: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.88119: done sending task result for task 02083763-bbaf-9200-7ca6-00000000001d 18285 1726853397.88123: WORKER PROCESS EXITING 18285 1726853397.88167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.88407: done with get_vars() 18285 1726853397.88453: done getting variables 18285 1726853397.88553: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:29:57 -0400 (0:00:00.060) 0:00:03.822 ****** 18285 1726853397.88583: entering _queue_task() for managed_node1/dnf 18285 1726853397.89012: worker is 1 (out of 1 available) 18285 1726853397.89024: exiting _queue_task() for managed_node1/dnf 18285 1726853397.89036: done queuing things up, now waiting for results queue to drain 18285 1726853397.89038: waiting for pending results... 
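Every task skipped in this run is gated by the same `when:` conditional the log records: `(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)`. A minimal sketch of that evaluation in plain Python follows; the fact values here are hypothetical stand-ins for what `gather_facts` would report on the managed host (Ansible facts report the major version as a string, hence the `| int` cast in the original expression):

```python
# Hypothetical facts; real values come from Ansible's gathered facts
# (ansible_distribution, ansible_distribution_major_version).
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",  # facts store versions as strings
}

def should_run(facts):
    """Mirror of the role's conditional:
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)"""
    return (
        facts["ansible_distribution"] in ["CentOS", "RedHat"]
        and int(facts["ansible_distribution_major_version"]) < 9
    )

print(should_run(facts))  # False -> "when evaluation is False, skipping this task"
```

Since the managed node here is not an EL host below version 9, the expression is False for every one of these guard tasks, which is why the executor skips each of them in turn.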
18285 1726853397.89219: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18285 1726853397.89338: in run() - task 02083763-bbaf-9200-7ca6-00000000001e 18285 1726853397.89363: variable 'ansible_search_path' from source: unknown 18285 1726853397.89394: variable 'ansible_search_path' from source: unknown 18285 1726853397.89532: calling self._execute() 18285 1726853397.89644: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.89660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.89677: variable 'omit' from source: magic vars 18285 1726853397.90220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853397.92548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853397.92637: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853397.92686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853397.92720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853397.92757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853397.92845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853397.92886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18285 1726853397.92915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853397.92968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853397.92987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853397.93277: variable 'ansible_distribution' from source: facts 18285 1726853397.93281: variable 'ansible_distribution_major_version' from source: facts 18285 1726853397.93283: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853397.93285: when evaluation is False, skipping this task 18285 1726853397.93287: _execute() done 18285 1726853397.93289: dumping result to json 18285 1726853397.93376: done dumping result, returning 18285 1726853397.93382: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-00000000001e] 18285 1726853397.93385: sending task result for task 02083763-bbaf-9200-7ca6-00000000001e 18285 1726853397.93462: done sending task result for task 02083763-bbaf-9200-7ca6-00000000001e 18285 1726853397.93465: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853397.93519: no more pending results, returning what 
we have 18285 1726853397.93522: results queue empty 18285 1726853397.93523: checking for any_errors_fatal 18285 1726853397.93529: done checking for any_errors_fatal 18285 1726853397.93529: checking for max_fail_percentage 18285 1726853397.93531: done checking for max_fail_percentage 18285 1726853397.93532: checking to see if all hosts have failed and the running result is not ok 18285 1726853397.93533: done checking to see if all hosts have failed 18285 1726853397.93533: getting the remaining hosts for this loop 18285 1726853397.93534: done getting the remaining hosts for this loop 18285 1726853397.93538: getting the next task for host managed_node1 18285 1726853397.93544: done getting next task for host managed_node1 18285 1726853397.93547: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18285 1726853397.93552: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853397.93565: getting variables 18285 1726853397.93567: in VariableManager get_vars() 18285 1726853397.93606: Calling all_inventory to load vars for managed_node1 18285 1726853397.93609: Calling groups_inventory to load vars for managed_node1 18285 1726853397.93612: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853397.93625: Calling all_plugins_play to load vars for managed_node1 18285 1726853397.93628: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853397.93631: Calling groups_plugins_play to load vars for managed_node1 18285 1726853397.94543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853397.94983: done with get_vars() 18285 1726853397.94995: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18285 1726853397.95183: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:29:57 -0400 (0:00:00.066) 0:00:03.888 ****** 18285 1726853397.95211: entering _queue_task() for managed_node1/yum 18285 1726853397.95212: Creating lock for yum 18285 1726853397.96001: worker is 1 (out of 1 available) 18285 1726853397.96018: exiting _queue_task() for managed_node1/yum 18285 1726853397.96158: done queuing things up, now waiting for results queue to drain 18285 1726853397.96160: waiting for pending results... 
18285 1726853397.96592: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18285 1726853397.96647: in run() - task 02083763-bbaf-9200-7ca6-00000000001f 18285 1726853397.96662: variable 'ansible_search_path' from source: unknown 18285 1726853397.96666: variable 'ansible_search_path' from source: unknown 18285 1726853397.96852: calling self._execute() 18285 1726853397.97018: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853397.97023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853397.97027: variable 'omit' from source: magic vars 18285 1726853397.97987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.02639: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.02960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.02964: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.02986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.03011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.03196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.03222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18285 1726853398.03246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.03392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.03403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.03738: variable 'ansible_distribution' from source: facts 18285 1726853398.03744: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.03762: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.03766: when evaluation is False, skipping this task 18285 1726853398.03768: _execute() done 18285 1726853398.03772: dumping result to json 18285 1726853398.03827: done dumping result, returning 18285 1726853398.03830: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-00000000001f] 18285 1726853398.03833: sending task result for task 02083763-bbaf-9200-7ca6-00000000001f 18285 1726853398.03907: done sending task result for task 02083763-bbaf-9200-7ca6-00000000001f 18285 1726853398.03909: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.03985: no more pending results, returning what 
we have 18285 1726853398.03988: results queue empty 18285 1726853398.03989: checking for any_errors_fatal 18285 1726853398.03995: done checking for any_errors_fatal 18285 1726853398.03996: checking for max_fail_percentage 18285 1726853398.03998: done checking for max_fail_percentage 18285 1726853398.03999: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.03999: done checking to see if all hosts have failed 18285 1726853398.04000: getting the remaining hosts for this loop 18285 1726853398.04001: done getting the remaining hosts for this loop 18285 1726853398.04005: getting the next task for host managed_node1 18285 1726853398.04011: done getting next task for host managed_node1 18285 1726853398.04015: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18285 1726853398.04017: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.04032: getting variables 18285 1726853398.04033: in VariableManager get_vars() 18285 1726853398.04075: Calling all_inventory to load vars for managed_node1 18285 1726853398.04077: Calling groups_inventory to load vars for managed_node1 18285 1726853398.04079: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.04091: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.04093: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.04095: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.04476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.04953: done with get_vars() 18285 1726853398.04964: done getting variables 18285 1726853398.05088: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:29:58 -0400 (0:00:00.099) 0:00:03.987 ****** 18285 1726853398.05119: entering _queue_task() for managed_node1/fail 18285 1726853398.05883: worker is 1 (out of 1 available) 18285 1726853398.06013: exiting _queue_task() for managed_node1/fail 18285 1726853398.06025: done queuing things up, now waiting for results queue to drain 18285 1726853398.06027: waiting for pending results... 
18285 1726853398.06466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18285 1726853398.06524: in run() - task 02083763-bbaf-9200-7ca6-000000000020 18285 1726853398.06537: variable 'ansible_search_path' from source: unknown 18285 1726853398.06541: variable 'ansible_search_path' from source: unknown 18285 1726853398.06627: calling self._execute() 18285 1726853398.06781: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.06786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.06789: variable 'omit' from source: magic vars 18285 1726853398.07702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.11963: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.12036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.12481: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.12485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.12487: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.12490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.12505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.12541: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.12622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.12777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.12928: variable 'ansible_distribution' from source: facts 18285 1726853398.12985: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.13098: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.13106: when evaluation is False, skipping this task 18285 1726853398.13113: _execute() done 18285 1726853398.13120: dumping result to json 18285 1726853398.13128: done dumping result, returning 18285 1726853398.13140: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000020] 18285 1726853398.13149: sending task result for task 02083763-bbaf-9200-7ca6-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.13307: no more pending results, returning what we have 18285 1726853398.13311: results queue empty 18285 1726853398.13311: checking for any_errors_fatal 18285 1726853398.13317: done checking for any_errors_fatal 18285 1726853398.13318: checking for max_fail_percentage 18285 
1726853398.13319: done checking for max_fail_percentage 18285 1726853398.13320: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.13321: done checking to see if all hosts have failed 18285 1726853398.13321: getting the remaining hosts for this loop 18285 1726853398.13323: done getting the remaining hosts for this loop 18285 1726853398.13326: getting the next task for host managed_node1 18285 1726853398.13331: done getting next task for host managed_node1 18285 1726853398.13334: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18285 1726853398.13336: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853398.13347: getting variables 18285 1726853398.13351: in VariableManager get_vars() 18285 1726853398.13392: Calling all_inventory to load vars for managed_node1 18285 1726853398.13395: Calling groups_inventory to load vars for managed_node1 18285 1726853398.13397: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.13409: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.13412: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.13415: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.13849: done sending task result for task 02083763-bbaf-9200-7ca6-000000000020 18285 1726853398.13852: WORKER PROCESS EXITING 18285 1726853398.13875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.14137: done with get_vars() 18285 1726853398.14150: done getting variables 18285 1726853398.14215: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:29:58 -0400 (0:00:00.091) 0:00:04.078 ****** 18285 1726853398.14253: entering _queue_task() for managed_node1/package 18285 1726853398.14679: worker is 1 (out of 1 available) 18285 1726853398.14692: exiting _queue_task() for managed_node1/package 18285 1726853398.14702: done queuing things up, now waiting for results queue to drain 18285 1726853398.14704: waiting for pending results... 18285 1726853398.14887: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18285 1726853398.15005: in run() - task 02083763-bbaf-9200-7ca6-000000000021 18285 1726853398.15038: variable 'ansible_search_path' from source: unknown 18285 1726853398.15041: variable 'ansible_search_path' from source: unknown 18285 1726853398.15096: calling self._execute() 18285 1726853398.15205: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.15208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.15211: variable 'omit' from source: magic vars 18285 1726853398.15756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.18875: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.18949: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.19030: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.19049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.19083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.19248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.19251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.19254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.19283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.19305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.19450: variable 'ansible_distribution' from source: facts 18285 1726853398.19468: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.19495: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.19503: when evaluation is False, skipping this task 18285 1726853398.19512: _execute() done 18285 1726853398.19547: dumping result to json 18285 1726853398.19559: done dumping result, returning 18285 
1726853398.19580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-9200-7ca6-000000000021] 18285 1726853398.19589: sending task result for task 02083763-bbaf-9200-7ca6-000000000021 18285 1726853398.19977: done sending task result for task 02083763-bbaf-9200-7ca6-000000000021 18285 1726853398.19982: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.20051: no more pending results, returning what we have 18285 1726853398.20055: results queue empty 18285 1726853398.20056: checking for any_errors_fatal 18285 1726853398.20062: done checking for any_errors_fatal 18285 1726853398.20063: checking for max_fail_percentage 18285 1726853398.20065: done checking for max_fail_percentage 18285 1726853398.20066: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.20067: done checking to see if all hosts have failed 18285 1726853398.20068: getting the remaining hosts for this loop 18285 1726853398.20069: done getting the remaining hosts for this loop 18285 1726853398.20279: getting the next task for host managed_node1 18285 1726853398.20286: done getting next task for host managed_node1 18285 1726853398.20290: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18285 1726853398.20292: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.20306: getting variables 18285 1726853398.20308: in VariableManager get_vars() 18285 1726853398.20348: Calling all_inventory to load vars for managed_node1 18285 1726853398.20350: Calling groups_inventory to load vars for managed_node1 18285 1726853398.20353: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.20365: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.20368: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.20618: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.21073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.21505: done with get_vars() 18285 1726853398.21515: done getting variables 18285 1726853398.21574: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:29:58 -0400 (0:00:00.073) 0:00:04.153 ****** 18285 1726853398.21681: entering _queue_task() for managed_node1/package 18285 1726853398.22273: worker is 1 (out of 1 available) 18285 1726853398.22286: exiting _queue_task() for managed_node1/package 18285 1726853398.22298: done queuing things up, now waiting for results queue to drain 18285 1726853398.22299: waiting for pending results... 
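Aside: every gated task in this run logs the same `Evaluated conditional (...): False` line before skipping. A minimal Python sketch of that conditional (not Ansible's real Jinja evaluator; the function name and fact dict are illustrative) shows why it is False on this managed node:

```python
# Sketch of the role's gating conditional as seen in the log:
#   (ansible_distribution in ['CentOS','RedHat'] and
#    ansible_distribution_major_version | int < 9)
def needs_legacy_packages(facts):
    """Return True when the EL<9 package/service tasks should run."""
    dist = facts.get("ansible_distribution", "")
    major = int(facts.get("ansible_distribution_major_version", 0))
    return dist in ("CentOS", "RedHat") and major < 9

# A CentOS Stream 10 host fails the "< 9" half, so the task is skipped.
print(needs_legacy_packages({"ansible_distribution": "CentOS",
                             "ansible_distribution_major_version": "10"}))  # False
```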
18285 1726853398.22759: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18285 1726853398.22929: in run() - task 02083763-bbaf-9200-7ca6-000000000022 18285 1726853398.22944: variable 'ansible_search_path' from source: unknown 18285 1726853398.22947: variable 'ansible_search_path' from source: unknown 18285 1726853398.23022: calling self._execute() 18285 1726853398.23132: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.23138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.23147: variable 'omit' from source: magic vars 18285 1726853398.24185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.29182: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.29195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.29361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.29395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.29421: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.29624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.29732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.29794: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.29835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.29849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.30280: variable 'ansible_distribution' from source: facts 18285 1726853398.30283: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.30285: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.30286: when evaluation is False, skipping this task 18285 1726853398.30288: _execute() done 18285 1726853398.30290: dumping result to json 18285 1726853398.30291: done dumping result, returning 18285 1726853398.30293: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-9200-7ca6-000000000022] 18285 1726853398.30295: sending task result for task 02083763-bbaf-9200-7ca6-000000000022 18285 1726853398.30358: done sending task result for task 02083763-bbaf-9200-7ca6-000000000022 18285 1726853398.30361: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.30416: no more pending results, returning what we have 18285 1726853398.30419: results queue empty 18285 1726853398.30420: checking for 
any_errors_fatal 18285 1726853398.30425: done checking for any_errors_fatal 18285 1726853398.30426: checking for max_fail_percentage 18285 1726853398.30427: done checking for max_fail_percentage 18285 1726853398.30428: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.30429: done checking to see if all hosts have failed 18285 1726853398.30430: getting the remaining hosts for this loop 18285 1726853398.30431: done getting the remaining hosts for this loop 18285 1726853398.30434: getting the next task for host managed_node1 18285 1726853398.30439: done getting next task for host managed_node1 18285 1726853398.30442: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18285 1726853398.30444: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.30456: getting variables 18285 1726853398.30458: in VariableManager get_vars() 18285 1726853398.30493: Calling all_inventory to load vars for managed_node1 18285 1726853398.30496: Calling groups_inventory to load vars for managed_node1 18285 1726853398.30498: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.30508: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.30510: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.30513: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.31090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.31289: done with get_vars() 18285 1726853398.31299: done getting variables 18285 1726853398.31354: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:29:58 -0400 (0:00:00.099) 0:00:04.252 ****** 18285 1726853398.31587: entering _queue_task() for managed_node1/package 18285 1726853398.32070: worker is 1 (out of 1 available) 18285 1726853398.32085: exiting _queue_task() for managed_node1/package 18285 1726853398.32098: done queuing things up, now waiting for results queue to drain 18285 1726853398.32100: waiting for pending results... 
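Aside: after each task the strategy prints a `^ state is: HOST STATE: block=2, task=N, ...` line. The field names below are taken from the log itself; the parser is a hypothetical helper for reading them, not part of Ansible:

```python
# Parse a "HOST STATE: ..." log line into a dict; fields without "="
# (e.g. "tasks child state? (None)") are skipped.
def parse_host_state(line):
    _, _, fields = line.partition("HOST STATE: ")
    state = {}
    for part in fields.split(", "):
        key, _, value = part.partition("=")
        if not value:
            continue
        state[key] = int(value) if value.isdigit() else value
    return state

s = parse_host_state("HOST STATE: block=2, task=11, rescue=0, always=0, "
                     "handlers=0, run_state=1, fail_state=0")
# The task counter is what advances between the repeated skip blocks above.
```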
18285 1726853398.32564: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18285 1726853398.32773: in run() - task 02083763-bbaf-9200-7ca6-000000000023 18285 1726853398.32987: variable 'ansible_search_path' from source: unknown 18285 1726853398.32991: variable 'ansible_search_path' from source: unknown 18285 1726853398.32995: calling self._execute() 18285 1726853398.33104: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.33115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.33128: variable 'omit' from source: magic vars 18285 1726853398.34006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.38552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.38752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.38977: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.38981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.39096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.39184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.39315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.39348: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.39431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.39535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.39793: variable 'ansible_distribution' from source: facts 18285 1726853398.39807: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.39885: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.39895: when evaluation is False, skipping this task 18285 1726853398.39904: _execute() done 18285 1726853398.39917: dumping result to json 18285 1726853398.39940: done dumping result, returning 18285 1726853398.39958: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-9200-7ca6-000000000023] 18285 1726853398.39985: sending task result for task 02083763-bbaf-9200-7ca6-000000000023 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.40145: no more pending results, returning what we have 18285 1726853398.40150: results queue empty 18285 1726853398.40151: checking for any_errors_fatal 18285 1726853398.40157: done checking for any_errors_fatal 18285 1726853398.40158: checking for max_fail_percentage 18285 1726853398.40159: done checking for 
max_fail_percentage 18285 1726853398.40160: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.40161: done checking to see if all hosts have failed 18285 1726853398.40161: getting the remaining hosts for this loop 18285 1726853398.40163: done getting the remaining hosts for this loop 18285 1726853398.40167: getting the next task for host managed_node1 18285 1726853398.40175: done getting next task for host managed_node1 18285 1726853398.40178: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18285 1726853398.40181: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853398.40194: getting variables 18285 1726853398.40196: in VariableManager get_vars() 18285 1726853398.40235: Calling all_inventory to load vars for managed_node1 18285 1726853398.40238: Calling groups_inventory to load vars for managed_node1 18285 1726853398.40241: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.40252: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.40255: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.40258: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.40765: done sending task result for task 02083763-bbaf-9200-7ca6-000000000023 18285 1726853398.40769: WORKER PROCESS EXITING 18285 1726853398.40793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.40983: done with get_vars() 18285 1726853398.40993: done getting variables 18285 1726853398.41084: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:29:58 -0400 (0:00:00.095) 0:00:04.347 ****** 18285 1726853398.41114: entering _queue_task() for managed_node1/service 18285 1726853398.41116: Creating lock for service 18285 1726853398.41358: worker is 1 (out of 1 available) 18285 1726853398.41370: exiting _queue_task() for managed_node1/service 18285 1726853398.41582: done queuing things up, now waiting for results queue to drain 18285 1726853398.41584: waiting for pending results... 18285 1726853398.41633: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18285 1726853398.41760: in run() - task 02083763-bbaf-9200-7ca6-000000000024 18285 1726853398.41790: variable 'ansible_search_path' from source: unknown 18285 1726853398.41799: variable 'ansible_search_path' from source: unknown 18285 1726853398.41843: calling self._execute() 18285 1726853398.41963: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.41977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.41991: variable 'omit' from source: magic vars 18285 1726853398.42573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.45386: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.45431: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.45461: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.45490: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.45510: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.45574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.45596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.45613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.45641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.45653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.45751: variable 'ansible_distribution' from source: facts 18285 1726853398.45759: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.45777: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.45780: when evaluation is False, skipping this task 18285 
1726853398.45783: _execute() done 18285 1726853398.45785: dumping result to json 18285 1726853398.45788: done dumping result, returning 18285 1726853398.45795: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000024] 18285 1726853398.45799: sending task result for task 02083763-bbaf-9200-7ca6-000000000024 18285 1726853398.45890: done sending task result for task 02083763-bbaf-9200-7ca6-000000000024 18285 1726853398.45893: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.45939: no more pending results, returning what we have 18285 1726853398.45943: results queue empty 18285 1726853398.45943: checking for any_errors_fatal 18285 1726853398.45949: done checking for any_errors_fatal 18285 1726853398.45950: checking for max_fail_percentage 18285 1726853398.45952: done checking for max_fail_percentage 18285 1726853398.45953: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.45953: done checking to see if all hosts have failed 18285 1726853398.45954: getting the remaining hosts for this loop 18285 1726853398.45955: done getting the remaining hosts for this loop 18285 1726853398.45959: getting the next task for host managed_node1 18285 1726853398.45965: done getting next task for host managed_node1 18285 1726853398.45968: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18285 1726853398.45970: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18285 1726853398.45983: getting variables 18285 1726853398.45985: in VariableManager get_vars() 18285 1726853398.46020: Calling all_inventory to load vars for managed_node1 18285 1726853398.46023: Calling groups_inventory to load vars for managed_node1 18285 1726853398.46025: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.46035: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.46037: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.46040: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.46185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.46303: done with get_vars() 18285 1726853398.46311: done getting variables 18285 1726853398.46352: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:29:58 -0400 (0:00:00.052) 0:00:04.400 ****** 18285 1726853398.46374: entering _queue_task() for managed_node1/service 18285 1726853398.46588: worker is 1 (out of 1 available) 18285 1726853398.46601: exiting _queue_task() for managed_node1/service 18285 1726853398.46612: done queuing things up, now waiting for results queue to drain 18285 1726853398.46614: waiting for pending results... 
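Aside: note the `found_in_cache` flag in the `Loading ActionModule 'service'` lines: the first load reports `found_in_cache=False`, and every later load of the same plugin reports `found_in_cache=True`. A minimal memoizing-loader sketch of that pattern (illustrative only, not Ansible's `PluginLoader` implementation):

```python
# Stand-in for a plugin loader that caches modules after the first lookup.
class PluginLoader:
    def __init__(self):
        self._cache = {}

    def get(self, name):
        """Return (plugin, found_in_cache) for the named plugin."""
        found_in_cache = name in self._cache
        if not found_in_cache:
            self._cache[name] = object()  # stand-in for the loaded module
        return self._cache[name], found_in_cache

loader = PluginLoader()
_, first = loader.get("service")   # found_in_cache=False on first load
_, second = loader.get("service")  # found_in_cache=True afterwards
```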
18285 1726853398.46968: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18285 1726853398.47020: in run() - task 02083763-bbaf-9200-7ca6-000000000025 18285 1726853398.47041: variable 'ansible_search_path' from source: unknown 18285 1726853398.47073: variable 'ansible_search_path' from source: unknown 18285 1726853398.47115: calling self._execute() 18285 1726853398.47284: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.47580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.47584: variable 'omit' from source: magic vars 18285 1726853398.48067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.49653: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.49698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.49925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.49946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.49970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.50141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.50145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.50147: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.50178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.50197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.50333: variable 'ansible_distribution' from source: facts 18285 1726853398.50345: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.50377: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.50386: when evaluation is False, skipping this task 18285 1726853398.50393: _execute() done 18285 1726853398.50400: dumping result to json 18285 1726853398.50408: done dumping result, returning 18285 1726853398.50480: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-9200-7ca6-000000000025] 18285 1726853398.50484: sending task result for task 02083763-bbaf-9200-7ca6-000000000025 18285 1726853398.50556: done sending task result for task 02083763-bbaf-9200-7ca6-000000000025 18285 1726853398.50560: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18285 1726853398.50711: no more pending results, returning what we have 18285 1726853398.50714: results queue empty 18285 1726853398.50715: checking for any_errors_fatal 18285 1726853398.50721: done checking for any_errors_fatal 18285 1726853398.50722: checking for 
max_fail_percentage 18285 1726853398.50723: done checking for max_fail_percentage 18285 1726853398.50724: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.50725: done checking to see if all hosts have failed 18285 1726853398.50726: getting the remaining hosts for this loop 18285 1726853398.50727: done getting the remaining hosts for this loop 18285 1726853398.50730: getting the next task for host managed_node1 18285 1726853398.50735: done getting next task for host managed_node1 18285 1726853398.50738: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18285 1726853398.50740: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853398.50752: getting variables 18285 1726853398.50753: in VariableManager get_vars() 18285 1726853398.50791: Calling all_inventory to load vars for managed_node1 18285 1726853398.50794: Calling groups_inventory to load vars for managed_node1 18285 1726853398.50796: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.50804: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.50807: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.50809: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.51021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.51225: done with get_vars() 18285 1726853398.51234: done getting variables 18285 1726853398.51291: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:29:58 -0400 (0:00:00.049) 0:00:04.449 ****** 18285 1726853398.51325: entering _queue_task() for managed_node1/service 18285 1726853398.51600: worker is 1 (out of 1 available) 18285 1726853398.51612: exiting _queue_task() for managed_node1/service 18285 1726853398.51624: done queuing things up, now waiting for results queue to drain 18285 1726853398.51626: waiting for pending results... 18285 1726853398.52001: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18285 1726853398.52006: in run() - task 02083763-bbaf-9200-7ca6-000000000026 18285 1726853398.52078: variable 'ansible_search_path' from source: unknown 18285 1726853398.52082: variable 'ansible_search_path' from source: unknown 18285 1726853398.52084: calling self._execute() 18285 1726853398.52148: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.52160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.52177: variable 'omit' from source: magic vars 18285 1726853398.52640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.55132: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.55467: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.55473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 
1726853398.55476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.55479: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.55647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.55736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.55842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.55940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.55961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.56275: variable 'ansible_distribution' from source: facts 18285 1726853398.56287: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.56310: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.56317: when evaluation is False, skipping this task 18285 1726853398.56377: _execute() done 18285 1726853398.56381: dumping result to json 18285 1726853398.56383: done dumping result, returning 18285 1726853398.56392: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-9200-7ca6-000000000026] 18285 1726853398.56402: sending task result for task 02083763-bbaf-9200-7ca6-000000000026 18285 1726853398.56658: done sending task result for task 02083763-bbaf-9200-7ca6-000000000026 18285 1726853398.56661: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.56713: no more pending results, returning what we have 18285 1726853398.56717: results queue empty 18285 1726853398.56718: checking for any_errors_fatal 18285 1726853398.56724: done checking for any_errors_fatal 18285 1726853398.56725: checking for max_fail_percentage 18285 1726853398.56727: done checking for max_fail_percentage 18285 1726853398.56728: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.56729: done checking to see if all hosts have failed 18285 1726853398.56729: getting the remaining hosts for this loop 18285 1726853398.56731: done getting the remaining hosts for this loop 18285 1726853398.56735: getting the next task for host managed_node1 18285 1726853398.56741: done getting next task for host managed_node1 18285 1726853398.56745: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18285 1726853398.56747: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.56760: getting variables 18285 1726853398.56763: in VariableManager get_vars() 18285 1726853398.56807: Calling all_inventory to load vars for managed_node1 18285 1726853398.56810: Calling groups_inventory to load vars for managed_node1 18285 1726853398.56813: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.56825: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.56828: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.56831: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.57265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.57469: done with get_vars() 18285 1726853398.57481: done getting variables 18285 1726853398.57545: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:29:58 -0400 (0:00:00.062) 0:00:04.512 ****** 18285 1726853398.57575: entering _queue_task() for managed_node1/service 18285 1726853398.57963: worker is 1 (out of 1 available) 18285 1726853398.57977: exiting _queue_task() for managed_node1/service 18285 1726853398.57987: done queuing things up, now waiting for results queue to drain 18285 1726853398.57988: waiting for pending results... 
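The skip recorded above is driven by a `when:` guard whose expression the executor prints verbatim ("Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False"). As a minimal sketch, a guarded service task of this shape could look as follows — the guard string and task name are taken from the log, but the module arguments are illustrative and not copied from the role source:

```yaml
# Illustrative only: module arguments are assumed; the real task lives at
# roles/network/tasks/main.yml:133 in the fedora.linux_system_roles collection.
- name: Enable and start wpa_supplicant
  service:
    name: wpa_supplicant
    state: started
    enabled: true
  # Guard exactly as printed by the executor in the log above:
  when:
    - ansible_distribution in ['CentOS','RedHat']
    - ansible_distribution_major_version | int < 9
```

On any distribution other than CentOS/RHEL with a major version below 9, the guard evaluates to False, which produces exactly the `"skip_reason": "Conditional result was False"` payload recorded in the log.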
18285 1726853398.58227: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18285 1726853398.58276: in run() - task 02083763-bbaf-9200-7ca6-000000000027 18285 1726853398.58283: variable 'ansible_search_path' from source: unknown 18285 1726853398.58286: variable 'ansible_search_path' from source: unknown 18285 1726853398.58308: calling self._execute() 18285 1726853398.58397: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.58430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.58433: variable 'omit' from source: magic vars 18285 1726853398.58937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.62093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.62097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.62100: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.62232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.62333: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.62662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.62666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.62668: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.62785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.62807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.63046: variable 'ansible_distribution' from source: facts 18285 1726853398.63059: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.63084: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.63203: when evaluation is False, skipping this task 18285 1726853398.63206: _execute() done 18285 1726853398.63209: dumping result to json 18285 1726853398.63211: done dumping result, returning 18285 1726853398.63214: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-9200-7ca6-000000000027] 18285 1726853398.63216: sending task result for task 02083763-bbaf-9200-7ca6-000000000027 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18285 1726853398.63359: no more pending results, returning what we have 18285 1726853398.63363: results queue empty 18285 1726853398.63365: checking for any_errors_fatal 18285 1726853398.63373: done checking for any_errors_fatal 18285 1726853398.63374: checking for max_fail_percentage 18285 1726853398.63376: done checking for max_fail_percentage 18285 1726853398.63376: checking to see if all hosts have failed and the 
running result is not ok 18285 1726853398.63377: done checking to see if all hosts have failed 18285 1726853398.63378: getting the remaining hosts for this loop 18285 1726853398.63380: done getting the remaining hosts for this loop 18285 1726853398.63384: getting the next task for host managed_node1 18285 1726853398.63391: done getting next task for host managed_node1 18285 1726853398.63394: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18285 1726853398.63396: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853398.63409: getting variables 18285 1726853398.63411: in VariableManager get_vars() 18285 1726853398.63454: Calling all_inventory to load vars for managed_node1 18285 1726853398.63457: Calling groups_inventory to load vars for managed_node1 18285 1726853398.63460: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.63731: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.63736: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.63740: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.64221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.64420: done with get_vars() 18285 1726853398.64430: done getting variables 18285 1726853398.64461: done sending task result for task 02083763-bbaf-9200-7ca6-000000000027 18285 1726853398.64464: WORKER PROCESS EXITING 18285 1726853398.64505: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:29:58 -0400 (0:00:00.069) 0:00:04.581 ****** 18285 1726853398.64532: entering _queue_task() for managed_node1/copy 18285 1726853398.64797: worker is 1 (out of 1 available) 18285 1726853398.64902: exiting _queue_task() for managed_node1/copy 18285 1726853398.64912: done queuing things up, now waiting for results queue to drain 18285 1726853398.65027: waiting for pending results... 18285 1726853398.65350: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18285 1726853398.65481: in run() - task 02083763-bbaf-9200-7ca6-000000000028 18285 1726853398.65503: variable 'ansible_search_path' from source: unknown 18285 1726853398.65511: variable 'ansible_search_path' from source: unknown 18285 1726853398.65552: calling self._execute() 18285 1726853398.65639: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.65649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.65662: variable 'omit' from source: magic vars 18285 1726853398.66099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.69564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.69640: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.69683: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.69726: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.69761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.69844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.69880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.69909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.69957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.69980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.70144: variable 'ansible_distribution' from source: facts 18285 1726853398.70147: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.70149: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.70152: when evaluation is False, skipping this task 18285 1726853398.70154: _execute() done 18285 1726853398.70157: dumping result to json 18285 1726853398.70165: done dumping result, returning 18285 
1726853398.70179: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-9200-7ca6-000000000028] 18285 1726853398.70187: sending task result for task 02083763-bbaf-9200-7ca6-000000000028 18285 1726853398.70327: done sending task result for task 02083763-bbaf-9200-7ca6-000000000028 18285 1726853398.70330: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.70405: no more pending results, returning what we have 18285 1726853398.70409: results queue empty 18285 1726853398.70410: checking for any_errors_fatal 18285 1726853398.70414: done checking for any_errors_fatal 18285 1726853398.70415: checking for max_fail_percentage 18285 1726853398.70417: done checking for max_fail_percentage 18285 1726853398.70418: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.70418: done checking to see if all hosts have failed 18285 1726853398.70419: getting the remaining hosts for this loop 18285 1726853398.70421: done getting the remaining hosts for this loop 18285 1726853398.70425: getting the next task for host managed_node1 18285 1726853398.70430: done getting next task for host managed_node1 18285 1726853398.70433: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18285 1726853398.70435: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.70448: getting variables 18285 1726853398.70450: in VariableManager get_vars() 18285 1726853398.70489: Calling all_inventory to load vars for managed_node1 18285 1726853398.70492: Calling groups_inventory to load vars for managed_node1 18285 1726853398.70494: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.70507: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.70511: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.70514: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.71203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.71694: done with get_vars() 18285 1726853398.71707: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:29:58 -0400 (0:00:00.072) 0:00:04.654 ****** 18285 1726853398.71830: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18285 1726853398.71832: Creating lock for fedora.linux_system_roles.network_connections 18285 1726853398.72724: worker is 1 (out of 1 available) 18285 1726853398.72739: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18285 1726853398.72867: done queuing things up, now waiting for results queue to drain 18285 1726853398.72869: waiting for pending results... 
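Note the difference between the two skip payloads earlier in this run: the "Enable network service" task reported only a `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"` placeholder instead of its `false_condition`, because that task sets `no_log`. A hypothetical sketch of such a task (the service name is assumed; only the task name, module, and `no_log: true` are attested by the log):

```yaml
# Hypothetical: with no_log enabled, even a skipped result is replaced by
# the "output has been hidden" placeholder seen in this log.
- name: Enable network service
  service:
    name: NetworkManager   # assumed service name, not shown in the log
    enabled: true
  no_log: true
  when:
    - ansible_distribution in ['CentOS','RedHat']
    - ansible_distribution_major_version | int < 9
```

This is why censored skips carry no `false_condition` field: the entire result, including the reason for skipping, is suppressed.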
18285 1726853398.73091: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18285 1726853398.73096: in run() - task 02083763-bbaf-9200-7ca6-000000000029 18285 1726853398.73117: variable 'ansible_search_path' from source: unknown 18285 1726853398.73125: variable 'ansible_search_path' from source: unknown 18285 1726853398.73166: calling self._execute() 18285 1726853398.73254: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.73265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.73296: variable 'omit' from source: magic vars 18285 1726853398.73798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.77689: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.77849: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.77853: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.77887: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.77924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.78009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.78043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.78076: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.78127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.78224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.78286: variable 'ansible_distribution' from source: facts 18285 1726853398.78297: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.78318: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.78330: when evaluation is False, skipping this task 18285 1726853398.78340: _execute() done 18285 1726853398.78347: dumping result to json 18285 1726853398.78356: done dumping result, returning 18285 1726853398.78368: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-9200-7ca6-000000000029] 18285 1726853398.78380: sending task result for task 02083763-bbaf-9200-7ca6-000000000029 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.78595: no more pending results, returning what we have 18285 1726853398.78598: results queue empty 18285 1726853398.78599: checking for any_errors_fatal 18285 1726853398.78605: done checking for any_errors_fatal 18285 1726853398.78606: checking for max_fail_percentage 18285 1726853398.78608: done checking for max_fail_percentage 
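The "Configure networking connection profiles" task above is the first in this run to use a custom action plugin: the log shows "Creating lock for fedora.linux_system_roles.network_connections" before the same distribution guard skips it. A rough sketch of such a task, under the assumption that the plugin is fed a `connections` list — the parameter and variable names here are assumptions, not confirmed by this log:

```yaml
# Rough sketch; parameter and variable names are assumptions.
# The real task is at roles/network/tasks/main.yml:159.
- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    connections: "{{ network_connections }}"
  when:
    - ansible_distribution in ['CentOS','RedHat']
    - ansible_distribution_major_version | int < 9
```

The per-plugin lock is created once, on first use, which is why it appears here and again for `network_state` a few records later.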
18285 1726853398.78609: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.78609: done checking to see if all hosts have failed 18285 1726853398.78610: getting the remaining hosts for this loop 18285 1726853398.78612: done getting the remaining hosts for this loop 18285 1726853398.78616: getting the next task for host managed_node1 18285 1726853398.78622: done getting next task for host managed_node1 18285 1726853398.78627: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18285 1726853398.78629: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853398.78641: getting variables 18285 1726853398.78642: in VariableManager get_vars() 18285 1726853398.78686: Calling all_inventory to load vars for managed_node1 18285 1726853398.78689: Calling groups_inventory to load vars for managed_node1 18285 1726853398.78692: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.78705: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.78708: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.78711: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.79040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.79400: done with get_vars() 18285 1726853398.79410: done getting variables 18285 1726853398.79440: done sending task result for task 02083763-bbaf-9200-7ca6-000000000029 18285 1726853398.79443: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:29:58 -0400 (0:00:00.077) 0:00:04.732 ****** 18285 1726853398.79625: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18285 1726853398.79627: Creating lock for fedora.linux_system_roles.network_state 18285 1726853398.80253: worker is 1 (out of 1 available) 18285 1726853398.80380: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18285 1726853398.80390: done queuing things up, now waiting for results queue to drain 18285 1726853398.80391: waiting for pending results... 18285 1726853398.80897: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18285 1726853398.81006: in run() - task 02083763-bbaf-9200-7ca6-00000000002a 18285 1726853398.81034: variable 'ansible_search_path' from source: unknown 18285 1726853398.81040: variable 'ansible_search_path' from source: unknown 18285 1726853398.81085: calling self._execute() 18285 1726853398.81183: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.81194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.81207: variable 'omit' from source: magic vars 18285 1726853398.81651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.84126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.84198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.84239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.84284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 
1726853398.84311: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.84514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.84562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.84684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.84732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.84795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.85289: variable 'ansible_distribution' from source: facts 18285 1726853398.85292: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.85295: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.85297: when evaluation is False, skipping this task 18285 1726853398.85299: _execute() done 18285 1726853398.85301: dumping result to json 18285 1726853398.85303: done dumping result, returning 18285 1726853398.85305: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-9200-7ca6-00000000002a] 18285 1726853398.85311: sending task 
result for task 02083763-bbaf-9200-7ca6-00000000002a skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853398.85597: no more pending results, returning what we have 18285 1726853398.85600: results queue empty 18285 1726853398.85601: checking for any_errors_fatal 18285 1726853398.85606: done checking for any_errors_fatal 18285 1726853398.85607: checking for max_fail_percentage 18285 1726853398.85609: done checking for max_fail_percentage 18285 1726853398.85610: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.85610: done checking to see if all hosts have failed 18285 1726853398.85611: getting the remaining hosts for this loop 18285 1726853398.85613: done getting the remaining hosts for this loop 18285 1726853398.85617: getting the next task for host managed_node1 18285 1726853398.85623: done getting next task for host managed_node1 18285 1726853398.85626: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18285 1726853398.85628: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.85641: getting variables 18285 1726853398.85643: in VariableManager get_vars() 18285 1726853398.85812: Calling all_inventory to load vars for managed_node1 18285 1726853398.85815: Calling groups_inventory to load vars for managed_node1 18285 1726853398.85817: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.85830: done sending task result for task 02083763-bbaf-9200-7ca6-00000000002a 18285 1726853398.85833: WORKER PROCESS EXITING 18285 1726853398.85863: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.85867: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.85873: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.86276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.86475: done with get_vars() 18285 1726853398.86488: done getting variables 18285 1726853398.86550: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:29:58 -0400 (0:00:00.069) 0:00:04.802 ****** 18285 1726853398.86582: entering _queue_task() for managed_node1/debug 18285 1726853398.86909: worker is 1 (out of 1 available) 18285 1726853398.86920: exiting _queue_task() for managed_node1/debug 18285 1726853398.86931: done queuing things up, now waiting for results queue to drain 18285 1726853398.86976: waiting for pending results... 
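For the "Show stderr messages for the network_connections" task queued above, the log shows the `debug` action module being loaded before the same guard skips it. A sketch of what such a task might look like — only the task name, module, guard, and path (roles/network/tasks/main.yml:177) come from the log; the variable being printed is an assumption:

```yaml
# Illustrative sketch; the variable name is assumed.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr   # assumed variable name
  when:
    - ansible_distribution in ['CentOS','RedHat']
    - ansible_distribution_major_version | int < 9
```

Because `debug` never changes state, its skip payload in the log that follows carries only the `false_condition` field, with no `changed` key — unlike the `service` and `copy` skips earlier in the run.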
18285 1726853398.87273: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18285 1726853398.87288: in run() - task 02083763-bbaf-9200-7ca6-00000000002b 18285 1726853398.87308: variable 'ansible_search_path' from source: unknown 18285 1726853398.87315: variable 'ansible_search_path' from source: unknown 18285 1726853398.87356: calling self._execute() 18285 1726853398.87446: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.87457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.87473: variable 'omit' from source: magic vars 18285 1726853398.88426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.90639: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.90847: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.90850: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.90886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.90923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.91015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.91051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.91075: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.91101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.91112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.91219: variable 'ansible_distribution' from source: facts 18285 1726853398.91222: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.91238: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.91242: when evaluation is False, skipping this task 18285 1726853398.91246: _execute() done 18285 1726853398.91251: dumping result to json 18285 1726853398.91254: done dumping result, returning 18285 1726853398.91260: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-9200-7ca6-00000000002b] 18285 1726853398.91263: sending task result for task 02083763-bbaf-9200-7ca6-00000000002b 18285 1726853398.91346: done sending task result for task 02083763-bbaf-9200-7ca6-00000000002b 18285 1726853398.91351: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853398.91446: no more pending results, returning what we have 18285 1726853398.91451: results queue empty 18285 1726853398.91452: checking for any_errors_fatal 18285 1726853398.91458: done checking for any_errors_fatal 18285 1726853398.91458: checking 
for max_fail_percentage 18285 1726853398.91460: done checking for max_fail_percentage 18285 1726853398.91461: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.91462: done checking to see if all hosts have failed 18285 1726853398.91462: getting the remaining hosts for this loop 18285 1726853398.91464: done getting the remaining hosts for this loop 18285 1726853398.91467: getting the next task for host managed_node1 18285 1726853398.91474: done getting next task for host managed_node1 18285 1726853398.91478: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18285 1726853398.91484: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853398.91503: getting variables 18285 1726853398.91505: in VariableManager get_vars() 18285 1726853398.91540: Calling all_inventory to load vars for managed_node1 18285 1726853398.91542: Calling groups_inventory to load vars for managed_node1 18285 1726853398.91544: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.91555: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.91557: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.91560: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.92081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.92293: done with get_vars() 18285 1726853398.92303: done getting variables 18285 1726853398.92380: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:29:58 -0400 (0:00:00.058) 0:00:04.860 ****** 18285 1726853398.92410: entering _queue_task() for managed_node1/debug 18285 1726853398.92754: worker is 1 (out of 1 available) 18285 1726853398.92767: exiting _queue_task() for managed_node1/debug 18285 1726853398.92781: done queuing things up, now waiting for results queue to drain 18285 1726853398.92783: waiting for pending results... 18285 1726853398.93081: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18285 1726853398.93277: in run() - task 02083763-bbaf-9200-7ca6-00000000002c 18285 1726853398.93281: variable 'ansible_search_path' from source: unknown 18285 1726853398.93284: variable 'ansible_search_path' from source: unknown 18285 1726853398.93287: calling self._execute() 18285 1726853398.93309: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.93320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.93333: variable 'omit' from source: magic vars 18285 1726853398.93757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853398.95531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853398.95606: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853398.95647: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853398.95705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853398.95736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853398.95827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853398.96077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853398.96081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853398.96084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853398.96087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853398.96092: variable 'ansible_distribution' from source: facts 18285 1726853398.96104: variable 'ansible_distribution_major_version' from source: facts 18285 1726853398.96127: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853398.96137: when evaluation is False, skipping this task 18285 1726853398.96145: _execute() done 18285 1726853398.96155: dumping result to json 18285 1726853398.96164: done dumping result, returning 18285 
1726853398.96178: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-9200-7ca6-00000000002c] 18285 1726853398.96188: sending task result for task 02083763-bbaf-9200-7ca6-00000000002c skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853398.96345: no more pending results, returning what we have 18285 1726853398.96349: results queue empty 18285 1726853398.96349: checking for any_errors_fatal 18285 1726853398.96355: done checking for any_errors_fatal 18285 1726853398.96355: checking for max_fail_percentage 18285 1726853398.96357: done checking for max_fail_percentage 18285 1726853398.96358: checking to see if all hosts have failed and the running result is not ok 18285 1726853398.96358: done checking to see if all hosts have failed 18285 1726853398.96359: getting the remaining hosts for this loop 18285 1726853398.96360: done getting the remaining hosts for this loop 18285 1726853398.96364: getting the next task for host managed_node1 18285 1726853398.96370: done getting next task for host managed_node1 18285 1726853398.96375: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18285 1726853398.96377: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853398.96453: done sending task result for task 02083763-bbaf-9200-7ca6-00000000002c 18285 1726853398.96456: WORKER PROCESS EXITING 18285 1726853398.96464: getting variables 18285 1726853398.96466: in VariableManager get_vars() 18285 1726853398.96506: Calling all_inventory to load vars for managed_node1 18285 1726853398.96508: Calling groups_inventory to load vars for managed_node1 18285 1726853398.96510: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853398.96521: Calling all_plugins_play to load vars for managed_node1 18285 1726853398.96523: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853398.96526: Calling groups_plugins_play to load vars for managed_node1 18285 1726853398.96769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853398.97002: done with get_vars() 18285 1726853398.97013: done getting variables 18285 1726853398.97075: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:29:58 -0400 (0:00:00.046) 0:00:04.907 ****** 18285 1726853398.97105: entering _queue_task() for managed_node1/debug 18285 1726853398.97381: worker is 1 (out of 1 available) 18285 1726853398.97393: exiting _queue_task() for managed_node1/debug 18285 1726853398.97403: done queuing things up, now waiting for results queue to drain 18285 1726853398.97405: waiting for pending results... 
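Each `^ state is: HOST STATE: block=..., task=...` entry above dumps the strategy's per-host position as `key=value` pairs; the `task=` counter advances by one per skipped task. A small hypothetical parser for these lines (key names are taken verbatim from the log; the parser itself is illustrative, not part of ansible-core):

```python
import re

# Hypothetical helper: pull the key=value fields out of a "HOST STATE: ..."
# line as logged after each "^ state is:" entry.
def parse_host_state(line: str) -> dict:
    return {k: v for k, v in re.findall(r"(\w+)=(\w+)", line)}

state = parse_host_state(
    "HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, "
    "run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True"
)
print(state["block"], state["task"])  # 2 21
```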
18285 1726853398.97672: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18285 1726853398.97752: in run() - task 02083763-bbaf-9200-7ca6-00000000002d 18285 1726853398.97772: variable 'ansible_search_path' from source: unknown 18285 1726853398.97779: variable 'ansible_search_path' from source: unknown 18285 1726853398.97809: calling self._execute() 18285 1726853398.97896: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853398.97908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853398.97921: variable 'omit' from source: magic vars 18285 1726853398.98346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.00574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.00778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.00781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.00784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.00786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.00845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.00887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.00916: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853399.00967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.01000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.01151: variable 'ansible_distribution' from source: facts 18285 1726853399.01163: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.01188: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.01196: when evaluation is False, skipping this task 18285 1726853399.01203: _execute() done 18285 1726853399.01210: dumping result to json 18285 1726853399.01219: done dumping result, returning 18285 1726853399.01231: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-9200-7ca6-00000000002d] 18285 1726853399.01240: sending task result for task 02083763-bbaf-9200-7ca6-00000000002d skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853399.01412: no more pending results, returning what we have 18285 1726853399.01415: results queue empty 18285 1726853399.01416: checking for any_errors_fatal 18285 1726853399.01420: done checking for any_errors_fatal 18285 1726853399.01420: checking for max_fail_percentage 18285 1726853399.01422: done checking for max_fail_percentage 18285 1726853399.01423: checking to see if all hosts have failed 
and the running result is not ok 18285 1726853399.01423: done checking to see if all hosts have failed 18285 1726853399.01424: getting the remaining hosts for this loop 18285 1726853399.01426: done getting the remaining hosts for this loop 18285 1726853399.01430: getting the next task for host managed_node1 18285 1726853399.01436: done getting next task for host managed_node1 18285 1726853399.01439: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18285 1726853399.01441: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.01452: getting variables 18285 1726853399.01454: in VariableManager get_vars() 18285 1726853399.01491: Calling all_inventory to load vars for managed_node1 18285 1726853399.01493: Calling groups_inventory to load vars for managed_node1 18285 1726853399.01496: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.01509: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.01511: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.01515: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.01770: done sending task result for task 02083763-bbaf-9200-7ca6-00000000002d 18285 1726853399.01776: WORKER PROCESS EXITING 18285 1726853399.01800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.02003: done with get_vars() 18285 1726853399.02014: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:29:59 -0400 (0:00:00.050) 
0:00:04.957 ****** 18285 1726853399.02112: entering _queue_task() for managed_node1/ping 18285 1726853399.02113: Creating lock for ping 18285 1726853399.02392: worker is 1 (out of 1 available) 18285 1726853399.02403: exiting _queue_task() for managed_node1/ping 18285 1726853399.02527: done queuing things up, now waiting for results queue to drain 18285 1726853399.02530: waiting for pending results... 18285 1726853399.02697: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18285 1726853399.02806: in run() - task 02083763-bbaf-9200-7ca6-00000000002e 18285 1726853399.02826: variable 'ansible_search_path' from source: unknown 18285 1726853399.02833: variable 'ansible_search_path' from source: unknown 18285 1726853399.02890: calling self._execute() 18285 1726853399.02980: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.02993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.03007: variable 'omit' from source: magic vars 18285 1726853399.03451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.05712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.05805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.05848: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.06126: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.06133: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.06161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.06198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.06230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853399.06289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.06310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.06464: variable 'ansible_distribution' from source: facts 18285 1726853399.06478: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.06501: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.06508: when evaluation is False, skipping this task 18285 1726853399.06514: _execute() done 18285 1726853399.06519: dumping result to json 18285 1726853399.06526: done dumping result, returning 18285 1726853399.06537: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-9200-7ca6-00000000002e] 18285 1726853399.06545: sending task result for task 02083763-bbaf-9200-7ca6-00000000002e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional 
result was False" } 18285 1726853399.06717: no more pending results, returning what we have 18285 1726853399.06721: results queue empty 18285 1726853399.06723: checking for any_errors_fatal 18285 1726853399.06728: done checking for any_errors_fatal 18285 1726853399.06729: checking for max_fail_percentage 18285 1726853399.06731: done checking for max_fail_percentage 18285 1726853399.06731: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.06732: done checking to see if all hosts have failed 18285 1726853399.06733: getting the remaining hosts for this loop 18285 1726853399.06735: done getting the remaining hosts for this loop 18285 1726853399.06739: getting the next task for host managed_node1 18285 1726853399.06746: done getting next task for host managed_node1 18285 1726853399.06747: ^ task is: TASK: meta (role_complete) 18285 1726853399.06752: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.06766: getting variables 18285 1726853399.06768: in VariableManager get_vars() 18285 1726853399.06912: Calling all_inventory to load vars for managed_node1 18285 1726853399.06914: Calling groups_inventory to load vars for managed_node1 18285 1726853399.06917: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.06929: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.06932: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.06934: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.07375: done sending task result for task 02083763-bbaf-9200-7ca6-00000000002e 18285 1726853399.07378: WORKER PROCESS EXITING 18285 1726853399.07401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.07611: done with get_vars() 18285 1726853399.07622: done getting variables 18285 1726853399.07715: done queuing things up, now waiting for results queue to drain 18285 1726853399.07717: results queue empty 18285 1726853399.07718: checking for any_errors_fatal 18285 1726853399.07720: done checking for any_errors_fatal 18285 1726853399.07721: checking for max_fail_percentage 18285 1726853399.07722: done checking for max_fail_percentage 18285 1726853399.07723: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.07724: done checking to see if all hosts have failed 18285 1726853399.07724: getting the remaining hosts for this loop 18285 1726853399.07725: done getting the remaining hosts for this loop 18285 1726853399.07728: getting the next task for host managed_node1 18285 1726853399.07731: done getting next task for host managed_node1 18285 1726853399.07734: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18285 1726853399.07740: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.07743: getting variables 18285 1726853399.07744: in VariableManager get_vars() 18285 1726853399.07758: Calling all_inventory to load vars for managed_node1 18285 1726853399.07760: Calling groups_inventory to load vars for managed_node1 18285 1726853399.07762: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.07769: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.07773: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.07776: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.07919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.08117: done with get_vars() 18285 1726853399.08126: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 13:29:59 -0400 (0:00:00.060) 0:00:05.018 ****** 18285 1726853399.08205: entering _queue_task() for managed_node1/include_tasks 18285 1726853399.08615: worker is 1 (out of 1 available) 18285 1726853399.08627: exiting _queue_task() for managed_node1/include_tasks 18285 1726853399.08637: done queuing things up, now waiting for results queue to drain 18285 1726853399.08639: waiting for pending results... 
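The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` sequence repeated throughout this log is the strategy handing each task to a worker and draining a results queue. A minimal sketch of that pattern, assuming a thread-based stand-in — ansible-core actually uses forked WorkerProcess instances, not threads:

```python
import queue
import threading

# Illustrative queue/worker flow only; not ansible-core's
# TaskQueueManager/WorkerProcess implementation.
def worker(task_q: "queue.Queue", result_q: "queue.Queue") -> None:
    task = task_q.get()
    # A task whose conditional is False reports a skip result
    # instead of executing its action.
    result_q.put({"task": task, "skipped": True,
                  "skip_reason": "Conditional result was False"})

task_q, result_q = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(task_q, result_q))
t.start()
task_q.put("Re-test connectivity")   # _queue_task(): hand work to the worker
print(result_q.get()["skipped"])     # drain the results queue -> True
t.join()
```

Once the results queue is empty, the strategy runs the bookkeeping chain visible after every task here: any_errors_fatal, max_fail_percentage, remaining hosts, then the next task.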
18285 1726853399.08816: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18285 1726853399.08926: in run() - task 02083763-bbaf-9200-7ca6-000000000030 18285 1726853399.08954: variable 'ansible_search_path' from source: unknown 18285 1726853399.08999: calling self._execute() 18285 1726853399.09098: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.09109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.09122: variable 'omit' from source: magic vars 18285 1726853399.09851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.12256: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.12377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.12381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.12438: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.12473: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.12564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.12600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.12743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853399.12748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.12753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.12838: variable 'ansible_distribution' from source: facts 18285 1726853399.12853: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.12883: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.12890: when evaluation is False, skipping this task 18285 1726853399.12896: _execute() done 18285 1726853399.12903: dumping result to json 18285 1726853399.12910: done dumping result, returning 18285 1726853399.12921: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [02083763-bbaf-9200-7ca6-000000000030] 18285 1726853399.12931: sending task result for task 02083763-bbaf-9200-7ca6-000000000030 18285 1726853399.13048: done sending task result for task 02083763-bbaf-9200-7ca6-000000000030 18285 1726853399.13054: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853399.13138: no more pending results, returning what we have 18285 1726853399.13141: results queue empty 18285 1726853399.13143: checking for any_errors_fatal 18285 1726853399.13144: done checking for any_errors_fatal 18285 1726853399.13145: checking for max_fail_percentage 
18285 1726853399.13147: done checking for max_fail_percentage 18285 1726853399.13148: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.13151: done checking to see if all hosts have failed 18285 1726853399.13152: getting the remaining hosts for this loop 18285 1726853399.13154: done getting the remaining hosts for this loop 18285 1726853399.13159: getting the next task for host managed_node1 18285 1726853399.13167: done getting next task for host managed_node1 18285 1726853399.13172: ^ task is: TASK: meta (flush_handlers) 18285 1726853399.13174: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.13179: getting variables 18285 1726853399.13181: in VariableManager get_vars() 18285 1726853399.13400: Calling all_inventory to load vars for managed_node1 18285 1726853399.13403: Calling groups_inventory to load vars for managed_node1 18285 1726853399.13405: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.13416: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.13419: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.13421: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.13686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.14009: done with get_vars() 18285 1726853399.14023: done getting variables 18285 1726853399.14091: in VariableManager get_vars() 18285 1726853399.14103: Calling all_inventory to load vars for managed_node1 18285 1726853399.14105: Calling groups_inventory to load vars for managed_node1 18285 1726853399.14107: Calling all_plugins_inventory to load vars for managed_node1 
18285 1726853399.14111: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.14113: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.14116: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.14255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.14437: done with get_vars() 18285 1726853399.14456: done queuing things up, now waiting for results queue to drain 18285 1726853399.14458: results queue empty 18285 1726853399.14459: checking for any_errors_fatal 18285 1726853399.14461: done checking for any_errors_fatal 18285 1726853399.14462: checking for max_fail_percentage 18285 1726853399.14463: done checking for max_fail_percentage 18285 1726853399.14464: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.14464: done checking to see if all hosts have failed 18285 1726853399.14465: getting the remaining hosts for this loop 18285 1726853399.14466: done getting the remaining hosts for this loop 18285 1726853399.14468: getting the next task for host managed_node1 18285 1726853399.14473: done getting next task for host managed_node1 18285 1726853399.14475: ^ task is: TASK: meta (flush_handlers) 18285 1726853399.14476: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.14478: getting variables 18285 1726853399.14479: in VariableManager get_vars() 18285 1726853399.14488: Calling all_inventory to load vars for managed_node1 18285 1726853399.14490: Calling groups_inventory to load vars for managed_node1 18285 1726853399.14496: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.14500: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.14502: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.14504: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.14653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.15113: done with get_vars() 18285 1726853399.15120: done getting variables 18285 1726853399.15287: in VariableManager get_vars() 18285 1726853399.15298: Calling all_inventory to load vars for managed_node1 18285 1726853399.15300: Calling groups_inventory to load vars for managed_node1 18285 1726853399.15302: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.15306: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.15308: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.15311: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.15554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.15802: done with get_vars() 18285 1726853399.15814: done queuing things up, now waiting for results queue to drain 18285 1726853399.15816: results queue empty 18285 1726853399.15817: checking for any_errors_fatal 18285 1726853399.15818: done checking for any_errors_fatal 18285 1726853399.15818: checking for max_fail_percentage 18285 1726853399.15819: done checking for max_fail_percentage 18285 1726853399.15820: checking to see if all hosts have failed and the running result is not 
ok 18285 1726853399.15820: done checking to see if all hosts have failed 18285 1726853399.15821: getting the remaining hosts for this loop 18285 1726853399.15822: done getting the remaining hosts for this loop 18285 1726853399.15824: getting the next task for host managed_node1 18285 1726853399.15828: done getting next task for host managed_node1 18285 1726853399.15829: ^ task is: None 18285 1726853399.15830: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.15839: done queuing things up, now waiting for results queue to drain 18285 1726853399.15841: results queue empty 18285 1726853399.15841: checking for any_errors_fatal 18285 1726853399.15842: done checking for any_errors_fatal 18285 1726853399.15843: checking for max_fail_percentage 18285 1726853399.15844: done checking for max_fail_percentage 18285 1726853399.15845: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.15845: done checking to see if all hosts have failed 18285 1726853399.15847: getting the next task for host managed_node1 18285 1726853399.15849: done getting next task for host managed_node1 18285 1726853399.15850: ^ task is: None 18285 1726853399.15851: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.15890: in VariableManager get_vars() 18285 1726853399.15905: done with get_vars() 18285 1726853399.15910: in VariableManager get_vars() 18285 1726853399.15918: done with get_vars() 18285 1726853399.15922: variable 'omit' from source: magic vars 18285 1726853399.15957: in VariableManager get_vars() 18285 1726853399.15967: done with get_vars() 18285 1726853399.15991: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18285 1726853399.16166: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853399.16189: getting the remaining hosts for this loop 18285 1726853399.16190: done getting the remaining hosts for this loop 18285 1726853399.16192: getting the next task for host managed_node1 18285 1726853399.16195: done getting next task for host managed_node1 18285 1726853399.16197: ^ task is: TASK: Gathering Facts 18285 1726853399.16198: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.16199: getting variables 18285 1726853399.16200: in VariableManager get_vars() 18285 1726853399.16207: Calling all_inventory to load vars for managed_node1 18285 1726853399.16209: Calling groups_inventory to load vars for managed_node1 18285 1726853399.16211: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.16215: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.16216: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.16219: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.16378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.16533: done with get_vars() 18285 1726853399.16540: done getting variables 18285 1726853399.16577: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 13:29:59 -0400 (0:00:00.083) 0:00:05.102 ****** 18285 1726853399.16604: entering _queue_task() for managed_node1/gather_facts 18285 1726853399.16887: worker is 1 (out of 1 available) 18285 1726853399.16900: exiting _queue_task() for managed_node1/gather_facts 18285 1726853399.16911: done queuing things up, now waiting for results queue to drain 18285 1726853399.16913: waiting for pending results... 
18285 1726853399.17164: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853399.17196: in run() - task 02083763-bbaf-9200-7ca6-000000000197 18285 1726853399.17217: variable 'ansible_search_path' from source: unknown 18285 1726853399.17266: calling self._execute() 18285 1726853399.17353: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.17381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.17396: variable 'omit' from source: magic vars 18285 1726853399.18300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.21432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.21510: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.21556: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.21604: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.21631: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.21721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.21772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.21807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853399.21854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.21876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.22025: variable 'ansible_distribution' from source: facts 18285 1726853399.22037: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.22063: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.22070: when evaluation is False, skipping this task 18285 1726853399.22080: _execute() done 18285 1726853399.22087: dumping result to json 18285 1726853399.22097: done dumping result, returning 18285 1726853399.22111: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-000000000197] 18285 1726853399.22220: sending task result for task 02083763-bbaf-9200-7ca6-000000000197 18285 1726853399.22287: done sending task result for task 02083763-bbaf-9200-7ca6-000000000197 18285 1726853399.22290: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853399.22337: no more pending results, returning what we have 18285 1726853399.22341: results queue empty 18285 1726853399.22342: checking for any_errors_fatal 18285 1726853399.22343: done checking for any_errors_fatal 18285 1726853399.22343: checking for max_fail_percentage 18285 1726853399.22345: done checking for max_fail_percentage 18285 1726853399.22346: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853399.22346: done checking to see if all hosts have failed 18285 1726853399.22347: getting the remaining hosts for this loop 18285 1726853399.22348: done getting the remaining hosts for this loop 18285 1726853399.22352: getting the next task for host managed_node1 18285 1726853399.22358: done getting next task for host managed_node1 18285 1726853399.22359: ^ task is: TASK: meta (flush_handlers) 18285 1726853399.22361: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.22364: getting variables 18285 1726853399.22366: in VariableManager get_vars() 18285 1726853399.22395: Calling all_inventory to load vars for managed_node1 18285 1726853399.22398: Calling groups_inventory to load vars for managed_node1 18285 1726853399.22402: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.22414: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.22417: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.22419: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.22687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.22977: done with get_vars() 18285 1726853399.22992: done getting variables 18285 1726853399.23062: in VariableManager get_vars() 18285 1726853399.23095: Calling all_inventory to load vars for managed_node1 18285 1726853399.23097: Calling groups_inventory to load vars for managed_node1 18285 1726853399.23099: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.23104: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.23106: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853399.23110: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.23491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.23895: done with get_vars() 18285 1726853399.23907: done queuing things up, now waiting for results queue to drain 18285 1726853399.23909: results queue empty 18285 1726853399.23910: checking for any_errors_fatal 18285 1726853399.23912: done checking for any_errors_fatal 18285 1726853399.23913: checking for max_fail_percentage 18285 1726853399.23914: done checking for max_fail_percentage 18285 1726853399.23914: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.23915: done checking to see if all hosts have failed 18285 1726853399.23916: getting the remaining hosts for this loop 18285 1726853399.23917: done getting the remaining hosts for this loop 18285 1726853399.23919: getting the next task for host managed_node1 18285 1726853399.23922: done getting next task for host managed_node1 18285 1726853399.23924: ^ task is: TASK: Show network_provider 18285 1726853399.23926: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.23927: getting variables 18285 1726853399.23928: in VariableManager get_vars() 18285 1726853399.23936: Calling all_inventory to load vars for managed_node1 18285 1726853399.23938: Calling groups_inventory to load vars for managed_node1 18285 1726853399.23940: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.23945: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.23956: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.23959: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.24285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.24537: done with get_vars() 18285 1726853399.24546: done getting variables 18285 1726853399.24593: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 13:29:59 -0400 (0:00:00.080) 0:00:05.182 ****** 18285 1726853399.24627: entering _queue_task() for managed_node1/debug 18285 1726853399.24955: worker is 1 (out of 1 available) 18285 1726853399.24968: exiting _queue_task() for managed_node1/debug 18285 1726853399.24982: done queuing things up, now waiting for results queue to drain 18285 1726853399.24983: waiting for pending results... 
18285 1726853399.25202: running TaskExecutor() for managed_node1/TASK: Show network_provider 18285 1726853399.25385: in run() - task 02083763-bbaf-9200-7ca6-000000000033 18285 1726853399.25389: variable 'ansible_search_path' from source: unknown 18285 1726853399.25393: calling self._execute() 18285 1726853399.25497: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.25503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.25510: variable 'omit' from source: magic vars 18285 1726853399.25972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.27641: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.27688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.27715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.27740: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.27762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.27822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.27843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.27863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853399.27925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.27928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.28014: variable 'ansible_distribution' from source: facts 18285 1726853399.28017: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.28033: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.28036: when evaluation is False, skipping this task 18285 1726853399.28039: _execute() done 18285 1726853399.28042: dumping result to json 18285 1726853399.28044: done dumping result, returning 18285 1726853399.28051: done running TaskExecutor() for managed_node1/TASK: Show network_provider [02083763-bbaf-9200-7ca6-000000000033] 18285 1726853399.28059: sending task result for task 02083763-bbaf-9200-7ca6-000000000033 18285 1726853399.28140: done sending task result for task 02083763-bbaf-9200-7ca6-000000000033 18285 1726853399.28143: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853399.28189: no more pending results, returning what we have 18285 1726853399.28193: results queue empty 18285 1726853399.28193: checking for any_errors_fatal 18285 1726853399.28195: done checking for any_errors_fatal 18285 1726853399.28196: checking for max_fail_percentage 18285 1726853399.28197: done checking for max_fail_percentage 18285 1726853399.28198: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.28199: 
done checking to see if all hosts have failed 18285 1726853399.28199: getting the remaining hosts for this loop 18285 1726853399.28201: done getting the remaining hosts for this loop 18285 1726853399.28204: getting the next task for host managed_node1 18285 1726853399.28210: done getting next task for host managed_node1 18285 1726853399.28211: ^ task is: TASK: meta (flush_handlers) 18285 1726853399.28213: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.28217: getting variables 18285 1726853399.28218: in VariableManager get_vars() 18285 1726853399.28245: Calling all_inventory to load vars for managed_node1 18285 1726853399.28247: Calling groups_inventory to load vars for managed_node1 18285 1726853399.28251: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.28264: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.28266: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.28269: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.28458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.28639: done with get_vars() 18285 1726853399.28651: done getting variables 18285 1726853399.28706: in VariableManager get_vars() 18285 1726853399.28713: Calling all_inventory to load vars for managed_node1 18285 1726853399.28714: Calling groups_inventory to load vars for managed_node1 18285 1726853399.28716: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.28719: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.28720: Calling groups_plugins_inventory to load vars for managed_node1 18285 
1726853399.28722: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.28851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.28955: done with get_vars() 18285 1726853399.28964: done queuing things up, now waiting for results queue to drain 18285 1726853399.28965: results queue empty 18285 1726853399.28966: checking for any_errors_fatal 18285 1726853399.28967: done checking for any_errors_fatal 18285 1726853399.28968: checking for max_fail_percentage 18285 1726853399.28968: done checking for max_fail_percentage 18285 1726853399.28969: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.28969: done checking to see if all hosts have failed 18285 1726853399.28970: getting the remaining hosts for this loop 18285 1726853399.28972: done getting the remaining hosts for this loop 18285 1726853399.28974: getting the next task for host managed_node1 18285 1726853399.28976: done getting next task for host managed_node1 18285 1726853399.28977: ^ task is: TASK: meta (flush_handlers) 18285 1726853399.28978: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.28980: getting variables 18285 1726853399.28980: in VariableManager get_vars() 18285 1726853399.28985: Calling all_inventory to load vars for managed_node1 18285 1726853399.28986: Calling groups_inventory to load vars for managed_node1 18285 1726853399.28987: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.28990: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.28996: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.28997: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.29080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.29184: done with get_vars() 18285 1726853399.29190: done getting variables 18285 1726853399.29218: in VariableManager get_vars() 18285 1726853399.29224: Calling all_inventory to load vars for managed_node1 18285 1726853399.29226: Calling groups_inventory to load vars for managed_node1 18285 1726853399.29228: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.29231: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.29232: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.29234: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.29326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.29499: done with get_vars() 18285 1726853399.29509: done queuing things up, now waiting for results queue to drain 18285 1726853399.29510: results queue empty 18285 1726853399.29511: checking for any_errors_fatal 18285 1726853399.29512: done checking for any_errors_fatal 18285 1726853399.29512: checking for max_fail_percentage 18285 1726853399.29513: done checking for max_fail_percentage 18285 1726853399.29514: checking to see if all hosts have failed and the running result is not 
ok 18285 1726853399.29515: done checking to see if all hosts have failed 18285 1726853399.29515: getting the remaining hosts for this loop 18285 1726853399.29516: done getting the remaining hosts for this loop 18285 1726853399.29518: getting the next task for host managed_node1 18285 1726853399.29520: done getting next task for host managed_node1 18285 1726853399.29521: ^ task is: None 18285 1726853399.29522: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.29523: done queuing things up, now waiting for results queue to drain 18285 1726853399.29524: results queue empty 18285 1726853399.29524: checking for any_errors_fatal 18285 1726853399.29525: done checking for any_errors_fatal 18285 1726853399.29526: checking for max_fail_percentage 18285 1726853399.29527: done checking for max_fail_percentage 18285 1726853399.29527: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.29528: done checking to see if all hosts have failed 18285 1726853399.29529: getting the next task for host managed_node1 18285 1726853399.29530: done getting next task for host managed_node1 18285 1726853399.29531: ^ task is: None 18285 1726853399.29532: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.29560: in VariableManager get_vars() 18285 1726853399.29580: done with get_vars() 18285 1726853399.29586: in VariableManager get_vars() 18285 1726853399.29596: done with get_vars() 18285 1726853399.29600: variable 'omit' from source: magic vars 18285 1726853399.29711: variable 'profile' from source: play vars 18285 1726853399.29837: in VariableManager get_vars() 18285 1726853399.29854: done with get_vars() 18285 1726853399.29876: variable 'omit' from source: magic vars 18285 1726853399.29943: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18285 1726853399.30342: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853399.30363: getting the remaining hosts for this loop 18285 1726853399.30364: done getting the remaining hosts for this loop 18285 1726853399.30366: getting the next task for host managed_node1 18285 1726853399.30367: done getting next task for host managed_node1 18285 1726853399.30369: ^ task is: TASK: Gathering Facts 18285 1726853399.30370: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.30375: getting variables 18285 1726853399.30376: in VariableManager get_vars() 18285 1726853399.30383: Calling all_inventory to load vars for managed_node1 18285 1726853399.30385: Calling groups_inventory to load vars for managed_node1 18285 1726853399.30386: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.30389: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.30391: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.30392: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.30495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.30599: done with get_vars() 18285 1726853399.30605: done getting variables 18285 1726853399.30630: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:29:59 -0400 (0:00:00.060) 0:00:05.242 ****** 18285 1726853399.30646: entering _queue_task() for managed_node1/gather_facts 18285 1726853399.30851: worker is 1 (out of 1 available) 18285 1726853399.30861: exiting _queue_task() for managed_node1/gather_facts 18285 1726853399.30873: done queuing things up, now waiting for results queue to drain 18285 1726853399.30875: waiting for pending results... 
18285 1726853399.31029: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853399.31088: in run() - task 02083763-bbaf-9200-7ca6-0000000001ac 18285 1726853399.31105: variable 'ansible_search_path' from source: unknown 18285 1726853399.31129: calling self._execute() 18285 1726853399.31190: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.31195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.31204: variable 'omit' from source: magic vars 18285 1726853399.31497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.33119: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.33160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.33190: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.33215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.33235: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.33296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.33316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.33333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853399.33361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.33373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.33462: variable 'ansible_distribution' from source: facts 18285 1726853399.33466: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.33482: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.33486: when evaluation is False, skipping this task 18285 1726853399.33488: _execute() done 18285 1726853399.33492: dumping result to json 18285 1726853399.33494: done dumping result, returning 18285 1726853399.33506: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-0000000001ac] 18285 1726853399.33509: sending task result for task 02083763-bbaf-9200-7ca6-0000000001ac 18285 1726853399.33577: done sending task result for task 02083763-bbaf-9200-7ca6-0000000001ac 18285 1726853399.33580: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853399.33651: no more pending results, returning what we have 18285 1726853399.33654: results queue empty 18285 1726853399.33655: checking for any_errors_fatal 18285 1726853399.33656: done checking for any_errors_fatal 18285 1726853399.33657: checking for max_fail_percentage 18285 1726853399.33658: done checking for max_fail_percentage 18285 1726853399.33659: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853399.33660: done checking to see if all hosts have failed 18285 1726853399.33661: getting the remaining hosts for this loop 18285 1726853399.33662: done getting the remaining hosts for this loop 18285 1726853399.33666: getting the next task for host managed_node1 18285 1726853399.33673: done getting next task for host managed_node1 18285 1726853399.33675: ^ task is: TASK: meta (flush_handlers) 18285 1726853399.33677: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.33680: getting variables 18285 1726853399.33681: in VariableManager get_vars() 18285 1726853399.33715: Calling all_inventory to load vars for managed_node1 18285 1726853399.33717: Calling groups_inventory to load vars for managed_node1 18285 1726853399.33719: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.33728: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.33730: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.33732: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.33861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.33976: done with get_vars() 18285 1726853399.33984: done getting variables 18285 1726853399.34029: in VariableManager get_vars() 18285 1726853399.34037: Calling all_inventory to load vars for managed_node1 18285 1726853399.34038: Calling groups_inventory to load vars for managed_node1 18285 1726853399.34039: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.34042: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.34043: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853399.34045: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.34145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.34257: done with get_vars() 18285 1726853399.34266: done queuing things up, now waiting for results queue to drain 18285 1726853399.34267: results queue empty 18285 1726853399.34267: checking for any_errors_fatal 18285 1726853399.34268: done checking for any_errors_fatal 18285 1726853399.34269: checking for max_fail_percentage 18285 1726853399.34270: done checking for max_fail_percentage 18285 1726853399.34270: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.34272: done checking to see if all hosts have failed 18285 1726853399.34273: getting the remaining hosts for this loop 18285 1726853399.34273: done getting the remaining hosts for this loop 18285 1726853399.34275: getting the next task for host managed_node1 18285 1726853399.34277: done getting next task for host managed_node1 18285 1726853399.34279: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18285 1726853399.34280: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.34286: getting variables 18285 1726853399.34287: in VariableManager get_vars() 18285 1726853399.34295: Calling all_inventory to load vars for managed_node1 18285 1726853399.34296: Calling groups_inventory to load vars for managed_node1 18285 1726853399.34297: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.34303: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.34305: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.34306: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.34386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.34503: done with get_vars() 18285 1726853399.34509: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:29:59 -0400 (0:00:00.039) 0:00:05.281 ****** 18285 1726853399.34557: entering _queue_task() for managed_node1/include_tasks 18285 1726853399.34753: worker is 1 (out of 1 available) 18285 1726853399.34767: exiting _queue_task() for managed_node1/include_tasks 18285 1726853399.34778: done queuing things up, now waiting for results queue to drain 18285 1726853399.34780: waiting for pending results... 
18285 1726853399.34956: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18285 1726853399.35021: in run() - task 02083763-bbaf-9200-7ca6-00000000003c 18285 1726853399.35032: variable 'ansible_search_path' from source: unknown 18285 1726853399.35036: variable 'ansible_search_path' from source: unknown 18285 1726853399.35062: calling self._execute() 18285 1726853399.35126: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.35130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.35138: variable 'omit' from source: magic vars 18285 1726853399.35424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.37577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.37581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.37584: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.37586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.37589: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.37655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.37687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.37714: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853399.37752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.37768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.37885: variable 'ansible_distribution' from source: facts 18285 1726853399.37894: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.37914: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.37922: when evaluation is False, skipping this task 18285 1726853399.37927: _execute() done 18285 1726853399.37933: dumping result to json 18285 1726853399.37939: done dumping result, returning 18285 1726853399.37948: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-9200-7ca6-00000000003c] 18285 1726853399.37956: sending task result for task 02083763-bbaf-9200-7ca6-00000000003c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853399.38187: no more pending results, returning what we have 18285 1726853399.38191: results queue empty 18285 1726853399.38192: checking for any_errors_fatal 18285 1726853399.38193: done checking for any_errors_fatal 18285 1726853399.38194: checking for max_fail_percentage 18285 1726853399.38195: done checking for max_fail_percentage 18285 
1726853399.38196: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.38196: done checking to see if all hosts have failed 18285 1726853399.38197: getting the remaining hosts for this loop 18285 1726853399.38199: done getting the remaining hosts for this loop 18285 1726853399.38203: getting the next task for host managed_node1 18285 1726853399.38207: done getting next task for host managed_node1 18285 1726853399.38210: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18285 1726853399.38212: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853399.38224: getting variables 18285 1726853399.38227: in VariableManager get_vars() 18285 1726853399.38260: Calling all_inventory to load vars for managed_node1 18285 1726853399.38262: Calling groups_inventory to load vars for managed_node1 18285 1726853399.38264: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.38274: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.38276: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.38279: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.38426: done sending task result for task 02083763-bbaf-9200-7ca6-00000000003c 18285 1726853399.38429: WORKER PROCESS EXITING 18285 1726853399.38444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.38594: done with get_vars() 18285 1726853399.38600: done getting variables 18285 1726853399.38638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:29:59 -0400 (0:00:00.041) 0:00:05.323 ****** 18285 1726853399.38663: entering _queue_task() for managed_node1/debug 18285 1726853399.38855: worker is 1 (out of 1 available) 18285 1726853399.38869: exiting _queue_task() for managed_node1/debug 18285 1726853399.38880: done queuing things up, now waiting for results queue to drain 18285 1726853399.38882: waiting for pending results... 18285 1726853399.39032: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18285 1726853399.39089: in run() - task 02083763-bbaf-9200-7ca6-00000000003d 18285 1726853399.39102: variable 'ansible_search_path' from source: unknown 18285 1726853399.39106: variable 'ansible_search_path' from source: unknown 18285 1726853399.39132: calling self._execute() 18285 1726853399.39188: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.39192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.39200: variable 'omit' from source: magic vars 18285 1726853399.39488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.41058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.41111: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.41137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 
1726853399.41162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.41188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.41239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.41259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853399.41279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853399.41308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.41319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.41411: variable 'ansible_distribution' from source: facts 18285 1726853399.41415: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.41427: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.41430: when evaluation is False, skipping this task 18285 1726853399.41433: _execute() done 18285 1726853399.41435: dumping result to json 18285 1726853399.41440: done dumping result, returning 18285 1726853399.41447: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Print network provider [02083763-bbaf-9200-7ca6-00000000003d] 18285 1726853399.41453: sending task result for task 02083763-bbaf-9200-7ca6-00000000003d 18285 1726853399.41541: done sending task result for task 02083763-bbaf-9200-7ca6-00000000003d 18285 1726853399.41543: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853399.41590: no more pending results, returning what we have 18285 1726853399.41593: results queue empty 18285 1726853399.41594: checking for any_errors_fatal 18285 1726853399.41600: done checking for any_errors_fatal 18285 1726853399.41600: checking for max_fail_percentage 18285 1726853399.41602: done checking for max_fail_percentage 18285 1726853399.41603: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.41603: done checking to see if all hosts have failed 18285 1726853399.41604: getting the remaining hosts for this loop 18285 1726853399.41605: done getting the remaining hosts for this loop 18285 1726853399.41609: getting the next task for host managed_node1 18285 1726853399.41614: done getting next task for host managed_node1 18285 1726853399.41618: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18285 1726853399.41619: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853399.41631: getting variables 18285 1726853399.41632: in VariableManager get_vars() 18285 1726853399.41669: Calling all_inventory to load vars for managed_node1 18285 1726853399.41678: Calling groups_inventory to load vars for managed_node1 18285 1726853399.41681: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853399.41692: Calling all_plugins_play to load vars for managed_node1 18285 1726853399.41695: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853399.41697: Calling groups_plugins_play to load vars for managed_node1 18285 1726853399.41832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853399.41951: done with get_vars() 18285 1726853399.41958: done getting variables 18285 1726853399.41999: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:29:59 -0400 (0:00:00.033) 0:00:05.356 ****** 18285 1726853399.42022: entering _queue_task() for managed_node1/fail 18285 1726853399.42211: worker is 1 (out of 1 available) 18285 1726853399.42223: exiting _queue_task() for managed_node1/fail 18285 1726853399.42235: done queuing things up, now waiting for results queue to drain 18285 1726853399.42237: waiting for pending results... 
18285 1726853399.42500: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18285 1726853399.42578: in run() - task 02083763-bbaf-9200-7ca6-00000000003e 18285 1726853399.42582: variable 'ansible_search_path' from source: unknown 18285 1726853399.42584: variable 'ansible_search_path' from source: unknown 18285 1726853399.42586: calling self._execute() 18285 1726853399.42670: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853399.42689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853399.42711: variable 'omit' from source: magic vars 18285 1726853399.43179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853399.45929: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853399.46023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853399.46281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853399.46284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853399.46286: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853399.46414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853399.46450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18285 1726853399.46485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853399.46544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853399.46566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853399.46731: variable 'ansible_distribution' from source: facts 18285 1726853399.46744: variable 'ansible_distribution_major_version' from source: facts 18285 1726853399.46768: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853399.46780: when evaluation is False, skipping this task 18285 1726853399.46790: _execute() done 18285 1726853399.46826: dumping result to json 18285 1726853399.46835: done dumping result, returning 18285 1726853399.46839: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-9200-7ca6-00000000003e] 18285 1726853399.46842: sending task result for task 02083763-bbaf-9200-7ca6-00000000003e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853399.47107: no more pending results, returning what we have 18285 1726853399.47111: results queue empty 18285 1726853399.47112: checking for any_errors_fatal 18285 1726853399.47118: done checking for 
any_errors_fatal 18285 1726853399.47119: checking for max_fail_percentage 18285 1726853399.47121: done checking for max_fail_percentage 18285 1726853399.47122: checking to see if all hosts have failed and the running result is not ok 18285 1726853399.47123: done checking to see if all hosts have failed 18285 1726853399.47124: getting the remaining hosts for this loop 18285 1726853399.47125: done getting the remaining hosts for this loop 18285 1726853399.47129: getting the next task for host managed_node1 18285 1726853399.47135: done getting next task for host managed_node1 18285 1726853399.47138: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18285 1726853399.47140: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
18285 1726853399.47154: getting variables
18285 1726853399.47157: in VariableManager get_vars()
18285 1726853399.47198: Calling all_inventory to load vars for managed_node1
18285 1726853399.47201: Calling groups_inventory to load vars for managed_node1
18285 1726853399.47204: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853399.47217: Calling all_plugins_play to load vars for managed_node1
18285 1726853399.47220: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853399.47224: Calling groups_plugins_play to load vars for managed_node1
18285 1726853399.47838: done sending task result for task 02083763-bbaf-9200-7ca6-00000000003e
18285 1726853399.47842: WORKER PROCESS EXITING
18285 1726853399.47916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853399.48113: done with get_vars()
18285 1726853399.48123: done getting variables
18285 1726853399.48191: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:29:59 -0400 (0:00:00.062) 0:00:05.418 ******
18285 1726853399.48241: entering _queue_task() for managed_node1/fail
18285 1726853399.49154: worker is 1 (out of 1 available)
18285 1726853399.49166: exiting _queue_task() for managed_node1/fail
18285 1726853399.49177: done queuing things up, now waiting for results queue to drain
18285 1726853399.49178: waiting for pending results...
18285 1726853399.50070: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18285 1726853399.50076: in run() - task 02083763-bbaf-9200-7ca6-00000000003f
18285 1726853399.50078: variable 'ansible_search_path' from source: unknown
18285 1726853399.50081: variable 'ansible_search_path' from source: unknown
18285 1726853399.50083: calling self._execute()
18285 1726853399.50679: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853399.50683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853399.50686: variable 'omit' from source: magic vars
18285 1726853399.51491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853399.56252: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853399.56419: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853399.56500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853399.56607: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853399.56705: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853399.56890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853399.56925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853399.56954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853399.57047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853399.57093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853399.57440: variable 'ansible_distribution' from source: facts
18285 1726853399.57443: variable 'ansible_distribution_major_version' from source: facts
18285 1726853399.57446: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853399.57448: when evaluation is False, skipping this task
18285 1726853399.57450: _execute() done
18285 1726853399.57452: dumping result to json
18285 1726853399.57454: done dumping result, returning
18285 1726853399.57680: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-9200-7ca6-00000000003f]
18285 1726853399.57683: sending task result for task 02083763-bbaf-9200-7ca6-00000000003f
18285 1726853399.57762: done sending task result for task 02083763-bbaf-9200-7ca6-00000000003f
18285 1726853399.57765: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853399.57827: no more pending results, returning what we have
18285 1726853399.57831: results queue empty
18285 1726853399.57832: checking for any_errors_fatal
18285 1726853399.57839: done checking for any_errors_fatal
18285 1726853399.57840: checking for max_fail_percentage
18285 1726853399.57842: done checking for max_fail_percentage
18285 1726853399.57843: checking to see if all hosts have failed and the running result is not ok
18285 1726853399.57843: done checking to see if all hosts have failed
18285 1726853399.57844: getting the remaining hosts for this loop
18285 1726853399.57846: done getting the remaining hosts for this loop
18285 1726853399.57850: getting the next task for host managed_node1
18285 1726853399.57855: done getting next task for host managed_node1
18285 1726853399.57859: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18285 1726853399.57861: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853399.57876: getting variables
18285 1726853399.57879: in VariableManager get_vars()
18285 1726853399.57919: Calling all_inventory to load vars for managed_node1
18285 1726853399.57921: Calling groups_inventory to load vars for managed_node1
18285 1726853399.57924: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853399.57936: Calling all_plugins_play to load vars for managed_node1
18285 1726853399.57939: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853399.57941: Calling groups_plugins_play to load vars for managed_node1
18285 1726853399.58277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853399.59024: done with get_vars()
18285 1726853399.59037: done getting variables
18285 1726853399.59103: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:29:59 -0400 (0:00:00.109) 0:00:05.528 ******
18285 1726853399.59185: entering _queue_task() for managed_node1/fail
18285 1726853399.59718: worker is 1 (out of 1 available)
18285 1726853399.59731: exiting _queue_task() for managed_node1/fail
18285 1726853399.59742: done queuing things up, now waiting for results queue to drain
18285 1726853399.59743: waiting for pending results...
18285 1726853399.60040: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18285 1726853399.60178: in run() - task 02083763-bbaf-9200-7ca6-000000000040
18285 1726853399.60182: variable 'ansible_search_path' from source: unknown
18285 1726853399.60185: variable 'ansible_search_path' from source: unknown
18285 1726853399.60586: calling self._execute()
18285 1726853399.60590: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853399.60593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853399.60595: variable 'omit' from source: magic vars
18285 1726853399.61489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853399.67283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853399.67364: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853399.67538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853399.67732: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853399.67783: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853399.67990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853399.68025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853399.68141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853399.68213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853399.68235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853399.68636: variable 'ansible_distribution' from source: facts
18285 1726853399.68843: variable 'ansible_distribution_major_version' from source: facts
18285 1726853399.68847: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853399.68852: when evaluation is False, skipping this task
18285 1726853399.68855: _execute() done
18285 1726853399.68857: dumping result to json
18285 1726853399.68859: done dumping result, returning
18285 1726853399.68862: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-9200-7ca6-000000000040]
18285 1726853399.68864: sending task result for task 02083763-bbaf-9200-7ca6-000000000040
18285 1726853399.68939: done sending task result for task 02083763-bbaf-9200-7ca6-000000000040
18285 1726853399.68943: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853399.69108: no more pending results, returning what we have
18285 1726853399.69112: results queue empty
18285 1726853399.69113: checking for any_errors_fatal
18285 1726853399.69119: done checking for any_errors_fatal
18285 1726853399.69120: checking for max_fail_percentage
18285 1726853399.69122: done checking for max_fail_percentage
18285 1726853399.69123: checking to see if all hosts have failed and the running result is not ok
18285 1726853399.69124: done checking to see if all hosts have failed
18285 1726853399.69125: getting the remaining hosts for this loop
18285 1726853399.69126: done getting the remaining hosts for this loop
18285 1726853399.69131: getting the next task for host managed_node1
18285 1726853399.69138: done getting next task for host managed_node1
18285 1726853399.69142: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18285 1726853399.69145: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853399.69161: getting variables
18285 1726853399.69163: in VariableManager get_vars()
18285 1726853399.69206: Calling all_inventory to load vars for managed_node1
18285 1726853399.69209: Calling groups_inventory to load vars for managed_node1
18285 1726853399.69212: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853399.69224: Calling all_plugins_play to load vars for managed_node1
18285 1726853399.69227: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853399.69230: Calling groups_plugins_play to load vars for managed_node1
18285 1726853399.69666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853399.70275: done with get_vars()
18285 1726853399.70286: done getting variables
18285 1726853399.70334: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:29:59 -0400 (0:00:00.114) 0:00:05.642 ******
18285 1726853399.70591: entering _queue_task() for managed_node1/dnf
18285 1726853399.71059: worker is 1 (out of 1 available)
18285 1726853399.71474: exiting _queue_task() for managed_node1/dnf
18285 1726853399.71483: done queuing things up, now waiting for results queue to drain
18285 1726853399.71485: waiting for pending results...
18285 1726853399.71787: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18285 1726853399.71792: in run() - task 02083763-bbaf-9200-7ca6-000000000041
18285 1726853399.71795: variable 'ansible_search_path' from source: unknown
18285 1726853399.71797: variable 'ansible_search_path' from source: unknown
18285 1726853399.71799: calling self._execute()
18285 1726853399.72176: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853399.72180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853399.72183: variable 'omit' from source: magic vars
18285 1726853399.72859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853399.77268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853399.77438: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853399.77879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853399.77884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853399.77886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853399.77923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853399.78377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853399.78381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853399.78385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853399.78388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853399.78412: variable 'ansible_distribution' from source: facts
18285 1726853399.78421: variable 'ansible_distribution_major_version' from source: facts
18285 1726853399.78442: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853399.78876: when evaluation is False, skipping this task
18285 1726853399.78879: _execute() done
18285 1726853399.78882: dumping result to json
18285 1726853399.78884: done dumping result, returning
18285 1726853399.78887: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000041]
18285 1726853399.78889: sending task result for task 02083763-bbaf-9200-7ca6-000000000041
18285 1726853399.78964: done sending task result for task 02083763-bbaf-9200-7ca6-000000000041
18285 1726853399.78969: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853399.79017: no more pending results, returning what we have
18285 1726853399.79020: results queue empty
18285 1726853399.79021: checking for any_errors_fatal
18285 1726853399.79027: done checking for any_errors_fatal
18285 1726853399.79028: checking for max_fail_percentage
18285 1726853399.79029: done checking for max_fail_percentage
18285 1726853399.79030: checking to see if all hosts have failed and the running result is not ok
18285 1726853399.79030: done checking to see if all hosts have failed
18285 1726853399.79031: getting the remaining hosts for this loop
18285 1726853399.79033: done getting the remaining hosts for this loop
18285 1726853399.79037: getting the next task for host managed_node1
18285 1726853399.79041: done getting next task for host managed_node1
18285 1726853399.79044: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18285 1726853399.79046: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853399.79058: getting variables
18285 1726853399.79060: in VariableManager get_vars()
18285 1726853399.79094: Calling all_inventory to load vars for managed_node1
18285 1726853399.79096: Calling groups_inventory to load vars for managed_node1
18285 1726853399.79098: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853399.79107: Calling all_plugins_play to load vars for managed_node1
18285 1726853399.79109: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853399.79112: Calling groups_plugins_play to load vars for managed_node1
18285 1726853399.79380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853399.79788: done with get_vars()
18285 1726853399.79799: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18285 1726853399.79869: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:29:59 -0400 (0:00:00.095) 0:00:05.737 ******
18285 1726853399.80103: entering _queue_task() for managed_node1/yum
18285 1726853399.80570: worker is 1 (out of 1 available)
18285 1726853399.80587: exiting _queue_task() for managed_node1/yum
18285 1726853399.80601: done queuing things up, now waiting for results queue to drain
18285 1726853399.80603: waiting for pending results...
18285 1726853399.81232: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18285 1726853399.81396: in run() - task 02083763-bbaf-9200-7ca6-000000000042
18285 1726853399.81497: variable 'ansible_search_path' from source: unknown
18285 1726853399.81506: variable 'ansible_search_path' from source: unknown
18285 1726853399.81550: calling self._execute()
18285 1726853399.81785: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853399.81798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853399.81816: variable 'omit' from source: magic vars
18285 1726853399.82733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853399.86277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853399.86355: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853399.86425: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853399.86472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853399.86511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853399.86602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853399.86658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853399.86691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853399.86744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853399.86765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853399.86929: variable 'ansible_distribution' from source: facts
18285 1726853399.87056: variable 'ansible_distribution_major_version' from source: facts
18285 1726853399.87065: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853399.87068: when evaluation is False, skipping this task
18285 1726853399.87070: _execute() done
18285 1726853399.87074: dumping result to json
18285 1726853399.87076: done dumping result, returning
18285 1726853399.87079: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000042]
18285 1726853399.87081: sending task result for task 02083763-bbaf-9200-7ca6-000000000042
18285 1726853399.87154: done sending task result for task 02083763-bbaf-9200-7ca6-000000000042
18285 1726853399.87268: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853399.87320: no more pending results, returning what we have
18285 1726853399.87325: results queue empty
18285 1726853399.87326: checking for any_errors_fatal
18285 1726853399.87332: done checking for any_errors_fatal
18285 1726853399.87333: checking for max_fail_percentage
18285 1726853399.87335: done checking for max_fail_percentage
18285 1726853399.87336: checking to see if all hosts have failed and the running result is not ok
18285 1726853399.87336: done checking to see if all hosts have failed
18285 1726853399.87337: getting the remaining hosts for this loop
18285 1726853399.87339: done getting the remaining hosts for this loop
18285 1726853399.87343: getting the next task for host managed_node1
18285 1726853399.87349: done getting next task for host managed_node1
18285 1726853399.87353: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
18285 1726853399.87355: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853399.87369: getting variables
18285 1726853399.87372: in VariableManager get_vars()
18285 1726853399.87411: Calling all_inventory to load vars for managed_node1
18285 1726853399.87414: Calling groups_inventory to load vars for managed_node1
18285 1726853399.87417: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853399.87430: Calling all_plugins_play to load vars for managed_node1
18285 1726853399.87434: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853399.87437: Calling groups_plugins_play to load vars for managed_node1
18285 1726853399.88019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853399.88402: done with get_vars()
18285 1726853399.88412: done getting variables
18285 1726853399.88467: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:29:59 -0400 (0:00:00.088) 0:00:05.826 ******
18285 1726853399.88998: entering _queue_task() for managed_node1/fail
18285 1726853399.89267: worker is 1 (out of 1 available)
18285 1726853399.89281: exiting _queue_task() for managed_node1/fail
18285 1726853399.89291: done queuing things up, now waiting for results queue to drain
18285 1726853399.89293: waiting for pending results...
18285 1726853399.90009: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
18285 1726853399.90056: in run() - task 02083763-bbaf-9200-7ca6-000000000043
18285 1726853399.90110: variable 'ansible_search_path' from source: unknown
18285 1726853399.90313: variable 'ansible_search_path' from source: unknown
18285 1726853399.90317: calling self._execute()
18285 1726853399.90399: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853399.90412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853399.90678: variable 'omit' from source: magic vars
18285 1726853399.91150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853399.95918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853399.96065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853399.96112: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853399.96150: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853399.96229: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853399.96389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853399.96450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853399.96554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853399.96603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853399.96653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853399.96972: variable 'ansible_distribution' from source: facts
18285 1726853399.96984: variable 'ansible_distribution_major_version' from source: facts
18285 1726853399.97004: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853399.97038: when evaluation is False, skipping this task
18285 1726853399.97078: _execute() done
18285 1726853399.97086: dumping result to json
18285 1726853399.97095: done dumping result, returning
18285 1726853399.97106: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000043]
18285 1726853399.97183: sending task result for task 02083763-bbaf-9200-7ca6-000000000043
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853399.97431: no more pending results, returning what we have
18285 1726853399.97437: results queue empty
18285 1726853399.97439: checking for any_errors_fatal
18285 1726853399.97447: done checking for any_errors_fatal
18285 1726853399.97448: checking for max_fail_percentage
18285 1726853399.97450: done checking for max_fail_percentage
18285 1726853399.97451: checking to see if all hosts have failed and the running result is not ok
18285 1726853399.97451: done checking to see if all hosts have failed
18285 1726853399.97452: getting the remaining hosts for this loop
18285 1726853399.97453: done getting the remaining hosts for this loop
18285 1726853399.97458: getting the next task for host managed_node1
18285 1726853399.97463: done getting next task for host managed_node1
18285 1726853399.97467: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
18285 1726853399.97469: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853399.97483: getting variables
18285 1726853399.97485: in VariableManager get_vars()
18285 1726853399.97528: Calling all_inventory to load vars for managed_node1
18285 1726853399.97530: Calling groups_inventory to load vars for managed_node1
18285 1726853399.97533: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853399.97545: Calling all_plugins_play to load vars for managed_node1
18285 1726853399.97548: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853399.97551: Calling groups_plugins_play to load vars for managed_node1
18285 1726853399.97790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853399.98176: done with get_vars()
18285 1726853399.98187: done getting variables
18285 1726853399.98227: done sending task result for task 02083763-bbaf-9200-7ca6-000000000043
18285 1726853399.98230: WORKER PROCESS EXITING
18285 1726853399.98263: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:29:59 -0400 (0:00:00.092) 0:00:05.919 ******
18285 1726853399.98294: entering _queue_task() for managed_node1/package
18285 1726853399.98555: worker is 1 (out of 1 available)
18285 1726853399.98567: exiting _queue_task() for managed_node1/package
18285 1726853399.98579: done queuing things up, now waiting for results queue to drain
18285 1726853399.98580: waiting for pending results...
18285 1726853399.98867: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
18285 1726853399.99005: in run() - task 02083763-bbaf-9200-7ca6-000000000044
18285 1726853399.99096: variable 'ansible_search_path' from source: unknown
18285 1726853399.99109: variable 'ansible_search_path' from source: unknown
18285 1726853399.99149: calling self._execute()
18285 1726853399.99245: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853399.99257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853399.99270: variable 'omit' from source: magic vars
18285 1726853399.99683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.02865: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.02942: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.03043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.03047: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.03069: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.03158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.03193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.03223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.03272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.03368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.03424: variable 'ansible_distribution' from source: facts
18285 1726853400.03435: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.03456: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.03462: when evaluation is False, skipping this task
18285 1726853400.03467: _execute() done
18285 1726853400.03478: dumping result to json
18285 1726853400.03484: done dumping result, returning
18285
1726853400.03495: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-9200-7ca6-000000000044] 18285 1726853400.03504: sending task result for task 02083763-bbaf-9200-7ca6-000000000044 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.03803: no more pending results, returning what we have 18285 1726853400.03807: results queue empty 18285 1726853400.03808: checking for any_errors_fatal 18285 1726853400.03814: done checking for any_errors_fatal 18285 1726853400.03815: checking for max_fail_percentage 18285 1726853400.03817: done checking for max_fail_percentage 18285 1726853400.03818: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.03819: done checking to see if all hosts have failed 18285 1726853400.03820: getting the remaining hosts for this loop 18285 1726853400.03822: done getting the remaining hosts for this loop 18285 1726853400.03826: getting the next task for host managed_node1 18285 1726853400.03833: done getting next task for host managed_node1 18285 1726853400.03837: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18285 1726853400.03839: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.03853: getting variables 18285 1726853400.03855: in VariableManager get_vars() 18285 1726853400.03897: Calling all_inventory to load vars for managed_node1 18285 1726853400.03900: Calling groups_inventory to load vars for managed_node1 18285 1726853400.03903: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.03917: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.03920: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.03923: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.04738: done sending task result for task 02083763-bbaf-9200-7ca6-000000000044 18285 1726853400.04742: WORKER PROCESS EXITING 18285 1726853400.04760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.05267: done with get_vars() 18285 1726853400.05280: done getting variables 18285 1726853400.05336: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:30:00 -0400 (0:00:00.070) 0:00:05.990 ****** 18285 1726853400.05367: entering _queue_task() for managed_node1/package 18285 1726853400.05919: worker is 1 (out of 1 available) 18285 1726853400.05931: exiting _queue_task() for managed_node1/package 18285 1726853400.05942: done queuing things up, now waiting for results queue to drain 18285 1726853400.05944: waiting for pending results... 
18285 1726853400.06445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18285 1726853400.06645: in run() - task 02083763-bbaf-9200-7ca6-000000000045 18285 1726853400.06662: variable 'ansible_search_path' from source: unknown 18285 1726853400.06666: variable 'ansible_search_path' from source: unknown 18285 1726853400.06735: calling self._execute() 18285 1726853400.06928: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.06935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.06946: variable 'omit' from source: magic vars 18285 1726853400.07920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.13181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.13211: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.13256: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.13516: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.13520: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.13563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.13660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.13764: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.13811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.14078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.14208: variable 'ansible_distribution' from source: facts 18285 1726853400.14220: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.14245: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.14257: when evaluation is False, skipping this task 18285 1726853400.14266: _execute() done 18285 1726853400.14276: dumping result to json 18285 1726853400.14285: done dumping result, returning 18285 1726853400.14303: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-9200-7ca6-000000000045] 18285 1726853400.14402: sending task result for task 02083763-bbaf-9200-7ca6-000000000045 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.14548: no more pending results, returning what we have 18285 1726853400.14551: results queue empty 18285 1726853400.14552: checking for any_errors_fatal 18285 1726853400.14558: done checking for any_errors_fatal 18285 1726853400.14559: checking for max_fail_percentage 18285 
1726853400.14560: done checking for max_fail_percentage 18285 1726853400.14561: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.14562: done checking to see if all hosts have failed 18285 1726853400.14563: getting the remaining hosts for this loop 18285 1726853400.14564: done getting the remaining hosts for this loop 18285 1726853400.14568: getting the next task for host managed_node1 18285 1726853400.14575: done getting next task for host managed_node1 18285 1726853400.14578: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18285 1726853400.14581: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853400.14594: getting variables 18285 1726853400.14596: in VariableManager get_vars() 18285 1726853400.14634: Calling all_inventory to load vars for managed_node1 18285 1726853400.14637: Calling groups_inventory to load vars for managed_node1 18285 1726853400.14639: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.14652: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.14655: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.14658: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.15094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.15481: done with get_vars() 18285 1726853400.15492: done getting variables 18285 1726853400.15519: done sending task result for task 02083763-bbaf-9200-7ca6-000000000045 18285 1726853400.15521: WORKER PROCESS EXITING 18285 1726853400.15548: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:30:00 -0400 (0:00:00.102) 0:00:06.092 ****** 18285 1726853400.15779: entering _queue_task() for managed_node1/package 18285 1726853400.16111: worker is 1 (out of 1 available) 18285 1726853400.16126: exiting _queue_task() for managed_node1/package 18285 1726853400.16138: done queuing things up, now waiting for results queue to drain 18285 1726853400.16140: waiting for pending results... 18285 1726853400.16740: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18285 1726853400.16869: in run() - task 02083763-bbaf-9200-7ca6-000000000046 18285 1726853400.17021: variable 'ansible_search_path' from source: unknown 18285 1726853400.17038: variable 'ansible_search_path' from source: unknown 18285 1726853400.17081: calling self._execute() 18285 1726853400.17232: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.17245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.17305: variable 'omit' from source: magic vars 18285 1726853400.18261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.21805: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.21858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
18285 1726853400.21897: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.21924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.21945: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.22004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.22024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.22041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.22074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.22085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.22175: variable 'ansible_distribution' from source: facts 18285 1726853400.22181: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.22197: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.22200: when evaluation is False, skipping this task 18285 1726853400.22202: _execute() done 18285 1726853400.22205: dumping result to json 
18285 1726853400.22209: done dumping result, returning 18285 1726853400.22216: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-9200-7ca6-000000000046] 18285 1726853400.22221: sending task result for task 02083763-bbaf-9200-7ca6-000000000046 18285 1726853400.22313: done sending task result for task 02083763-bbaf-9200-7ca6-000000000046 18285 1726853400.22315: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.22359: no more pending results, returning what we have 18285 1726853400.22362: results queue empty 18285 1726853400.22363: checking for any_errors_fatal 18285 1726853400.22369: done checking for any_errors_fatal 18285 1726853400.22370: checking for max_fail_percentage 18285 1726853400.22373: done checking for max_fail_percentage 18285 1726853400.22374: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.22374: done checking to see if all hosts have failed 18285 1726853400.22375: getting the remaining hosts for this loop 18285 1726853400.22376: done getting the remaining hosts for this loop 18285 1726853400.22380: getting the next task for host managed_node1 18285 1726853400.22384: done getting next task for host managed_node1 18285 1726853400.22388: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18285 1726853400.22390: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.22401: getting variables 18285 1726853400.22403: in VariableManager get_vars() 18285 1726853400.22442: Calling all_inventory to load vars for managed_node1 18285 1726853400.22445: Calling groups_inventory to load vars for managed_node1 18285 1726853400.22447: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.22459: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.22462: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.22464: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.22820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.23012: done with get_vars() 18285 1726853400.23023: done getting variables 18285 1726853400.23080: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:30:00 -0400 (0:00:00.075) 0:00:06.167 ****** 18285 1726853400.23108: entering _queue_task() for managed_node1/service 18285 1726853400.23363: worker is 1 (out of 1 available) 18285 1726853400.23555: exiting _queue_task() for managed_node1/service 18285 1726853400.23566: done queuing things up, now waiting for results queue to drain 18285 1726853400.23568: waiting for pending results... 
18285 1726853400.23757: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18285 1726853400.23977: in run() - task 02083763-bbaf-9200-7ca6-000000000047 18285 1726853400.23981: variable 'ansible_search_path' from source: unknown 18285 1726853400.23983: variable 'ansible_search_path' from source: unknown 18285 1726853400.23986: calling self._execute() 18285 1726853400.23990: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.23993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.23995: variable 'omit' from source: magic vars 18285 1726853400.24388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.26643: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.26668: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.26711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.26757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.26790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.26880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.26913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.26937: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.26970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.26984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.27085: variable 'ansible_distribution' from source: facts 18285 1726853400.27089: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.27106: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.27110: when evaluation is False, skipping this task 18285 1726853400.27112: _execute() done 18285 1726853400.27115: dumping result to json 18285 1726853400.27118: done dumping result, returning 18285 1726853400.27126: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000047] 18285 1726853400.27130: sending task result for task 02083763-bbaf-9200-7ca6-000000000047 18285 1726853400.27214: done sending task result for task 02083763-bbaf-9200-7ca6-000000000047 18285 1726853400.27217: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.27264: no more pending results, returning what we have 18285 1726853400.27267: results queue empty 18285 1726853400.27268: checking for any_errors_fatal 18285 
1726853400.27276: done checking for any_errors_fatal 18285 1726853400.27277: checking for max_fail_percentage 18285 1726853400.27279: done checking for max_fail_percentage 18285 1726853400.27280: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.27280: done checking to see if all hosts have failed 18285 1726853400.27281: getting the remaining hosts for this loop 18285 1726853400.27282: done getting the remaining hosts for this loop 18285 1726853400.27286: getting the next task for host managed_node1 18285 1726853400.27291: done getting next task for host managed_node1 18285 1726853400.27294: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18285 1726853400.27296: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.27307: getting variables 18285 1726853400.27308: in VariableManager get_vars() 18285 1726853400.27343: Calling all_inventory to load vars for managed_node1 18285 1726853400.27346: Calling groups_inventory to load vars for managed_node1 18285 1726853400.27348: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.27357: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.27359: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.27361: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.27493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.27649: done with get_vars() 18285 1726853400.27660: done getting variables 18285 1726853400.27718: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:30:00 -0400 (0:00:00.046) 0:00:06.213 ****** 18285 1726853400.27745: entering _queue_task() for managed_node1/service 18285 1726853400.27994: worker is 1 (out of 1 available) 18285 1726853400.28008: exiting _queue_task() for managed_node1/service 18285 1726853400.28020: done queuing things up, now waiting for results queue to drain 18285 1726853400.28021: waiting for pending results... 
18285 1726853400.28388: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
18285 1726853400.28393: in run() - task 02083763-bbaf-9200-7ca6-000000000048
18285 1726853400.28398: variable 'ansible_search_path' from source: unknown
18285 1726853400.28405: variable 'ansible_search_path' from source: unknown
18285 1726853400.28445: calling self._execute()
18285 1726853400.28533: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.28546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.28565: variable 'omit' from source: magic vars
18285 1726853400.29017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.31080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.31121: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.31158: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.31186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.31205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.31265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.31286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.31303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.31329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.31339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.31432: variable 'ansible_distribution' from source: facts
18285 1726853400.31435: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.31452: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.31456: when evaluation is False, skipping this task
18285 1726853400.31458: _execute() done
18285 1726853400.31462: dumping result to json
18285 1726853400.31464: done dumping result, returning
18285 1726853400.31469: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-9200-7ca6-000000000048]
18285 1726853400.31480: sending task result for task 02083763-bbaf-9200-7ca6-000000000048
18285 1726853400.31558: done sending task result for task 02083763-bbaf-9200-7ca6-000000000048
18285 1726853400.31560: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
18285 1726853400.31616: no more pending results, returning what we have
18285 1726853400.31619: results queue empty
18285 1726853400.31620: checking for any_errors_fatal
18285 1726853400.31625: done checking for any_errors_fatal
18285 1726853400.31626: checking for max_fail_percentage
18285 1726853400.31628: done checking for max_fail_percentage
18285 1726853400.31629: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.31629: done checking to see if all hosts have failed
18285 1726853400.31630: getting the remaining hosts for this loop
18285 1726853400.31631: done getting the remaining hosts for this loop
18285 1726853400.31635: getting the next task for host managed_node1
18285 1726853400.31640: done getting next task for host managed_node1
18285 1726853400.31643: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
18285 1726853400.31645: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.31659: getting variables
18285 1726853400.31660: in VariableManager get_vars()
18285 1726853400.31693: Calling all_inventory to load vars for managed_node1
18285 1726853400.31695: Calling groups_inventory to load vars for managed_node1
18285 1726853400.31697: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.31705: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.31707: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.31710: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.31874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.31986: done with get_vars()
18285 1726853400.31994: done getting variables
18285 1726853400.32031: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 13:30:00 -0400 (0:00:00.043) 0:00:06.256 ******
18285 1726853400.32053: entering _queue_task() for managed_node1/service
18285 1726853400.32236: worker is 1 (out of 1 available)
18285 1726853400.32254: exiting _queue_task() for managed_node1/service
18285 1726853400.32267: done queuing things up, now waiting for results queue to drain
18285 1726853400.32268: waiting for pending results...
18285 1726853400.32420: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
18285 1726853400.32476: in run() - task 02083763-bbaf-9200-7ca6-000000000049
18285 1726853400.32487: variable 'ansible_search_path' from source: unknown
18285 1726853400.32493: variable 'ansible_search_path' from source: unknown
18285 1726853400.32524: calling self._execute()
18285 1726853400.32777: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.32781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.32784: variable 'omit' from source: magic vars
18285 1726853400.33056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.34777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.34820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.34849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.34878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.34899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.34961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.34989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.35007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.35034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.35045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.35139: variable 'ansible_distribution' from source: facts
18285 1726853400.35143: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.35159: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.35162: when evaluation is False, skipping this task
18285 1726853400.35166: _execute() done
18285 1726853400.35168: dumping result to json
18285 1726853400.35172: done dumping result, returning
18285 1726853400.35183: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-9200-7ca6-000000000049]
18285 1726853400.35185: sending task result for task 02083763-bbaf-9200-7ca6-000000000049
18285 1726853400.35264: done sending task result for task 02083763-bbaf-9200-7ca6-000000000049
18285 1726853400.35266: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853400.35328: no more pending results, returning what we have
18285 1726853400.35331: results queue empty
18285 1726853400.35332: checking for any_errors_fatal
18285 1726853400.35338: done checking for any_errors_fatal
18285 1726853400.35339: checking for max_fail_percentage
18285 1726853400.35340: done checking for max_fail_percentage
18285 1726853400.35341: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.35342: done checking to see if all hosts have failed
18285 1726853400.35343: getting the remaining hosts for this loop
18285 1726853400.35344: done getting the remaining hosts for this loop
18285 1726853400.35347: getting the next task for host managed_node1
18285 1726853400.35354: done getting next task for host managed_node1
18285 1726853400.35358: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
18285 1726853400.35359: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.35373: getting variables
18285 1726853400.35374: in VariableManager get_vars()
18285 1726853400.35404: Calling all_inventory to load vars for managed_node1
18285 1726853400.35406: Calling groups_inventory to load vars for managed_node1
18285 1726853400.35408: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.35416: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.35418: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.35421: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.35538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.35657: done with get_vars()
18285 1726853400.35663: done getting variables
18285 1726853400.35703: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 13:30:00 -0400 (0:00:00.036) 0:00:06.293 ******
18285 1726853400.35723: entering _queue_task() for managed_node1/service
18285 1726853400.35906: worker is 1 (out of 1 available)
18285 1726853400.35921: exiting _queue_task() for managed_node1/service
18285 1726853400.35932: done queuing things up, now waiting for results queue to drain
18285 1726853400.35933: waiting for pending results...
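Every task skipped in this run fails the same `when:` guard quoted verbatim in the log above. As a minimal sketch of how that Jinja2 conditional reduces to Python (the function name is hypothetical; the log does not reveal which distribution the managed node actually reported, only that the guard evaluated False):

```python
def legacy_network_guard(ansible_distribution: str,
                         ansible_distribution_major_version: str) -> bool:
    """Python equivalent of the role's guard:
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)
    The `| int` Jinja2 filter corresponds to int() here."""
    return (ansible_distribution in ['CentOS', 'RedHat']
            and int(ansible_distribution_major_version) < 9)

print(legacy_network_guard('RedHat', '8'))   # True  -> task would run
print(legacy_network_guard('CentOS', '9'))   # False -> task skipped
print(legacy_network_guard('Fedora', '40'))  # False -> task skipped
```

Because gathered facts make the guard False here, ansible-core short-circuits `_execute()` and emits the "Conditional result was False" skip result without ever contacting the host.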
18285 1726853400.36085: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
18285 1726853400.36141: in run() - task 02083763-bbaf-9200-7ca6-00000000004a
18285 1726853400.36156: variable 'ansible_search_path' from source: unknown
18285 1726853400.36161: variable 'ansible_search_path' from source: unknown
18285 1726853400.36187: calling self._execute()
18285 1726853400.36242: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.36246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.36255: variable 'omit' from source: magic vars
18285 1726853400.36544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.37981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.38026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.38061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.38089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.38109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.38165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.38187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.38203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.38230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.38243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.38330: variable 'ansible_distribution' from source: facts
18285 1726853400.38334: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.38353: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.38356: when evaluation is False, skipping this task
18285 1726853400.38358: _execute() done
18285 1726853400.38361: dumping result to json
18285 1726853400.38363: done dumping result, returning
18285 1726853400.38369: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-9200-7ca6-00000000004a]
18285 1726853400.38375: sending task result for task 02083763-bbaf-9200-7ca6-00000000004a
18285 1726853400.38454: done sending task result for task 02083763-bbaf-9200-7ca6-00000000004a
18285 1726853400.38457: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
18285 1726853400.38498: no more pending results, returning what we have
18285 1726853400.38501: results queue empty
18285 1726853400.38502: checking for any_errors_fatal
18285 1726853400.38507: done checking for any_errors_fatal
18285 1726853400.38507: checking for max_fail_percentage
18285 1726853400.38509: done checking for max_fail_percentage
18285 1726853400.38510: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.38510: done checking to see if all hosts have failed
18285 1726853400.38511: getting the remaining hosts for this loop
18285 1726853400.38512: done getting the remaining hosts for this loop
18285 1726853400.38516: getting the next task for host managed_node1
18285 1726853400.38520: done getting next task for host managed_node1
18285 1726853400.38523: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
18285 1726853400.38525: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.38537: getting variables
18285 1726853400.38538: in VariableManager get_vars()
18285 1726853400.38577: Calling all_inventory to load vars for managed_node1
18285 1726853400.38580: Calling groups_inventory to load vars for managed_node1
18285 1726853400.38582: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.38591: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.38593: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.38595: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.38747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.38862: done with get_vars()
18285 1726853400.38869: done getting variables
18285 1726853400.38910: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 13:30:00 -0400 (0:00:00.032) 0:00:06.325 ******
18285 1726853400.38929: entering _queue_task() for managed_node1/copy
18285 1726853400.39108: worker is 1 (out of 1 available)
18285 1726853400.39121: exiting _queue_task() for managed_node1/copy
18285 1726853400.39132: done queuing things up, now waiting for results queue to drain
18285 1726853400.39133: waiting for pending results...
18285 1726853400.39287: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
18285 1726853400.39338: in run() - task 02083763-bbaf-9200-7ca6-00000000004b
18285 1726853400.39351: variable 'ansible_search_path' from source: unknown
18285 1726853400.39357: variable 'ansible_search_path' from source: unknown
18285 1726853400.39388: calling self._execute()
18285 1726853400.39444: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.39448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.39459: variable 'omit' from source: magic vars
18285 1726853400.39766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.41231: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.41276: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.41303: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.41331: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.41351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.41411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.41435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.41453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.41482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.41492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.41587: variable 'ansible_distribution' from source: facts
18285 1726853400.41590: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.41606: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.41609: when evaluation is False, skipping this task
18285 1726853400.41611: _execute() done
18285 1726853400.41614: dumping result to json
18285 1726853400.41617: done dumping result, returning
18285 1726853400.41625: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-9200-7ca6-00000000004b]
18285 1726853400.41630: sending task result for task 02083763-bbaf-9200-7ca6-00000000004b
18285 1726853400.41721: done sending task result for task 02083763-bbaf-9200-7ca6-00000000004b
18285 1726853400.41723: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853400.41797: no more pending results, returning what we have
18285 1726853400.41801: results queue empty
18285 1726853400.41802: checking for any_errors_fatal
18285 1726853400.41808: done checking for any_errors_fatal
18285 1726853400.41809: checking for max_fail_percentage
18285 1726853400.41810: done checking for max_fail_percentage
18285 1726853400.41811: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.41812: done checking to see if all hosts have failed
18285 1726853400.41813: getting the remaining hosts for this loop
18285 1726853400.41814: done getting the remaining hosts for this loop
18285 1726853400.41817: getting the next task for host managed_node1
18285 1726853400.41822: done getting next task for host managed_node1
18285 1726853400.41825: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
18285 1726853400.41827: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.41839: getting variables
18285 1726853400.41840: in VariableManager get_vars()
18285 1726853400.41873: Calling all_inventory to load vars for managed_node1
18285 1726853400.41876: Calling groups_inventory to load vars for managed_node1
18285 1726853400.41878: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.41886: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.41888: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.41891: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.42019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.42138: done with get_vars()
18285 1726853400.42145: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 13:30:00 -0400 (0:00:00.032) 0:00:06.358 ******
18285 1726853400.42205: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
18285 1726853400.42402: worker is 1 (out of 1 available)
18285 1726853400.42415: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
18285 1726853400.42426: done queuing things up, now waiting for results queue to drain
18285 1726853400.42428: waiting for pending results...
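Two of the skip results above carry a "censored" key instead of the usual "false_condition"/"skip_reason" pair: those tasks set `no_log: true`, so ansible-core replaces the result body before any callback can print it. A toy sketch of that substitution (this is not ansible-core's actual censoring code, and `censor_result` is a hypothetical name; only the censored message text is taken verbatim from the log):

```python
# Message ansible-core substitutes for a no_log result, as seen in the log above.
CENSORED = ("the output has been hidden due to the fact that "
            "'no_log: true' was specified for this result")

def censor_result(result: dict, no_log: bool) -> dict:
    """Toy model of no_log censoring: drop everything except the
    status flags a callback still needs, and add the censored notice."""
    if not no_log:
        return result
    censored = {'censored': CENSORED}
    for key in ('changed', 'failed', 'skipped'):
        if key in result:
            censored[key] = result[key]
    return censored

print(censor_result({'changed': False, 'secret': 'hunter2'}, no_log=True))
```

This is why the "Enable and start NetworkManager" and "Enable network service" skips show only `"censored"` and `"changed"`, while the uncensored skips also expose the failing condition.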
18285 1726853400.42586: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
18285 1726853400.42650: in run() - task 02083763-bbaf-9200-7ca6-00000000004c
18285 1726853400.42665: variable 'ansible_search_path' from source: unknown
18285 1726853400.42668: variable 'ansible_search_path' from source: unknown
18285 1726853400.42697: calling self._execute()
18285 1726853400.42758: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.42763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.42777: variable 'omit' from source: magic vars
18285 1726853400.43070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.44596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.44641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.44681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.44707: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.44728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.44790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.44810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.44827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.44859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.44870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.44968: variable 'ansible_distribution' from source: facts
18285 1726853400.44974: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.44990: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.44993: when evaluation is False, skipping this task
18285 1726853400.44996: _execute() done
18285 1726853400.44998: dumping result to json
18285 1726853400.45002: done dumping result, returning
18285 1726853400.45010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-9200-7ca6-00000000004c]
18285 1726853400.45015: sending task result for task 02083763-bbaf-9200-7ca6-00000000004c
18285 1726853400.45111: done sending task result for task 02083763-bbaf-9200-7ca6-00000000004c
18285 1726853400.45114: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853400.45163: no more pending results, returning what we have
18285 1726853400.45166: results queue empty
18285 1726853400.45167: checking for any_errors_fatal
18285 1726853400.45174: done checking for any_errors_fatal
18285 1726853400.45174: checking for max_fail_percentage
18285 1726853400.45176: done checking for max_fail_percentage
18285 1726853400.45177: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.45178: done checking to see if all hosts have failed
18285 1726853400.45178: getting the remaining hosts for this loop
18285 1726853400.45180: done getting the remaining hosts for this loop
18285 1726853400.45184: getting the next task for host managed_node1
18285 1726853400.45189: done getting next task for host managed_node1
18285 1726853400.45193: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
18285 1726853400.45195: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.45206: getting variables
18285 1726853400.45208: in VariableManager get_vars()
18285 1726853400.45245: Calling all_inventory to load vars for managed_node1
18285 1726853400.45247: Calling groups_inventory to load vars for managed_node1
18285 1726853400.45250: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.45260: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.45263: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.45265: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.45482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.45669: done with get_vars()
18285 1726853400.45696: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 13:30:00 -0400 (0:00:00.035) 0:00:06.394 ******
18285 1726853400.45783: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
18285 1726853400.46113: worker is 1 (out of 1 available)
18285 1726853400.46283: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
18285 1726853400.46295: done queuing things up, now waiting for results queue to drain
18285 1726853400.46297: waiting for pending results...
18285 1726853400.46445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
18285 1726853400.46482: in run() - task 02083763-bbaf-9200-7ca6-00000000004d
18285 1726853400.46515: variable 'ansible_search_path' from source: unknown
18285 1726853400.46524: variable 'ansible_search_path' from source: unknown
18285 1726853400.46545: calling self._execute()
18285 1726853400.46612: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.46616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.46624: variable 'omit' from source: magic vars
18285 1726853400.46937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.48456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.48676: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.48681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.48683: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.48686: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.48708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.48741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.48773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.48812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.48832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.48981: variable 'ansible_distribution' from source: facts
18285 1726853400.48991: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.49011: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.49016: when evaluation is False, skipping this task
18285 1726853400.49021: _execute() done
18285 1726853400.49028: dumping result to json
18285 1726853400.49035: done dumping result, returning
18285 1726853400.49045: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-9200-7ca6-00000000004d]
18285 1726853400.49052: sending task result for task 02083763-bbaf-9200-7ca6-00000000004d
18285 1726853400.49146: done sending task result for task 02083763-bbaf-9200-7ca6-00000000004d
18285 1726853400.49152: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853400.49232: no more pending results, returning what we have
18285 1726853400.49235: results queue empty
18285 1726853400.49236: checking for any_errors_fatal
18285 1726853400.49243: done checking for any_errors_fatal
18285 1726853400.49243: checking for max_fail_percentage
18285 1726853400.49245: done checking for max_fail_percentage
18285 1726853400.49246: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.49247: done checking to see if all hosts have failed
18285 1726853400.49247: getting the remaining hosts for this loop
18285 1726853400.49251: done getting the remaining hosts for this loop
18285 1726853400.49255: getting the next task for host managed_node1
18285 1726853400.49260: done getting next task for host managed_node1
18285 1726853400.49264: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
18285 1726853400.49266: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.49279: getting variables
18285 1726853400.49281: in VariableManager get_vars()
18285 1726853400.49313: Calling all_inventory to load vars for managed_node1
18285 1726853400.49316: Calling groups_inventory to load vars for managed_node1
18285 1726853400.49318: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.49326: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.49328: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.49330: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.49756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.49975: done with get_vars()
18285 1726853400.49986: done getting variables
18285 1726853400.50053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 13:30:00 -0400 (0:00:00.043) 0:00:06.437 ******
18285 1726853400.50085: entering _queue_task() for managed_node1/debug
18285 1726853400.50494: worker is 1 (out of 1 available)
18285 1726853400.50505: exiting _queue_task() for managed_node1/debug
18285 1726853400.50515: done queuing things up, now waiting for results queue to drain
18285 1726853400.50517: waiting for pending results...
18285 1726853400.50709: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
18285 1726853400.50901: in run() - task 02083763-bbaf-9200-7ca6-00000000004e
18285 1726853400.50905: variable 'ansible_search_path' from source: unknown
18285 1726853400.50907: variable 'ansible_search_path' from source: unknown
18285 1726853400.50911: calling self._execute()
18285 1726853400.51064: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.51081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.51098: variable 'omit' from source: magic vars
18285 1726853400.51575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.54012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.54170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.54177: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.54199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.54231: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.54325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.54366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.54408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.54455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.54716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.55181: variable 'ansible_distribution' from source: facts
18285 1726853400.55186: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.55189: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.55191: when evaluation is False, skipping this task
18285 1726853400.55193: _execute() done
18285 1726853400.55195: dumping result to json
18285 1726853400.55198: done dumping result, returning
18285 1726853400.55200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-9200-7ca6-00000000004e]
18285 1726853400.55202: sending task result for task 02083763-bbaf-9200-7ca6-00000000004e
18285 1726853400.55280: done sending task result for task 02083763-bbaf-9200-7ca6-00000000004e
18285 1726853400.55283: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
18285 1726853400.55333: no more pending results, returning what we have
18285 1726853400.55338: results queue empty
18285 1726853400.55339: checking for any_errors_fatal
18285 1726853400.55346: done checking for any_errors_fatal
18285 1726853400.55347: checking for max_fail_percentage
18285 1726853400.55351: done checking for max_fail_percentage
18285 1726853400.55353: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.55353: done checking to see if all hosts have failed
18285 1726853400.55354: getting the remaining hosts for this loop
18285 1726853400.55356: done getting the remaining hosts for this loop
18285 1726853400.55360: getting the next task for host managed_node1
18285 1726853400.55366: done getting next task for host managed_node1
18285 1726853400.55370: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
18285 1726853400.55374: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.55388: getting variables
18285 1726853400.55390: in VariableManager get_vars()
18285 1726853400.55431: Calling all_inventory to load vars for managed_node1
18285 1726853400.55433: Calling groups_inventory to load vars for managed_node1
18285 1726853400.55436: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.55448: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.55453: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.55457: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.56313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.56779: done with get_vars()
18285 1726853400.56793: done getting variables
18285 1726853400.56857: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 13:30:00 -0400 (0:00:00.069) 0:00:06.506 ******
18285 1726853400.56997: entering _queue_task() for managed_node1/debug
18285 1726853400.57312: worker is 1 (out of 1 available)
18285 1726853400.57438: exiting _queue_task() for managed_node1/debug
18285 1726853400.57448: done queuing things up, now waiting for results queue to drain
18285 1726853400.57452: waiting for pending results...
18285 1726853400.57663: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
18285 1726853400.57759: in run() - task 02083763-bbaf-9200-7ca6-00000000004f
18285 1726853400.57762: variable 'ansible_search_path' from source: unknown
18285 1726853400.57765: variable 'ansible_search_path' from source: unknown
18285 1726853400.57795: calling self._execute()
18285 1726853400.57886: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.57903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.57977: variable 'omit' from source: magic vars
18285 1726853400.58380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.60858: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.60963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.61016: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.61067: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.61102: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.61199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.61246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.61356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.61359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.61361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.61486: variable 'ansible_distribution' from source: facts
18285 1726853400.61498: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.61520: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.61526: when evaluation is False, skipping this task
18285 1726853400.61533: _execute() done
18285 1726853400.61539: dumping result to json
18285 1726853400.61546: done dumping result, returning
18285 1726853400.61560: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-9200-7ca6-00000000004f]
18285 1726853400.61581: sending task result for task 02083763-bbaf-9200-7ca6-00000000004f
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
18285 1726853400.61821: no more pending results, returning what we have
18285 1726853400.61824: results queue empty
18285 1726853400.61825: checking for any_errors_fatal
18285 1726853400.61834: done checking for any_errors_fatal
18285 1726853400.61834: checking for max_fail_percentage
18285 1726853400.61837: done checking for max_fail_percentage
18285 1726853400.61837: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.61838: done checking to see if all hosts have failed
18285 1726853400.61839: getting the remaining hosts for this loop
18285 1726853400.61840: done getting the remaining hosts for this loop
18285 1726853400.61844: getting the next task for host managed_node1
18285 1726853400.61851: done getting next task for host managed_node1
18285 1726853400.61855: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
18285 1726853400.61857: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.61868: getting variables
18285 1726853400.61870: in VariableManager get_vars()
18285 1726853400.61908: Calling all_inventory to load vars for managed_node1
18285 1726853400.61911: Calling groups_inventory to load vars for managed_node1
18285 1726853400.61913: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.61922: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.61924: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.61927: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.62115: done sending task result for task 02083763-bbaf-9200-7ca6-00000000004f
18285 1726853400.62123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.62245: done with get_vars()
18285 1726853400.62255: done getting variables
18285 1726853400.62289: WORKER PROCESS EXITING
18285 1726853400.62316: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 13:30:00 -0400 (0:00:00.053) 0:00:06.559 ******
18285 1726853400.62339: entering _queue_task() for managed_node1/debug
18285 1726853400.62552: worker is 1 (out of 1 available)
18285 1726853400.62567: exiting _queue_task() for managed_node1/debug
18285 1726853400.62581: done queuing things up, now waiting for results queue to drain
18285 1726853400.62583: waiting for pending results...
18285 1726853400.62740: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
18285 1726853400.62796: in run() - task 02083763-bbaf-9200-7ca6-000000000050
18285 1726853400.62813: variable 'ansible_search_path' from source: unknown
18285 1726853400.62817: variable 'ansible_search_path' from source: unknown
18285 1726853400.62841: calling self._execute()
18285 1726853400.62902: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.62906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.62920: variable 'omit' from source: magic vars
18285 1726853400.63224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.65174: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.65217: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.65419: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.65444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.65466: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.65528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.65548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.65567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.65594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.65607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.65698: variable 'ansible_distribution' from source: facts
18285 1726853400.65702: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.65720: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.65723: when evaluation is False, skipping this task
18285 1726853400.65726: _execute() done
18285 1726853400.65729: dumping result to json
18285 1726853400.65731: done dumping result, returning
18285 1726853400.65739: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-9200-7ca6-000000000050]
18285 1726853400.65744: sending task result for task 02083763-bbaf-9200-7ca6-000000000050
18285 1726853400.65828: done sending task result for task 02083763-bbaf-9200-7ca6-000000000050
18285 1726853400.65831: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
18285 1726853400.65879: no more pending results, returning what we have
18285 1726853400.65882: results queue empty
18285 1726853400.65883: checking for any_errors_fatal
18285 1726853400.65889: done checking for any_errors_fatal
18285 1726853400.65890: checking for max_fail_percentage
18285 1726853400.65891: done checking for max_fail_percentage
18285 1726853400.65892: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.65893: done checking to see if all hosts have failed
18285 1726853400.65893: getting the remaining hosts for this loop
18285 1726853400.65895: done getting the remaining hosts for this loop
18285 1726853400.65898: getting the next task for host managed_node1
18285 1726853400.65904: done getting next task for host managed_node1
18285 1726853400.65907: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
18285 1726853400.65908: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.65921: getting variables
18285 1726853400.65922: in VariableManager get_vars()
18285 1726853400.65959: Calling all_inventory to load vars for managed_node1
18285 1726853400.65961: Calling groups_inventory to load vars for managed_node1
18285 1726853400.65964: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.65976: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.65978: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.65981: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.66160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.66278: done with get_vars()
18285 1726853400.66286: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:30:00 -0400 (0:00:00.040) 0:00:06.599 ******
18285 1726853400.66348: entering _queue_task() for managed_node1/ping
18285 1726853400.66537: worker is 1 (out of 1 available)
18285 1726853400.66554: exiting _queue_task() for managed_node1/ping
18285 1726853400.66564: done queuing things up, now waiting for results queue to drain
18285 1726853400.66566: waiting for pending results...
18285 1726853400.66726: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
18285 1726853400.66786: in run() - task 02083763-bbaf-9200-7ca6-000000000051
18285 1726853400.66800: variable 'ansible_search_path' from source: unknown
18285 1726853400.66804: variable 'ansible_search_path' from source: unknown
18285 1726853400.66829: calling self._execute()
18285 1726853400.66906: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853400.66910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853400.66916: variable 'omit' from source: magic vars
18285 1726853400.67328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853400.69349: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853400.69397: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853400.69424: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853400.69449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853400.69476: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853400.69531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853400.69551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853400.69570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853400.69601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853400.69612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853400.69705: variable 'ansible_distribution' from source: facts
18285 1726853400.69709: variable 'ansible_distribution_major_version' from source: facts
18285 1726853400.69725: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853400.69728: when evaluation is False, skipping this task
18285 1726853400.69730: _execute() done
18285 1726853400.69733: dumping result to json
18285 1726853400.69737: done dumping result, returning
18285 1726853400.69744: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-9200-7ca6-000000000051]
18285 1726853400.69749: sending task result for task 02083763-bbaf-9200-7ca6-000000000051
18285 1726853400.69834: done sending task result for task 02083763-bbaf-9200-7ca6-000000000051
18285 1726853400.69837: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853400.69883: no more pending results, returning what we have
18285 1726853400.69886: results queue empty
18285 1726853400.69887: checking for any_errors_fatal
18285 1726853400.69893: done checking for any_errors_fatal
18285 1726853400.69893: checking for max_fail_percentage
18285 1726853400.69895: done checking for max_fail_percentage
18285 1726853400.69896: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.69896: done checking to see if all hosts have failed
18285 1726853400.69897: getting the remaining hosts for this loop
18285 1726853400.69898: done getting the remaining hosts for this loop
18285 1726853400.69902: getting the next task for host managed_node1
18285 1726853400.69908: done getting next task for host managed_node1
18285 1726853400.69910: ^ task is: TASK: meta (role_complete)
18285 1726853400.69912: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.69925: getting variables
18285 1726853400.69927: in VariableManager get_vars()
18285 1726853400.69963: Calling all_inventory to load vars for managed_node1
18285 1726853400.69965: Calling groups_inventory to load vars for managed_node1
18285 1726853400.69967: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.69980: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.69983: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.69985: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.70169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.70340: done with get_vars()
18285 1726853400.70353: done getting variables
18285 1726853400.70431: done queuing things up, now waiting for results queue to drain
18285 1726853400.70433: results queue empty
18285 1726853400.70434: checking for any_errors_fatal
18285 1726853400.70436: done checking for any_errors_fatal
18285 1726853400.70436: checking for max_fail_percentage
18285 1726853400.70438: done checking for max_fail_percentage
18285 1726853400.70438: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.70439: done checking to see if all hosts have failed
18285 1726853400.70439: getting the remaining hosts for this loop
18285 1726853400.70440: done getting the remaining hosts for this loop
18285 1726853400.70442: getting the next task for host managed_node1
18285 1726853400.70445: done getting next task for host managed_node1
18285 1726853400.70446: ^ task is: TASK: meta (flush_handlers)
18285 1726853400.70447: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.70452: getting variables
18285 1726853400.70452: in VariableManager get_vars()
18285 1726853400.70463: Calling all_inventory to load vars for managed_node1
18285 1726853400.70465: Calling groups_inventory to load vars for managed_node1
18285 1726853400.70466: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.70472: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.70475: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.70477: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.70612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.70845: done with get_vars()
18285 1726853400.70856: done getting variables
18285 1726853400.70911: in VariableManager get_vars()
18285 1726853400.70919: Calling all_inventory to load vars for managed_node1
18285 1726853400.70924: Calling groups_inventory to load vars for managed_node1
18285 1726853400.70926: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.70929: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.70930: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.70932: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.71034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.71177: done with get_vars()
18285 1726853400.71189: done queuing things up, now waiting for results queue to drain
18285 1726853400.71190: results queue empty
18285 1726853400.71191: checking for any_errors_fatal
18285 1726853400.71192: done checking for any_errors_fatal
18285 1726853400.71193: checking for max_fail_percentage
18285 1726853400.71194: done checking for max_fail_percentage
18285 1726853400.71194: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.71195: done checking to see if all hosts have failed
18285 1726853400.71195: getting the remaining hosts for this loop
18285 1726853400.71196: done getting the remaining hosts for this loop
18285 1726853400.71198: getting the next task for host managed_node1
18285 1726853400.71201: done getting next task for host managed_node1
18285 1726853400.71203: ^ task is: TASK: meta (flush_handlers)
18285 1726853400.71204: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.71206: getting variables
18285 1726853400.71207: in VariableManager get_vars()
18285 1726853400.71215: Calling all_inventory to load vars for managed_node1
18285 1726853400.71217: Calling groups_inventory to load vars for managed_node1
18285 1726853400.71219: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.71223: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.71225: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.71228: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.71356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.71560: done with get_vars()
18285 1726853400.71567: done getting variables
18285 1726853400.71611: in VariableManager get_vars()
18285 1726853400.71621: Calling all_inventory to load vars for managed_node1
18285 1726853400.71622: Calling groups_inventory to load vars for managed_node1
18285 1726853400.71624: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853400.71628: Calling all_plugins_play to load vars for managed_node1
18285 1726853400.71630: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853400.71633: Calling groups_plugins_play to load vars for managed_node1
18285 1726853400.71763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853400.71946: done with get_vars()
18285 1726853400.71960: done queuing things up, now waiting for results queue to drain
18285 1726853400.71962: results queue empty
18285 1726853400.71963: checking for any_errors_fatal
18285 1726853400.71964: done checking for any_errors_fatal
18285 1726853400.71965: checking for max_fail_percentage
18285 1726853400.71966: done checking for max_fail_percentage
18285 1726853400.71967: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.71967: done checking to see if all hosts have failed
18285 1726853400.71968: getting the remaining hosts for this loop
18285 1726853400.71969: done getting the remaining hosts for this loop
18285 1726853400.71973: getting the next task for host managed_node1
18285 1726853400.71976: done getting next task for host managed_node1
18285 1726853400.71977: ^ task is: None
18285 1726853400.71979: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853400.71980: done queuing things up, now waiting for results queue to drain
18285 1726853400.71981: results queue empty
18285 1726853400.71981: checking for any_errors_fatal
18285 1726853400.71982: done checking for any_errors_fatal
18285 1726853400.71983: checking for max_fail_percentage
18285 1726853400.71984: done checking for max_fail_percentage
18285 1726853400.71985: checking to see if all hosts have failed and the running result is not ok
18285 1726853400.71985: done checking to see if all hosts have failed
18285 1726853400.71986: getting the next task for host managed_node1
18285 1726853400.71989: done getting next task for host managed_node1
18285 1726853400.71990: ^ task is: None
18285 1726853400.71991: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 18285 1726853400.72025: in VariableManager get_vars() 18285 1726853400.72039: done with get_vars() 18285 1726853400.72045: in VariableManager get_vars() 18285 1726853400.72058: done with get_vars() 18285 1726853400.72062: variable 'omit' from source: magic vars 18285 1726853400.72096: in VariableManager get_vars() 18285 1726853400.72106: done with get_vars() 18285 1726853400.72126: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 18285 1726853400.72319: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853400.72338: getting the remaining hosts for this loop 18285 1726853400.72339: done getting the remaining hosts for this loop 18285 1726853400.72340: getting the next task for host managed_node1 18285 1726853400.72342: done getting next task for host managed_node1 18285 1726853400.72343: ^ task is: TASK: Gathering Facts 18285 1726853400.72344: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.72346: getting variables 18285 1726853400.72346: in VariableManager get_vars() 18285 1726853400.72354: Calling all_inventory to load vars for managed_node1 18285 1726853400.72355: Calling groups_inventory to load vars for managed_node1 18285 1726853400.72357: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.72361: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.72363: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.72365: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.72444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.72722: done with get_vars() 18285 1726853400.72727: done getting variables 18285 1726853400.72756: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 13:30:00 -0400 (0:00:00.064) 0:00:06.664 ****** 18285 1726853400.72778: entering _queue_task() for managed_node1/gather_facts 18285 1726853400.72999: worker is 1 (out of 1 available) 18285 1726853400.73011: exiting _queue_task() for managed_node1/gather_facts 18285 1726853400.73022: done queuing things up, now waiting for results queue to drain 18285 1726853400.73024: waiting for pending results... 
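The `HOST STATE` records above report a numeric `run_state` for each host (0 while `pending_setup=True`, 1 while iterating task blocks, 5 once `task is: None`). A minimal sketch of what those numbers encode, assuming they map to ansible-core's `IteratingStates` enum from the play iterator (an interpretation of this log, not part of it):

```python
# Sketch of the state machine behind the "run_state" field in the
# HOST STATE lines of this log. Assumption: values mirror ansible-core's
# IteratingStates enum (executor/play_iterator.py); redefined here so the
# example is self-contained.
from enum import IntEnum

class IteratingStates(IntEnum):
    SETUP = 0      # run_state=0: facts not yet gathered (pending_setup=True)
    TASKS = 1      # run_state=1: iterating the play's task blocks
    RESCUE = 2     # inside a rescue section of a block
    ALWAYS = 3     # inside an always section of a block
    HANDLERS = 4   # flushing handlers
    COMPLETE = 5   # run_state=5: no more tasks for this host (task is: None)

# e.g. the "Gathering Facts" record shows run_state=0:
print(IteratingStates(0).name)
```

Reading the log with this mapping, each `run_state=5` line marks the point where the strategy has exhausted the current play's blocks for `managed_node1` and moves on to the next play.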
18285 1726853400.73188: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853400.73260: in run() - task 02083763-bbaf-9200-7ca6-000000000231 18285 1726853400.73264: variable 'ansible_search_path' from source: unknown 18285 1726853400.73294: calling self._execute() 18285 1726853400.73352: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.73357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.73362: variable 'omit' from source: magic vars 18285 1726853400.73673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.75639: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.75690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.75717: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.75742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.75762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.75828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.75848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.75866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853400.75897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.75909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.76010: variable 'ansible_distribution' from source: facts 18285 1726853400.76014: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.76030: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.76033: when evaluation is False, skipping this task 18285 1726853400.76036: _execute() done 18285 1726853400.76039: dumping result to json 18285 1726853400.76041: done dumping result, returning 18285 1726853400.76052: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-000000000231] 18285 1726853400.76054: sending task result for task 02083763-bbaf-9200-7ca6-000000000231 18285 1726853400.76138: done sending task result for task 02083763-bbaf-9200-7ca6-000000000231 18285 1726853400.76141: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.76189: no more pending results, returning what we have 18285 1726853400.76193: results queue empty 18285 1726853400.76193: checking for any_errors_fatal 18285 1726853400.76195: done checking for any_errors_fatal 18285 1726853400.76195: checking for max_fail_percentage 18285 1726853400.76197: done checking for max_fail_percentage 18285 1726853400.76197: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853400.76198: done checking to see if all hosts have failed 18285 1726853400.76199: getting the remaining hosts for this loop 18285 1726853400.76200: done getting the remaining hosts for this loop 18285 1726853400.76204: getting the next task for host managed_node1 18285 1726853400.76209: done getting next task for host managed_node1 18285 1726853400.76210: ^ task is: TASK: meta (flush_handlers) 18285 1726853400.76212: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853400.76215: getting variables 18285 1726853400.76217: in VariableManager get_vars() 18285 1726853400.76245: Calling all_inventory to load vars for managed_node1 18285 1726853400.76247: Calling groups_inventory to load vars for managed_node1 18285 1726853400.76252: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.76264: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.76267: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.76269: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.76425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.76539: done with get_vars() 18285 1726853400.76546: done getting variables 18285 1726853400.76597: in VariableManager get_vars() 18285 1726853400.76605: Calling all_inventory to load vars for managed_node1 18285 1726853400.76606: Calling groups_inventory to load vars for managed_node1 18285 1726853400.76608: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.76611: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.76612: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853400.76614: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.76716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.76819: done with get_vars() 18285 1726853400.76828: done queuing things up, now waiting for results queue to drain 18285 1726853400.76829: results queue empty 18285 1726853400.76830: checking for any_errors_fatal 18285 1726853400.76831: done checking for any_errors_fatal 18285 1726853400.76832: checking for max_fail_percentage 18285 1726853400.76832: done checking for max_fail_percentage 18285 1726853400.76833: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.76833: done checking to see if all hosts have failed 18285 1726853400.76834: getting the remaining hosts for this loop 18285 1726853400.76834: done getting the remaining hosts for this loop 18285 1726853400.76836: getting the next task for host managed_node1 18285 1726853400.76839: done getting next task for host managed_node1 18285 1726853400.76840: ^ task is: TASK: Include the task 'delete_interface.yml' 18285 1726853400.76841: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.76843: getting variables 18285 1726853400.76843: in VariableManager get_vars() 18285 1726853400.76848: Calling all_inventory to load vars for managed_node1 18285 1726853400.76851: Calling groups_inventory to load vars for managed_node1 18285 1726853400.76853: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.76860: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.76862: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.76863: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.76957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.77066: done with get_vars() 18285 1726853400.77073: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 13:30:00 -0400 (0:00:00.043) 0:00:06.707 ****** 18285 1726853400.77119: entering _queue_task() for managed_node1/include_tasks 18285 1726853400.77352: worker is 1 (out of 1 available) 18285 1726853400.77365: exiting _queue_task() for managed_node1/include_tasks 18285 1726853400.77378: done queuing things up, now waiting for results queue to drain 18285 1726853400.77380: waiting for pending results... 
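Each `skipping: [managed_node1]` result in this log carries a `false_condition` field showing the `when:` expression that evaluated to `False`. A minimal sketch of that evaluation in plain Python (the function name `when_condition` is hypothetical; the expression itself is copied verbatim from the log):

```python
# Hypothetical re-evaluation of the guard recorded as "false_condition":
#   (ansible_distribution in ['CentOS','RedHat'] and
#    ansible_distribution_major_version | int < 9)
# The Jinja2 "| int" filter is modeled with Python's int().
def when_condition(ansible_distribution, ansible_distribution_major_version):
    return (ansible_distribution in ['CentOS', 'RedHat']
            and int(ansible_distribution_major_version) < 9)

# On the test host the gathered facts make the first clause false,
# so every task guarded by this expression is skipped:
print(when_condition("Fedora", "40"))   # the case seen in this run
print(when_condition("CentOS", "8"))    # the case the guard targets
```

This explains why both `Gathering Facts` and `Include the task 'delete_interface.yml'` are skipped with `skip_reason: Conditional result was False` rather than executed.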
18285 1726853400.77544: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 18285 1726853400.77604: in run() - task 02083763-bbaf-9200-7ca6-000000000054 18285 1726853400.77617: variable 'ansible_search_path' from source: unknown 18285 1726853400.77644: calling self._execute() 18285 1726853400.77705: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.77711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.77726: variable 'omit' from source: magic vars 18285 1726853400.78031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.79556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.79606: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.79634: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.79660: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.79681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.79743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.79763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.79783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.79813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.79823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.79921: variable 'ansible_distribution' from source: facts 18285 1726853400.79925: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.79941: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.79944: when evaluation is False, skipping this task 18285 1726853400.79947: _execute() done 18285 1726853400.79952: dumping result to json 18285 1726853400.79955: done dumping result, returning 18285 1726853400.79960: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [02083763-bbaf-9200-7ca6-000000000054] 18285 1726853400.79965: sending task result for task 02083763-bbaf-9200-7ca6-000000000054 18285 1726853400.80054: done sending task result for task 02083763-bbaf-9200-7ca6-000000000054 18285 1726853400.80056: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.80104: no more pending results, returning what we have 18285 1726853400.80107: results queue empty 18285 1726853400.80108: checking for any_errors_fatal 18285 1726853400.80110: done checking for any_errors_fatal 18285 1726853400.80110: checking for max_fail_percentage 18285 1726853400.80112: 
done checking for max_fail_percentage 18285 1726853400.80112: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.80113: done checking to see if all hosts have failed 18285 1726853400.80114: getting the remaining hosts for this loop 18285 1726853400.80115: done getting the remaining hosts for this loop 18285 1726853400.80119: getting the next task for host managed_node1 18285 1726853400.80125: done getting next task for host managed_node1 18285 1726853400.80127: ^ task is: TASK: meta (flush_handlers) 18285 1726853400.80129: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853400.80132: getting variables 18285 1726853400.80133: in VariableManager get_vars() 18285 1726853400.80162: Calling all_inventory to load vars for managed_node1 18285 1726853400.80166: Calling groups_inventory to load vars for managed_node1 18285 1726853400.80170: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.80183: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.80185: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.80188: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.80336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.80456: done with get_vars() 18285 1726853400.80464: done getting variables 18285 1726853400.80516: in VariableManager get_vars() 18285 1726853400.80522: Calling all_inventory to load vars for managed_node1 18285 1726853400.80524: Calling groups_inventory to load vars for managed_node1 18285 1726853400.80525: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.80528: 
Calling all_plugins_play to load vars for managed_node1 18285 1726853400.80529: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.80531: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.80647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.80755: done with get_vars() 18285 1726853400.80764: done queuing things up, now waiting for results queue to drain 18285 1726853400.80766: results queue empty 18285 1726853400.80766: checking for any_errors_fatal 18285 1726853400.80767: done checking for any_errors_fatal 18285 1726853400.80768: checking for max_fail_percentage 18285 1726853400.80769: done checking for max_fail_percentage 18285 1726853400.80769: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.80769: done checking to see if all hosts have failed 18285 1726853400.80770: getting the remaining hosts for this loop 18285 1726853400.80773: done getting the remaining hosts for this loop 18285 1726853400.80775: getting the next task for host managed_node1 18285 1726853400.80777: done getting next task for host managed_node1 18285 1726853400.80778: ^ task is: TASK: meta (flush_handlers) 18285 1726853400.80779: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.80781: getting variables 18285 1726853400.80782: in VariableManager get_vars() 18285 1726853400.80787: Calling all_inventory to load vars for managed_node1 18285 1726853400.80788: Calling groups_inventory to load vars for managed_node1 18285 1726853400.80789: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.80796: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.80798: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.80800: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.80881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.80987: done with get_vars() 18285 1726853400.80992: done getting variables 18285 1726853400.81022: in VariableManager get_vars() 18285 1726853400.81028: Calling all_inventory to load vars for managed_node1 18285 1726853400.81029: Calling groups_inventory to load vars for managed_node1 18285 1726853400.81031: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.81034: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.81036: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.81038: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.81137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.81239: done with get_vars() 18285 1726853400.81247: done queuing things up, now waiting for results queue to drain 18285 1726853400.81250: results queue empty 18285 1726853400.81251: checking for any_errors_fatal 18285 1726853400.81252: done checking for any_errors_fatal 18285 1726853400.81252: checking for max_fail_percentage 18285 1726853400.81253: done checking for max_fail_percentage 18285 1726853400.81254: checking to see if all hosts have failed and the running result is not 
ok 18285 1726853400.81255: done checking to see if all hosts have failed 18285 1726853400.81255: getting the remaining hosts for this loop 18285 1726853400.81256: done getting the remaining hosts for this loop 18285 1726853400.81258: getting the next task for host managed_node1 18285 1726853400.81260: done getting next task for host managed_node1 18285 1726853400.81261: ^ task is: None 18285 1726853400.81262: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853400.81262: done queuing things up, now waiting for results queue to drain 18285 1726853400.81263: results queue empty 18285 1726853400.81263: checking for any_errors_fatal 18285 1726853400.81264: done checking for any_errors_fatal 18285 1726853400.81264: checking for max_fail_percentage 18285 1726853400.81265: done checking for max_fail_percentage 18285 1726853400.81265: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.81266: done checking to see if all hosts have failed 18285 1726853400.81266: getting the next task for host managed_node1 18285 1726853400.81268: done getting next task for host managed_node1 18285 1726853400.81268: ^ task is: None 18285 1726853400.81269: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.81300: in VariableManager get_vars() 18285 1726853400.81315: done with get_vars() 18285 1726853400.81318: in VariableManager get_vars() 18285 1726853400.81325: done with get_vars() 18285 1726853400.81328: variable 'omit' from source: magic vars 18285 1726853400.81414: variable 'profile' from source: play vars 18285 1726853400.81485: in VariableManager get_vars() 18285 1726853400.81495: done with get_vars() 18285 1726853400.81509: variable 'omit' from source: magic vars 18285 1726853400.81553: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18285 1726853400.81953: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853400.81975: getting the remaining hosts for this loop 18285 1726853400.81976: done getting the remaining hosts for this loop 18285 1726853400.81978: getting the next task for host managed_node1 18285 1726853400.81980: done getting next task for host managed_node1 18285 1726853400.81981: ^ task is: TASK: Gathering Facts 18285 1726853400.81982: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.81983: getting variables 18285 1726853400.81984: in VariableManager get_vars() 18285 1726853400.81991: Calling all_inventory to load vars for managed_node1 18285 1726853400.81993: Calling groups_inventory to load vars for managed_node1 18285 1726853400.81994: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.81997: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.81999: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.82000: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.82084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.82215: done with get_vars() 18285 1726853400.82221: done getting variables 18285 1726853400.82253: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 13:30:00 -0400 (0:00:00.051) 0:00:06.759 ****** 18285 1726853400.82272: entering _queue_task() for managed_node1/gather_facts 18285 1726853400.82491: worker is 1 (out of 1 available) 18285 1726853400.82504: exiting _queue_task() for managed_node1/gather_facts 18285 1726853400.82515: done queuing things up, now waiting for results queue to drain 18285 1726853400.82517: waiting for pending results... 
18285 1726853400.82679: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853400.82741: in run() - task 02083763-bbaf-9200-7ca6-000000000246 18285 1726853400.82753: variable 'ansible_search_path' from source: unknown 18285 1726853400.82780: calling self._execute() 18285 1726853400.82843: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.82852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.82856: variable 'omit' from source: magic vars 18285 1726853400.83157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.84677: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.84722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.84751: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.84779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.84799: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.84858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.84882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.84900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853400.84928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.84939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.85036: variable 'ansible_distribution' from source: facts 18285 1726853400.85041: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.85057: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.85060: when evaluation is False, skipping this task 18285 1726853400.85062: _execute() done 18285 1726853400.85065: dumping result to json 18285 1726853400.85069: done dumping result, returning 18285 1726853400.85078: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-000000000246] 18285 1726853400.85087: sending task result for task 02083763-bbaf-9200-7ca6-000000000246 18285 1726853400.85160: done sending task result for task 02083763-bbaf-9200-7ca6-000000000246 18285 1726853400.85162: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.85210: no more pending results, returning what we have 18285 1726853400.85214: results queue empty 18285 1726853400.85215: checking for any_errors_fatal 18285 1726853400.85216: done checking for any_errors_fatal 18285 1726853400.85217: checking for max_fail_percentage 18285 1726853400.85218: done checking for max_fail_percentage 18285 1726853400.85219: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853400.85220: done checking to see if all hosts have failed 18285 1726853400.85220: getting the remaining hosts for this loop 18285 1726853400.85222: done getting the remaining hosts for this loop 18285 1726853400.85225: getting the next task for host managed_node1 18285 1726853400.85231: done getting next task for host managed_node1 18285 1726853400.85232: ^ task is: TASK: meta (flush_handlers) 18285 1726853400.85234: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853400.85237: getting variables 18285 1726853400.85239: in VariableManager get_vars() 18285 1726853400.85278: Calling all_inventory to load vars for managed_node1 18285 1726853400.85281: Calling groups_inventory to load vars for managed_node1 18285 1726853400.85283: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.85294: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.85297: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.85299: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.85454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.85576: done with get_vars() 18285 1726853400.85584: done getting variables 18285 1726853400.85633: in VariableManager get_vars() 18285 1726853400.85641: Calling all_inventory to load vars for managed_node1 18285 1726853400.85642: Calling groups_inventory to load vars for managed_node1 18285 1726853400.85644: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.85647: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.85648: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853400.85652: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.85758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.85867: done with get_vars() 18285 1726853400.85879: done queuing things up, now waiting for results queue to drain 18285 1726853400.85880: results queue empty 18285 1726853400.85881: checking for any_errors_fatal 18285 1726853400.85883: done checking for any_errors_fatal 18285 1726853400.85884: checking for max_fail_percentage 18285 1726853400.85884: done checking for max_fail_percentage 18285 1726853400.85885: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.85885: done checking to see if all hosts have failed 18285 1726853400.85886: getting the remaining hosts for this loop 18285 1726853400.85886: done getting the remaining hosts for this loop 18285 1726853400.85888: getting the next task for host managed_node1 18285 1726853400.85890: done getting next task for host managed_node1 18285 1726853400.85892: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18285 1726853400.85893: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.85900: getting variables 18285 1726853400.85900: in VariableManager get_vars() 18285 1726853400.85908: Calling all_inventory to load vars for managed_node1 18285 1726853400.85909: Calling groups_inventory to load vars for managed_node1 18285 1726853400.85910: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.85918: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.85921: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.85923: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.86001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.86113: done with get_vars() 18285 1726853400.86119: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:30:00 -0400 (0:00:00.038) 0:00:06.798 ****** 18285 1726853400.86172: entering _queue_task() for managed_node1/include_tasks 18285 1726853400.86387: worker is 1 (out of 1 available) 18285 1726853400.86401: exiting _queue_task() for managed_node1/include_tasks 18285 1726853400.86411: done queuing things up, now waiting for results queue to drain 18285 1726853400.86413: waiting for pending results... 
18285 1726853400.86575: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18285 1726853400.86637: in run() - task 02083763-bbaf-9200-7ca6-00000000005c 18285 1726853400.86653: variable 'ansible_search_path' from source: unknown 18285 1726853400.86656: variable 'ansible_search_path' from source: unknown 18285 1726853400.86682: calling self._execute() 18285 1726853400.86744: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.86753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.86757: variable 'omit' from source: magic vars 18285 1726853400.87052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.88561: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.88607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.88635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.88662: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.88683: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.88744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.88765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.88785: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.88810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.88821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.88914: variable 'ansible_distribution' from source: facts 18285 1726853400.88918: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.88935: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.88939: when evaluation is False, skipping this task 18285 1726853400.88941: _execute() done 18285 1726853400.88944: dumping result to json 18285 1726853400.88946: done dumping result, returning 18285 1726853400.88959: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-9200-7ca6-00000000005c] 18285 1726853400.88962: sending task result for task 02083763-bbaf-9200-7ca6-00000000005c 18285 1726853400.89042: done sending task result for task 02083763-bbaf-9200-7ca6-00000000005c 18285 1726853400.89044: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.89103: no more pending results, returning what we have 18285 1726853400.89106: results queue empty 18285 1726853400.89107: checking for any_errors_fatal 18285 1726853400.89109: done checking for 
any_errors_fatal 18285 1726853400.89110: checking for max_fail_percentage 18285 1726853400.89111: done checking for max_fail_percentage 18285 1726853400.89112: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.89113: done checking to see if all hosts have failed 18285 1726853400.89114: getting the remaining hosts for this loop 18285 1726853400.89115: done getting the remaining hosts for this loop 18285 1726853400.89119: getting the next task for host managed_node1 18285 1726853400.89123: done getting next task for host managed_node1 18285 1726853400.89126: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18285 1726853400.89128: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853400.89140: getting variables 18285 1726853400.89141: in VariableManager get_vars() 18285 1726853400.89181: Calling all_inventory to load vars for managed_node1 18285 1726853400.89184: Calling groups_inventory to load vars for managed_node1 18285 1726853400.89186: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.89196: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.89199: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.89201: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.89341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.89467: done with get_vars() 18285 1726853400.89477: done getting variables 18285 1726853400.89520: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:30:00 -0400 (0:00:00.033) 0:00:06.831 ****** 18285 1726853400.89542: entering _queue_task() for managed_node1/debug 18285 1726853400.89754: worker is 1 (out of 1 available) 18285 1726853400.89765: exiting _queue_task() for managed_node1/debug 18285 1726853400.89777: done queuing things up, now waiting for results queue to drain 18285 1726853400.89779: waiting for pending results... 18285 1726853400.89939: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18285 1726853400.89995: in run() - task 02083763-bbaf-9200-7ca6-00000000005d 18285 1726853400.90017: variable 'ansible_search_path' from source: unknown 18285 1726853400.90021: variable 'ansible_search_path' from source: unknown 18285 1726853400.90046: calling self._execute() 18285 1726853400.90118: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.90122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.90131: variable 'omit' from source: magic vars 18285 1726853400.90512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.91970: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.92021: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.92047: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 
1726853400.92078: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.92098: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.92156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.92179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853400.92199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.92224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.92235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.92330: variable 'ansible_distribution' from source: facts 18285 1726853400.92334: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.92349: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.92354: when evaluation is False, skipping this task 18285 1726853400.92357: _execute() done 18285 1726853400.92361: dumping result to json 18285 1726853400.92365: done dumping result, returning 18285 1726853400.92374: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Print network provider [02083763-bbaf-9200-7ca6-00000000005d] 18285 1726853400.92379: sending task result for task 02083763-bbaf-9200-7ca6-00000000005d 18285 1726853400.92463: done sending task result for task 02083763-bbaf-9200-7ca6-00000000005d 18285 1726853400.92466: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853400.92548: no more pending results, returning what we have 18285 1726853400.92551: results queue empty 18285 1726853400.92552: checking for any_errors_fatal 18285 1726853400.92558: done checking for any_errors_fatal 18285 1726853400.92559: checking for max_fail_percentage 18285 1726853400.92560: done checking for max_fail_percentage 18285 1726853400.92561: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.92562: done checking to see if all hosts have failed 18285 1726853400.92562: getting the remaining hosts for this loop 18285 1726853400.92564: done getting the remaining hosts for this loop 18285 1726853400.92567: getting the next task for host managed_node1 18285 1726853400.92574: done getting next task for host managed_node1 18285 1726853400.92578: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18285 1726853400.92579: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.92592: getting variables 18285 1726853400.92594: in VariableManager get_vars() 18285 1726853400.92627: Calling all_inventory to load vars for managed_node1 18285 1726853400.92630: Calling groups_inventory to load vars for managed_node1 18285 1726853400.92632: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.92640: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.92642: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.92644: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.92838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.92953: done with get_vars() 18285 1726853400.92960: done getting variables 18285 1726853400.93003: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:30:00 -0400 (0:00:00.034) 0:00:06.866 ****** 18285 1726853400.93024: entering _queue_task() for managed_node1/fail 18285 1726853400.93229: worker is 1 (out of 1 available) 18285 1726853400.93241: exiting _queue_task() for managed_node1/fail 18285 1726853400.93252: done queuing things up, now waiting for results queue to drain 18285 1726853400.93254: waiting for pending results... 
18285 1726853400.93419: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18285 1726853400.93475: in run() - task 02083763-bbaf-9200-7ca6-00000000005e 18285 1726853400.93492: variable 'ansible_search_path' from source: unknown 18285 1726853400.93496: variable 'ansible_search_path' from source: unknown 18285 1726853400.93524: calling self._execute() 18285 1726853400.93588: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.93592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.93603: variable 'omit' from source: magic vars 18285 1726853400.93895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.95756: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.95803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853400.95831: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853400.95867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853400.95887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853400.95951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853400.95969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18285 1726853400.95989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853400.96017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853400.96030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853400.96125: variable 'ansible_distribution' from source: facts 18285 1726853400.96130: variable 'ansible_distribution_major_version' from source: facts 18285 1726853400.96146: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853400.96151: when evaluation is False, skipping this task 18285 1726853400.96154: _execute() done 18285 1726853400.96156: dumping result to json 18285 1726853400.96158: done dumping result, returning 18285 1726853400.96165: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-9200-7ca6-00000000005e] 18285 1726853400.96170: sending task result for task 02083763-bbaf-9200-7ca6-00000000005e 18285 1726853400.96259: done sending task result for task 02083763-bbaf-9200-7ca6-00000000005e 18285 1726853400.96263: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853400.96311: no more pending results, returning what we 
have 18285 1726853400.96314: results queue empty 18285 1726853400.96315: checking for any_errors_fatal 18285 1726853400.96321: done checking for any_errors_fatal 18285 1726853400.96322: checking for max_fail_percentage 18285 1726853400.96323: done checking for max_fail_percentage 18285 1726853400.96324: checking to see if all hosts have failed and the running result is not ok 18285 1726853400.96325: done checking to see if all hosts have failed 18285 1726853400.96326: getting the remaining hosts for this loop 18285 1726853400.96327: done getting the remaining hosts for this loop 18285 1726853400.96330: getting the next task for host managed_node1 18285 1726853400.96336: done getting next task for host managed_node1 18285 1726853400.96339: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18285 1726853400.96341: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853400.96356: getting variables 18285 1726853400.96357: in VariableManager get_vars() 18285 1726853400.96400: Calling all_inventory to load vars for managed_node1 18285 1726853400.96402: Calling groups_inventory to load vars for managed_node1 18285 1726853400.96405: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853400.96414: Calling all_plugins_play to load vars for managed_node1 18285 1726853400.96417: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853400.96419: Calling groups_plugins_play to load vars for managed_node1 18285 1726853400.96559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853400.96683: done with get_vars() 18285 1726853400.96691: done getting variables 18285 1726853400.96734: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:30:00 -0400 (0:00:00.037) 0:00:06.903 ****** 18285 1726853400.96757: entering _queue_task() for managed_node1/fail 18285 1726853400.96968: worker is 1 (out of 1 available) 18285 1726853400.96981: exiting _queue_task() for managed_node1/fail 18285 1726853400.96993: done queuing things up, now waiting for results queue to drain 18285 1726853400.96994: waiting for pending results... 
18285 1726853400.97168: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18285 1726853400.97224: in run() - task 02083763-bbaf-9200-7ca6-00000000005f 18285 1726853400.97237: variable 'ansible_search_path' from source: unknown 18285 1726853400.97241: variable 'ansible_search_path' from source: unknown 18285 1726853400.97273: calling self._execute() 18285 1726853400.97332: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853400.97338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853400.97345: variable 'omit' from source: magic vars 18285 1726853400.97978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853400.99894: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853400.99973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.00013: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.00050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.00082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.00162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.00197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 
1726853401.00226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.00270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.00301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.00429: variable 'ansible_distribution' from source: facts 18285 1726853401.00439: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.00460: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.00468: when evaluation is False, skipping this task 18285 1726853401.00477: _execute() done 18285 1726853401.00484: dumping result to json 18285 1726853401.00491: done dumping result, returning 18285 1726853401.00502: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-9200-7ca6-00000000005f] 18285 1726853401.00511: sending task result for task 02083763-bbaf-9200-7ca6-00000000005f skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.00655: no more pending results, returning what we have 18285 1726853401.00658: results queue empty 18285 1726853401.00659: checking for any_errors_fatal 18285 1726853401.00665: done checking for any_errors_fatal 18285 
1726853401.00665: checking for max_fail_percentage 18285 1726853401.00667: done checking for max_fail_percentage 18285 1726853401.00667: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.00668: done checking to see if all hosts have failed 18285 1726853401.00669: getting the remaining hosts for this loop 18285 1726853401.00670: done getting the remaining hosts for this loop 18285 1726853401.00677: getting the next task for host managed_node1 18285 1726853401.00683: done getting next task for host managed_node1 18285 1726853401.00687: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18285 1726853401.00689: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.00701: getting variables 18285 1726853401.00703: in VariableManager get_vars() 18285 1726853401.00738: Calling all_inventory to load vars for managed_node1 18285 1726853401.00740: Calling groups_inventory to load vars for managed_node1 18285 1726853401.00742: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.00755: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.00758: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.00760: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.01004: done sending task result for task 02083763-bbaf-9200-7ca6-00000000005f 18285 1726853401.01007: WORKER PROCESS EXITING 18285 1726853401.01030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.01226: done with get_vars() 18285 1726853401.01237: done getting variables 18285 1726853401.01293: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:30:01 -0400 (0:00:00.045) 0:00:06.949 ****** 18285 1726853401.01321: entering _queue_task() for managed_node1/fail 18285 1726853401.01582: worker is 1 (out of 1 available) 18285 1726853401.01595: exiting _queue_task() for managed_node1/fail 18285 1726853401.01604: done queuing things up, now waiting for results queue to drain 18285 1726853401.01605: waiting for pending results... 
18285 1726853401.01863: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18285 1726853401.01960: in run() - task 02083763-bbaf-9200-7ca6-000000000060 18285 1726853401.01987: variable 'ansible_search_path' from source: unknown 18285 1726853401.01996: variable 'ansible_search_path' from source: unknown 18285 1726853401.02035: calling self._execute() 18285 1726853401.02118: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.02129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.02140: variable 'omit' from source: magic vars 18285 1726853401.02575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.04827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.04899: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.04943: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.04995: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.05034: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.05120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.05157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 
1726853401.05228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.05235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.05248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.05366: variable 'ansible_distribution' from source: facts 18285 1726853401.05369: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.05387: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.05390: when evaluation is False, skipping this task 18285 1726853401.05393: _execute() done 18285 1726853401.05395: dumping result to json 18285 1726853401.05399: done dumping result, returning 18285 1726853401.05407: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-9200-7ca6-000000000060] 18285 1726853401.05411: sending task result for task 02083763-bbaf-9200-7ca6-000000000060 18285 1726853401.05511: done sending task result for task 02083763-bbaf-9200-7ca6-000000000060 18285 1726853401.05514: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.05560: no more pending results, returning what we have 18285 1726853401.05564: results queue 
empty 18285 1726853401.05565: checking for any_errors_fatal 18285 1726853401.05583: done checking for any_errors_fatal 18285 1726853401.05584: checking for max_fail_percentage 18285 1726853401.05586: done checking for max_fail_percentage 18285 1726853401.05587: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.05588: done checking to see if all hosts have failed 18285 1726853401.05588: getting the remaining hosts for this loop 18285 1726853401.05590: done getting the remaining hosts for this loop 18285 1726853401.05594: getting the next task for host managed_node1 18285 1726853401.05599: done getting next task for host managed_node1 18285 1726853401.05602: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18285 1726853401.05604: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.05617: getting variables 18285 1726853401.05618: in VariableManager get_vars() 18285 1726853401.05656: Calling all_inventory to load vars for managed_node1 18285 1726853401.05659: Calling groups_inventory to load vars for managed_node1 18285 1726853401.05661: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.05670: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.05674: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.05677: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.05818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.05961: done with get_vars() 18285 1726853401.05969: done getting variables 18285 1726853401.06011: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:30:01 -0400 (0:00:00.047) 0:00:06.996 ****** 18285 1726853401.06033: entering _queue_task() for managed_node1/dnf 18285 1726853401.06247: worker is 1 (out of 1 available) 18285 1726853401.06263: exiting _queue_task() for managed_node1/dnf 18285 1726853401.06277: done queuing things up, now waiting for results queue to drain 18285 1726853401.06278: waiting for pending results... 
18285 1726853401.06436: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18285 1726853401.06495: in run() - task 02083763-bbaf-9200-7ca6-000000000061 18285 1726853401.06509: variable 'ansible_search_path' from source: unknown 18285 1726853401.06512: variable 'ansible_search_path' from source: unknown 18285 1726853401.06539: calling self._execute() 18285 1726853401.06601: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.06605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.06617: variable 'omit' from source: magic vars 18285 1726853401.06915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.08788: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.08838: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.08866: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.08893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.08916: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.08976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.08996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18285 1726853401.09015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.09045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.09056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.09153: variable 'ansible_distribution' from source: facts 18285 1726853401.09164: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.09182: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.09185: when evaluation is False, skipping this task 18285 1726853401.09187: _execute() done 18285 1726853401.09190: dumping result to json 18285 1726853401.09194: done dumping result, returning 18285 1726853401.09202: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000061] 18285 1726853401.09206: sending task result for task 02083763-bbaf-9200-7ca6-000000000061 18285 1726853401.09300: done sending task result for task 02083763-bbaf-9200-7ca6-000000000061 18285 1726853401.09302: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.09354: no more pending results, returning what 
we have 18285 1726853401.09358: results queue empty 18285 1726853401.09359: checking for any_errors_fatal 18285 1726853401.09368: done checking for any_errors_fatal 18285 1726853401.09369: checking for max_fail_percentage 18285 1726853401.09372: done checking for max_fail_percentage 18285 1726853401.09373: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.09373: done checking to see if all hosts have failed 18285 1726853401.09374: getting the remaining hosts for this loop 18285 1726853401.09375: done getting the remaining hosts for this loop 18285 1726853401.09380: getting the next task for host managed_node1 18285 1726853401.09385: done getting next task for host managed_node1 18285 1726853401.09389: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18285 1726853401.09391: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.09403: getting variables 18285 1726853401.09404: in VariableManager get_vars() 18285 1726853401.09443: Calling all_inventory to load vars for managed_node1 18285 1726853401.09445: Calling groups_inventory to load vars for managed_node1 18285 1726853401.09448: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.09459: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.09462: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.09464: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.09608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.09735: done with get_vars() 18285 1726853401.09743: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18285 1726853401.09799: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:30:01 -0400 (0:00:00.037) 0:00:07.034 ****** 18285 1726853401.09820: entering _queue_task() for managed_node1/yum 18285 1726853401.10027: worker is 1 (out of 1 available) 18285 1726853401.10040: exiting _queue_task() for managed_node1/yum 18285 1726853401.10055: done queuing things up, now waiting for results queue to drain 18285 1726853401.10057: waiting for pending results... 
18285 1726853401.10215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18285 1726853401.10270: in run() - task 02083763-bbaf-9200-7ca6-000000000062 18285 1726853401.10290: variable 'ansible_search_path' from source: unknown 18285 1726853401.10293: variable 'ansible_search_path' from source: unknown 18285 1726853401.10316: calling self._execute() 18285 1726853401.10397: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.10401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.10407: variable 'omit' from source: magic vars 18285 1726853401.10923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.12702: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.12745: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.12954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.12979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.13002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.13061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.13083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18285 1726853401.13103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.13129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.13140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.13236: variable 'ansible_distribution' from source: facts 18285 1726853401.13239: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.13256: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.13259: when evaluation is False, skipping this task 18285 1726853401.13261: _execute() done 18285 1726853401.13264: dumping result to json 18285 1726853401.13268: done dumping result, returning 18285 1726853401.13277: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000062] 18285 1726853401.13280: sending task result for task 02083763-bbaf-9200-7ca6-000000000062 18285 1726853401.13370: done sending task result for task 02083763-bbaf-9200-7ca6-000000000062 18285 1726853401.13375: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.13423: no more pending results, returning what 
we have 18285 1726853401.13427: results queue empty 18285 1726853401.13428: checking for any_errors_fatal 18285 1726853401.13434: done checking for any_errors_fatal 18285 1726853401.13435: checking for max_fail_percentage 18285 1726853401.13436: done checking for max_fail_percentage 18285 1726853401.13437: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.13437: done checking to see if all hosts have failed 18285 1726853401.13438: getting the remaining hosts for this loop 18285 1726853401.13439: done getting the remaining hosts for this loop 18285 1726853401.13443: getting the next task for host managed_node1 18285 1726853401.13451: done getting next task for host managed_node1 18285 1726853401.13455: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18285 1726853401.13457: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.13469: getting variables 18285 1726853401.13472: in VariableManager get_vars() 18285 1726853401.13516: Calling all_inventory to load vars for managed_node1 18285 1726853401.13519: Calling groups_inventory to load vars for managed_node1 18285 1726853401.13521: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.13530: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.13533: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.13535: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.13763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.13914: done with get_vars() 18285 1726853401.13923: done getting variables 18285 1726853401.13983: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:30:01 -0400 (0:00:00.041) 0:00:07.076 ****** 18285 1726853401.14010: entering _queue_task() for managed_node1/fail 18285 1726853401.14285: worker is 1 (out of 1 available) 18285 1726853401.14297: exiting _queue_task() for managed_node1/fail 18285 1726853401.14307: done queuing things up, now waiting for results queue to drain 18285 1726853401.14308: waiting for pending results... 
18285 1726853401.14583: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18285 1726853401.14641: in run() - task 02083763-bbaf-9200-7ca6-000000000063 18285 1726853401.14654: variable 'ansible_search_path' from source: unknown 18285 1726853401.14657: variable 'ansible_search_path' from source: unknown 18285 1726853401.14692: calling self._execute() 18285 1726853401.14876: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.14880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.14882: variable 'omit' from source: magic vars 18285 1726853401.15200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.17130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.17177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.17205: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.17231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.17254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.17313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.17336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.17358: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.17384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.17395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.17491: variable 'ansible_distribution' from source: facts 18285 1726853401.17494: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.17511: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.17514: when evaluation is False, skipping this task 18285 1726853401.17516: _execute() done 18285 1726853401.17519: dumping result to json 18285 1726853401.17523: done dumping result, returning 18285 1726853401.17530: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000063] 18285 1726853401.17536: sending task result for task 02083763-bbaf-9200-7ca6-000000000063 18285 1726853401.17627: done sending task result for task 02083763-bbaf-9200-7ca6-000000000063 18285 1726853401.17630: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.17693: no more pending results, returning what we have 18285 1726853401.17696: results queue empty 18285 1726853401.17697: checking for 
any_errors_fatal 18285 1726853401.17702: done checking for any_errors_fatal 18285 1726853401.17703: checking for max_fail_percentage 18285 1726853401.17704: done checking for max_fail_percentage 18285 1726853401.17705: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.17706: done checking to see if all hosts have failed 18285 1726853401.17706: getting the remaining hosts for this loop 18285 1726853401.17707: done getting the remaining hosts for this loop 18285 1726853401.17711: getting the next task for host managed_node1 18285 1726853401.17717: done getting next task for host managed_node1 18285 1726853401.17720: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18285 1726853401.17722: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.17734: getting variables 18285 1726853401.17736: in VariableManager get_vars() 18285 1726853401.17801: Calling all_inventory to load vars for managed_node1 18285 1726853401.17804: Calling groups_inventory to load vars for managed_node1 18285 1726853401.17806: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.17814: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.17816: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.17819: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.17999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.18184: done with get_vars() 18285 1726853401.18197: done getting variables 18285 1726853401.18259: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:30:01 -0400 (0:00:00.042) 0:00:07.119 ****** 18285 1726853401.18294: entering _queue_task() for managed_node1/package 18285 1726853401.18594: worker is 1 (out of 1 available) 18285 1726853401.18607: exiting _queue_task() for managed_node1/package 18285 1726853401.18619: done queuing things up, now waiting for results queue to drain 18285 1726853401.18620: waiting for pending results... 
18285 1726853401.18992: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18285 1726853401.19032: in run() - task 02083763-bbaf-9200-7ca6-000000000064 18285 1726853401.19037: variable 'ansible_search_path' from source: unknown 18285 1726853401.19047: variable 'ansible_search_path' from source: unknown 18285 1726853401.19064: calling self._execute() 18285 1726853401.19147: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.19160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.19163: variable 'omit' from source: magic vars 18285 1726853401.19488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.21236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.21284: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.21311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.21338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.21358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.21432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.21455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.21473: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.21498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.21510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.21605: variable 'ansible_distribution' from source: facts 18285 1726853401.21610: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.21626: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.21629: when evaluation is False, skipping this task 18285 1726853401.21631: _execute() done 18285 1726853401.21634: dumping result to json 18285 1726853401.21638: done dumping result, returning 18285 1726853401.21646: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-9200-7ca6-000000000064] 18285 1726853401.21654: sending task result for task 02083763-bbaf-9200-7ca6-000000000064 18285 1726853401.21741: done sending task result for task 02083763-bbaf-9200-7ca6-000000000064 18285 1726853401.21744: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.21795: no more pending results, returning what we have 18285 1726853401.21798: results queue empty 18285 1726853401.21799: checking for any_errors_fatal 18285 1726853401.21806: done checking for any_errors_fatal 
18285 1726853401.21807: checking for max_fail_percentage 18285 1726853401.21808: done checking for max_fail_percentage 18285 1726853401.21809: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.21810: done checking to see if all hosts have failed 18285 1726853401.21811: getting the remaining hosts for this loop 18285 1726853401.21812: done getting the remaining hosts for this loop 18285 1726853401.21815: getting the next task for host managed_node1 18285 1726853401.21821: done getting next task for host managed_node1 18285 1726853401.21825: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18285 1726853401.21828: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.21839: getting variables 18285 1726853401.21841: in VariableManager get_vars() 18285 1726853401.21880: Calling all_inventory to load vars for managed_node1 18285 1726853401.21882: Calling groups_inventory to load vars for managed_node1 18285 1726853401.21884: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.21895: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.21898: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.21900: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.22090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.22208: done with get_vars() 18285 1726853401.22216: done getting variables 18285 1726853401.22259: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:30:01 -0400 (0:00:00.039) 0:00:07.159 ****** 18285 1726853401.22283: entering _queue_task() for managed_node1/package 18285 1726853401.22504: worker is 1 (out of 1 available) 18285 1726853401.22518: exiting _queue_task() for managed_node1/package 18285 1726853401.22529: done queuing things up, now waiting for results queue to drain 18285 1726853401.22531: waiting for pending results... 
18285 1726853401.22705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18285 1726853401.22754: in run() - task 02083763-bbaf-9200-7ca6-000000000065 18285 1726853401.22774: variable 'ansible_search_path' from source: unknown 18285 1726853401.22778: variable 'ansible_search_path' from source: unknown 18285 1726853401.22808: calling self._execute() 18285 1726853401.22875: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.22884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.22892: variable 'omit' from source: magic vars 18285 1726853401.23204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.24911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.24956: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.24984: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.25009: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.25030: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.25093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.25113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.25131: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.25161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.25173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.25265: variable 'ansible_distribution' from source: facts 18285 1726853401.25278: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.25294: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.25297: when evaluation is False, skipping this task 18285 1726853401.25299: _execute() done 18285 1726853401.25301: dumping result to json 18285 1726853401.25306: done dumping result, returning 18285 1726853401.25313: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-9200-7ca6-000000000065] 18285 1726853401.25318: sending task result for task 02083763-bbaf-9200-7ca6-000000000065 18285 1726853401.25411: done sending task result for task 02083763-bbaf-9200-7ca6-000000000065 18285 1726853401.25414: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.25460: no more pending results, returning what we have 18285 1726853401.25463: results queue empty 18285 1726853401.25464: checking for 
any_errors_fatal 18285 1726853401.25468: done checking for any_errors_fatal 18285 1726853401.25469: checking for max_fail_percentage 18285 1726853401.25472: done checking for max_fail_percentage 18285 1726853401.25473: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.25474: done checking to see if all hosts have failed 18285 1726853401.25474: getting the remaining hosts for this loop 18285 1726853401.25476: done getting the remaining hosts for this loop 18285 1726853401.25480: getting the next task for host managed_node1 18285 1726853401.25485: done getting next task for host managed_node1 18285 1726853401.25488: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18285 1726853401.25490: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.25502: getting variables 18285 1726853401.25504: in VariableManager get_vars() 18285 1726853401.25539: Calling all_inventory to load vars for managed_node1 18285 1726853401.25542: Calling groups_inventory to load vars for managed_node1 18285 1726853401.25544: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.25555: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.25557: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.25560: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.25719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.25838: done with get_vars() 18285 1726853401.25846: done getting variables 18285 1726853401.25889: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:30:01 -0400 (0:00:00.036) 0:00:07.195 ****** 18285 1726853401.25913: entering _queue_task() for managed_node1/package 18285 1726853401.26135: worker is 1 (out of 1 available) 18285 1726853401.26149: exiting _queue_task() for managed_node1/package 18285 1726853401.26162: done queuing things up, now waiting for results queue to drain 18285 1726853401.26163: waiting for pending results... 
18285 1726853401.26333: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18285 1726853401.26395: in run() - task 02083763-bbaf-9200-7ca6-000000000066 18285 1726853401.26403: variable 'ansible_search_path' from source: unknown 18285 1726853401.26407: variable 'ansible_search_path' from source: unknown 18285 1726853401.26434: calling self._execute() 18285 1726853401.26500: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.26504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.26515: variable 'omit' from source: magic vars 18285 1726853401.26830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.28587: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.28632: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.28662: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.28692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.28722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.28787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.28808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.28825: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.28850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.28863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.28962: variable 'ansible_distribution' from source: facts 18285 1726853401.28966: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.28983: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.28986: when evaluation is False, skipping this task 18285 1726853401.28991: _execute() done 18285 1726853401.28993: dumping result to json 18285 1726853401.28997: done dumping result, returning 18285 1726853401.29007: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-9200-7ca6-000000000066] 18285 1726853401.29009: sending task result for task 02083763-bbaf-9200-7ca6-000000000066 18285 1726853401.29102: done sending task result for task 02083763-bbaf-9200-7ca6-000000000066 18285 1726853401.29106: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.29150: no more pending results, returning what we have 18285 1726853401.29153: results queue empty 18285 1726853401.29154: checking for any_errors_fatal 18285 
1726853401.29160: done checking for any_errors_fatal 18285 1726853401.29161: checking for max_fail_percentage 18285 1726853401.29162: done checking for max_fail_percentage 18285 1726853401.29163: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.29164: done checking to see if all hosts have failed 18285 1726853401.29164: getting the remaining hosts for this loop 18285 1726853401.29166: done getting the remaining hosts for this loop 18285 1726853401.29170: getting the next task for host managed_node1 18285 1726853401.29178: done getting next task for host managed_node1 18285 1726853401.29181: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18285 1726853401.29183: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.29194: getting variables 18285 1726853401.29196: in VariableManager get_vars() 18285 1726853401.29233: Calling all_inventory to load vars for managed_node1 18285 1726853401.29235: Calling groups_inventory to load vars for managed_node1 18285 1726853401.29237: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.29248: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.29250: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.29253: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.29443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.29560: done with get_vars() 18285 1726853401.29568: done getting variables 18285 1726853401.29613: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:30:01 -0400 (0:00:00.037) 0:00:07.232 ****** 18285 1726853401.29635: entering _queue_task() for managed_node1/service 18285 1726853401.29848: worker is 1 (out of 1 available) 18285 1726853401.29862: exiting _queue_task() for managed_node1/service 18285 1726853401.29876: done queuing things up, now waiting for results queue to drain 18285 1726853401.29878: waiting for pending results... 
18285 1726853401.30046: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18285 1726853401.30177: in run() - task 02083763-bbaf-9200-7ca6-000000000067 18285 1726853401.30181: variable 'ansible_search_path' from source: unknown 18285 1726853401.30183: variable 'ansible_search_path' from source: unknown 18285 1726853401.30185: calling self._execute() 18285 1726853401.30222: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.30226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.30235: variable 'omit' from source: magic vars 18285 1726853401.30548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.32282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.32336: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.32367: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.32396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.32417: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.32478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.32500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.32520: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.32545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.32559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.32658: variable 'ansible_distribution' from source: facts 18285 1726853401.32664: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.32681: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.32684: when evaluation is False, skipping this task 18285 1726853401.32686: _execute() done 18285 1726853401.32689: dumping result to json 18285 1726853401.32692: done dumping result, returning 18285 1726853401.32700: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-9200-7ca6-000000000067] 18285 1726853401.32706: sending task result for task 02083763-bbaf-9200-7ca6-000000000067 18285 1726853401.32797: done sending task result for task 02083763-bbaf-9200-7ca6-000000000067 18285 1726853401.32800: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.32858: no more pending results, returning what we have 18285 1726853401.32862: results queue empty 18285 1726853401.32863: checking for any_errors_fatal 18285 
1726853401.32868: done checking for any_errors_fatal 18285 1726853401.32869: checking for max_fail_percentage 18285 1726853401.32872: done checking for max_fail_percentage 18285 1726853401.32873: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.32874: done checking to see if all hosts have failed 18285 1726853401.32874: getting the remaining hosts for this loop 18285 1726853401.32876: done getting the remaining hosts for this loop 18285 1726853401.32880: getting the next task for host managed_node1 18285 1726853401.32885: done getting next task for host managed_node1 18285 1726853401.32889: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18285 1726853401.32890: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.32902: getting variables 18285 1726853401.32904: in VariableManager get_vars() 18285 1726853401.32940: Calling all_inventory to load vars for managed_node1 18285 1726853401.32942: Calling groups_inventory to load vars for managed_node1 18285 1726853401.32944: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.32955: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.32957: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.32960: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.33107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.33223: done with get_vars() 18285 1726853401.33231: done getting variables 18285 1726853401.33274: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:30:01 -0400 (0:00:00.036) 0:00:07.269 ****** 18285 1726853401.33297: entering _queue_task() for managed_node1/service 18285 1726853401.33509: worker is 1 (out of 1 available) 18285 1726853401.33522: exiting _queue_task() for managed_node1/service 18285 1726853401.33534: done queuing things up, now waiting for results queue to drain 18285 1726853401.33536: waiting for pending results... 
18285 1726853401.33705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18285 1726853401.33759: in run() - task 02083763-bbaf-9200-7ca6-000000000068 18285 1726853401.33775: variable 'ansible_search_path' from source: unknown 18285 1726853401.33778: variable 'ansible_search_path' from source: unknown 18285 1726853401.33808: calling self._execute() 18285 1726853401.33870: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.33875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.33886: variable 'omit' from source: magic vars 18285 1726853401.34184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.35874: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.35918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.35959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.35983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.36002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.36060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.36088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.36106: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.36131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.36141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.36243: variable 'ansible_distribution' from source: facts 18285 1726853401.36246: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.36264: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.36267: when evaluation is False, skipping this task 18285 1726853401.36269: _execute() done 18285 1726853401.36274: dumping result to json 18285 1726853401.36277: done dumping result, returning 18285 1726853401.36289: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-9200-7ca6-000000000068] 18285 1726853401.36292: sending task result for task 02083763-bbaf-9200-7ca6-000000000068 18285 1726853401.36377: done sending task result for task 02083763-bbaf-9200-7ca6-000000000068 18285 1726853401.36380: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18285 1726853401.36425: no more pending results, returning what we have 18285 1726853401.36428: results queue empty 18285 1726853401.36429: checking for any_errors_fatal 18285 1726853401.36436: done checking for any_errors_fatal 18285 1726853401.36437: checking for 
max_fail_percentage 18285 1726853401.36438: done checking for max_fail_percentage 18285 1726853401.36439: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.36440: done checking to see if all hosts have failed 18285 1726853401.36440: getting the remaining hosts for this loop 18285 1726853401.36442: done getting the remaining hosts for this loop 18285 1726853401.36446: getting the next task for host managed_node1 18285 1726853401.36451: done getting next task for host managed_node1 18285 1726853401.36455: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18285 1726853401.36459: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853401.36473: getting variables 18285 1726853401.36475: in VariableManager get_vars() 18285 1726853401.36510: Calling all_inventory to load vars for managed_node1 18285 1726853401.36512: Calling groups_inventory to load vars for managed_node1 18285 1726853401.36514: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.36524: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.36527: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.36529: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.36905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.37018: done with get_vars() 18285 1726853401.37027: done getting variables 18285 1726853401.37068: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 13:30:01 -0400 (0:00:00.037) 0:00:07.307 ******
18285 1726853401.37091: entering _queue_task() for managed_node1/service
18285 1726853401.37301: worker is 1 (out of 1 available)
18285 1726853401.37312: exiting _queue_task() for managed_node1/service
18285 1726853401.37324: done queuing things up, now waiting for results queue to drain
18285 1726853401.37325: waiting for pending results...
18285 1726853401.37497: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
18285 1726853401.37563: in run() - task 02083763-bbaf-9200-7ca6-000000000069
18285 1726853401.37576: variable 'ansible_search_path' from source: unknown
18285 1726853401.37580: variable 'ansible_search_path' from source: unknown
18285 1726853401.37613: calling self._execute()
18285 1726853401.37674: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.37679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.37689: variable 'omit' from source: magic vars
18285 1726853401.37996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.39880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.39929: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.39957: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853401.39985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853401.40008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853401.40066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853401.40089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853401.40107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853401.40135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853401.40145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853401.40244: variable 'ansible_distribution' from source: facts
18285 1726853401.40251: variable 'ansible_distribution_major_version' from source: facts
18285 1726853401.40265: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853401.40269: when evaluation is False, skipping this task
18285 1726853401.40273: _execute() done
18285 1726853401.40275: dumping result to json
18285 1726853401.40279: done dumping result, returning
18285 1726853401.40287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-9200-7ca6-000000000069]
18285 1726853401.40291: sending task result for task 02083763-bbaf-9200-7ca6-000000000069
18285 1726853401.40381: done sending task result for task 02083763-bbaf-9200-7ca6-000000000069
18285 1726853401.40383: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853401.40427: no more pending results, returning what we have
18285 1726853401.40430: results queue empty
18285 1726853401.40431: checking for any_errors_fatal
18285 1726853401.40438: done checking for any_errors_fatal
18285 1726853401.40439: checking for max_fail_percentage
18285 1726853401.40441: done checking for max_fail_percentage
18285 1726853401.40442: checking to see if all hosts have failed and the running result is not ok
18285 1726853401.40442: done checking to see if all hosts have failed
18285 1726853401.40443: getting the remaining hosts for this loop
18285 1726853401.40444: done getting the remaining hosts for this loop
18285 1726853401.40448: getting the next task for host managed_node1
18285 1726853401.40455: done getting next task for host managed_node1
18285 1726853401.40459: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
18285 1726853401.40461: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853401.40478: getting variables
18285 1726853401.40480: in VariableManager get_vars()
18285 1726853401.40517: Calling all_inventory to load vars for managed_node1
18285 1726853401.40520: Calling groups_inventory to load vars for managed_node1
18285 1726853401.40522: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853401.40532: Calling all_plugins_play to load vars for managed_node1
18285 1726853401.40534: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853401.40537: Calling groups_plugins_play to load vars for managed_node1
18285 1726853401.40685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853401.40808: done with get_vars()
18285 1726853401.40816: done getting variables
18285 1726853401.40858: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 13:30:01 -0400 (0:00:00.037) 0:00:07.345 ******
18285 1726853401.40881: entering _queue_task() for managed_node1/service
18285 1726853401.41093: worker is 1 (out of 1 available)
18285 1726853401.41105: exiting _queue_task() for managed_node1/service
18285 1726853401.41118: done queuing things up, now waiting for results queue to drain
18285 1726853401.41119: waiting for pending results...
18285 1726853401.41280: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
18285 1726853401.41377: in run() - task 02083763-bbaf-9200-7ca6-00000000006a
18285 1726853401.41476: variable 'ansible_search_path' from source: unknown
18285 1726853401.41480: variable 'ansible_search_path' from source: unknown
18285 1726853401.41483: calling self._execute()
18285 1726853401.41527: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.41539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.41552: variable 'omit' from source: magic vars
18285 1726853401.41962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.44162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.44234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.44291: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853401.44329: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853401.44358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853401.44439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853401.44474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853401.44676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853401.44679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853401.44682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853401.44692: variable 'ansible_distribution' from source: facts
18285 1726853401.44702: variable 'ansible_distribution_major_version' from source: facts
18285 1726853401.44722: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853401.44729: when evaluation is False, skipping this task
18285 1726853401.44735: _execute() done
18285 1726853401.44741: dumping result to json
18285 1726853401.44749: done dumping result, returning
18285 1726853401.44762: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-9200-7ca6-00000000006a]
18285 1726853401.44772: sending task result for task 02083763-bbaf-9200-7ca6-00000000006a
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
18285 1726853401.44917: no more pending results, returning what we have
18285 1726853401.44920: results queue empty
18285 1726853401.44921: checking for any_errors_fatal
18285 1726853401.44927: done checking for any_errors_fatal
18285 1726853401.44927: checking for max_fail_percentage
18285 1726853401.44929: done checking for max_fail_percentage
18285 1726853401.44929: checking to see if all hosts have failed and the running result is not ok
18285 1726853401.44930: done checking to see if all hosts have failed
18285 1726853401.44931: getting the remaining hosts for this loop
18285 1726853401.44932: done getting the remaining hosts for this loop
18285 1726853401.44936: getting the next task for host managed_node1
18285 1726853401.44941: done getting next task for host managed_node1
18285 1726853401.44944: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
18285 1726853401.44946: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853401.44958: done sending task result for task 02083763-bbaf-9200-7ca6-00000000006a
18285 1726853401.44960: WORKER PROCESS EXITING
18285 1726853401.44973: getting variables
18285 1726853401.44975: in VariableManager get_vars()
18285 1726853401.45009: Calling all_inventory to load vars for managed_node1
18285 1726853401.45012: Calling groups_inventory to load vars for managed_node1
18285 1726853401.45014: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853401.45025: Calling all_plugins_play to load vars for managed_node1
18285 1726853401.45028: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853401.45031: Calling groups_plugins_play to load vars for managed_node1
18285 1726853401.45380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853401.45601: done with get_vars()
18285 1726853401.45611: done getting variables
18285 1726853401.45676: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 13:30:01 -0400 (0:00:00.048) 0:00:07.393 ******
18285 1726853401.45706: entering _queue_task() for managed_node1/copy
18285 1726853401.46005: worker is 1 (out of 1 available)
18285 1726853401.46017: exiting _queue_task() for managed_node1/copy
18285 1726853401.46027: done queuing things up, now waiting for results queue to drain
18285 1726853401.46029: waiting for pending results...
18285 1726853401.46330: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
18285 1726853401.46585: in run() - task 02083763-bbaf-9200-7ca6-00000000006b
18285 1726853401.46589: variable 'ansible_search_path' from source: unknown
18285 1726853401.46592: variable 'ansible_search_path' from source: unknown
18285 1726853401.46595: calling self._execute()
18285 1726853401.46684: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.46696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.46716: variable 'omit' from source: magic vars
18285 1726853401.47248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.49890: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.50173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.50179: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853401.50182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853401.50184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853401.50187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853401.50198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853401.50228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853401.50276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853401.50295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853401.50445: variable 'ansible_distribution' from source: facts
18285 1726853401.50459: variable 'ansible_distribution_major_version' from source: facts
18285 1726853401.50482: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853401.50489: when evaluation is False, skipping this task
18285 1726853401.50496: _execute() done
18285 1726853401.50502: dumping result to json
18285 1726853401.50509: done dumping result, returning
18285 1726853401.50527: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-9200-7ca6-00000000006b]
18285 1726853401.50536: sending task result for task 02083763-bbaf-9200-7ca6-00000000006b
18285 1726853401.50898: done sending task result for task 02083763-bbaf-9200-7ca6-00000000006b
18285 1726853401.50902: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853401.50939: no more pending results, returning what we have
18285 1726853401.50942: results queue empty
18285 1726853401.50943: checking for any_errors_fatal
18285 1726853401.50947: done checking for any_errors_fatal
18285 1726853401.50948: checking for max_fail_percentage
18285 1726853401.50952: done checking for max_fail_percentage
18285 1726853401.50953: checking to see if all hosts have failed and the running result is not ok
18285 1726853401.50954: done checking to see if all hosts have failed
18285 1726853401.50954: getting the remaining hosts for this loop
18285 1726853401.50956: done getting the remaining hosts for this loop
18285 1726853401.50959: getting the next task for host managed_node1
18285 1726853401.50964: done getting next task for host managed_node1
18285 1726853401.50967: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
18285 1726853401.50969: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853401.50983: getting variables
18285 1726853401.50985: in VariableManager get_vars()
18285 1726853401.51018: Calling all_inventory to load vars for managed_node1
18285 1726853401.51020: Calling groups_inventory to load vars for managed_node1
18285 1726853401.51023: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853401.51032: Calling all_plugins_play to load vars for managed_node1
18285 1726853401.51034: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853401.51037: Calling groups_plugins_play to load vars for managed_node1
18285 1726853401.51293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853401.51502: done with get_vars()
18285 1726853401.51517: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 13:30:01 -0400 (0:00:00.058) 0:00:07.452 ******
18285 1726853401.51601: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
18285 1726853401.51964: worker is 1 (out of 1 available)
18285 1726853401.51979: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
18285 1726853401.51990: done queuing things up, now waiting for results queue to drain
18285 1726853401.51991: waiting for pending results...
18285 1726853401.52191: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
18285 1726853401.52401: in run() - task 02083763-bbaf-9200-7ca6-00000000006c
18285 1726853401.52404: variable 'ansible_search_path' from source: unknown
18285 1726853401.52407: variable 'ansible_search_path' from source: unknown
18285 1726853401.52423: calling self._execute()
18285 1726853401.52542: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.52557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.52574: variable 'omit' from source: magic vars
18285 1726853401.53122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.55778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.55979: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.55982: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853401.55985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853401.55996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853401.56092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853401.56132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853401.56167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853401.56225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853401.56245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853401.56401: variable 'ansible_distribution' from source: facts
18285 1726853401.56418: variable 'ansible_distribution_major_version' from source: facts
18285 1726853401.56444: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853401.56454: when evaluation is False, skipping this task
18285 1726853401.56461: _execute() done
18285 1726853401.56520: dumping result to json
18285 1726853401.56523: done dumping result, returning
18285 1726853401.56526: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-9200-7ca6-00000000006c]
18285 1726853401.56533: sending task result for task 02083763-bbaf-9200-7ca6-00000000006c
18285 1726853401.56740: done sending task result for task 02083763-bbaf-9200-7ca6-00000000006c
18285 1726853401.56743: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853401.56796: no more pending results, returning what we have
18285 1726853401.56799: results queue empty
18285 1726853401.56800: checking for any_errors_fatal
18285 1726853401.56808: done checking for any_errors_fatal
18285 1726853401.56809: checking for max_fail_percentage
18285 1726853401.56811: done checking for max_fail_percentage
18285 1726853401.56812: checking to see if all hosts have failed and the running result is not ok
18285 1726853401.56812: done checking to see if all hosts have failed
18285 1726853401.56813: getting the remaining hosts for this loop
18285 1726853401.56814: done getting the remaining hosts for this loop
18285 1726853401.56818: getting the next task for host managed_node1
18285 1726853401.56824: done getting next task for host managed_node1
18285 1726853401.56828: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
18285 1726853401.56830: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853401.56956: getting variables
18285 1726853401.56959: in VariableManager get_vars()
18285 1726853401.57000: Calling all_inventory to load vars for managed_node1
18285 1726853401.57003: Calling groups_inventory to load vars for managed_node1
18285 1726853401.57005: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853401.57016: Calling all_plugins_play to load vars for managed_node1
18285 1726853401.57018: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853401.57021: Calling groups_plugins_play to load vars for managed_node1
18285 1726853401.57343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853401.57540: done with get_vars()
18285 1726853401.57554: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 13:30:01 -0400 (0:00:00.060) 0:00:07.512 ******
18285 1726853401.57643: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
18285 1726853401.57933: worker is 1 (out of 1 available)
18285 1726853401.57947: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
18285 1726853401.58076: done queuing things up, now waiting for results queue to drain
18285 1726853401.58078: waiting for pending results...
18285 1726853401.58293: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
18285 1726853401.58377: in run() - task 02083763-bbaf-9200-7ca6-00000000006d
18285 1726853401.58412: variable 'ansible_search_path' from source: unknown
18285 1726853401.58415: variable 'ansible_search_path' from source: unknown
18285 1726853401.58504: calling self._execute()
18285 1726853401.58543: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.58558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.58573: variable 'omit' from source: magic vars
18285 1726853401.59029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.61505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.61593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.61642: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853401.61751: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853401.61755: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853401.61820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853401.61866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853401.61905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853401.61955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853401.61985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853401.62135: variable 'ansible_distribution' from source: facts
18285 1726853401.62147: variable 'ansible_distribution_major_version' from source: facts
18285 1726853401.62177: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853401.62194: when evaluation is False, skipping this task
18285 1726853401.62203: _execute() done
18285 1726853401.62211: dumping result to json
18285 1726853401.62220: done dumping result, returning
18285 1726853401.62235: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-9200-7ca6-00000000006d]
18285 1726853401.62247: sending task result for task 02083763-bbaf-9200-7ca6-00000000006d
18285 1726853401.62545: done sending task result for task 02083763-bbaf-9200-7ca6-00000000006d
18285 1726853401.62553: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853401.62609: no more pending results, returning what we have
18285 1726853401.62612: results queue empty
18285 1726853401.62613: checking for any_errors_fatal
18285 1726853401.62618: done checking for any_errors_fatal
18285 1726853401.62619: checking for max_fail_percentage
18285 1726853401.62621: done checking for max_fail_percentage
18285 1726853401.62622: checking to see if all hosts have failed and the running result is not ok
18285 1726853401.62622: done checking to see if all hosts have failed
18285 1726853401.62623: getting the remaining hosts for this loop
18285 1726853401.62625: done getting the remaining hosts for this loop
18285 1726853401.62630: getting the next task for host managed_node1
18285 1726853401.62635: done getting next task for host managed_node1
18285 1726853401.62639: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
18285 1726853401.62641: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853401.62656: getting variables
18285 1726853401.62658: in VariableManager get_vars()
18285 1726853401.62701: Calling all_inventory to load vars for managed_node1
18285 1726853401.62704: Calling groups_inventory to load vars for managed_node1
18285 1726853401.62706: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853401.62718: Calling all_plugins_play to load vars for managed_node1
18285 1726853401.62722: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853401.62725: Calling groups_plugins_play to load vars for managed_node1
18285 1726853401.63233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853401.63597: done with get_vars()
18285 1726853401.63611: done getting variables
18285 1726853401.63791: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 13:30:01 -0400 (0:00:00.061) 0:00:07.574 ******
18285 1726853401.63822: entering _queue_task() for managed_node1/debug
18285 1726853401.64628: worker is 1 (out of 1 available)
18285 1726853401.64641: exiting _queue_task() for managed_node1/debug
18285 1726853401.64656: done queuing things up, now waiting for results queue to drain
18285 1726853401.64657: waiting for pending results...
18285 1726853401.65195: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
18285 1726853401.65313: in run() - task 02083763-bbaf-9200-7ca6-00000000006e
18285 1726853401.65455: variable 'ansible_search_path' from source: unknown
18285 1726853401.65462: variable 'ansible_search_path' from source: unknown
18285 1726853401.65506: calling self._execute()
18285 1726853401.65677: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.65680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.65682: variable 'omit' from source: magic vars
18285 1726853401.66043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.69153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.69228: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.69284: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853401.69325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853401.69355: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853401.69439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853401.69474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853401.69504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853401.69547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853401.69676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853401.69710: variable 'ansible_distribution' from source: facts
18285 1726853401.69722: variable 'ansible_distribution_major_version' from source: facts
18285 1726853401.69743: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853401.69752: when evaluation is False, skipping this task
18285 1726853401.69759: _execute() done
18285 1726853401.69765: dumping result to json
18285 1726853401.69775: done dumping result, returning
18285 1726853401.69787: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-9200-7ca6-00000000006e]
18285 1726853401.69796: sending task result for task 02083763-bbaf-9200-7ca6-00000000006e
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
18285 1726853401.69940: no more pending results, returning what we have
18285 1726853401.69944: results queue empty
18285 1726853401.69944: checking for any_errors_fatal
18285 1726853401.69952: done checking for any_errors_fatal
18285 1726853401.69953: checking for max_fail_percentage
18285 1726853401.69954: done checking for max_fail_percentage
18285 1726853401.69955: checking to see if all hosts have failed and the running result is not ok
18285 1726853401.69956: done checking to see if all hosts have failed
18285 1726853401.69957: getting the remaining hosts for this loop
18285 1726853401.69958: done getting the remaining hosts for this loop
18285 1726853401.69966: getting the next task for host managed_node1
18285 1726853401.69973: done getting next task for host managed_node1
18285 1726853401.69977: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
18285 1726853401.69979: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853401.70180: getting variables
18285 1726853401.70183: in VariableManager get_vars()
18285 1726853401.70215: Calling all_inventory to load vars for managed_node1
18285 1726853401.70218: Calling groups_inventory to load vars for managed_node1
18285 1726853401.70220: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853401.70230: Calling all_plugins_play to load vars for managed_node1
18285 1726853401.70233: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853401.70236: Calling groups_plugins_play to load vars for managed_node1
18285 1726853401.70456: done sending task result for task 02083763-bbaf-9200-7ca6-00000000006e
18285 1726853401.70459: WORKER PROCESS EXITING
18285 1726853401.70483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853401.70698: done with get_vars()
18285 1726853401.70713: done getting variables
18285 1726853401.70773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 13:30:01 -0400 (0:00:00.069) 0:00:07.644 ******
18285 1726853401.70802: entering _queue_task() for managed_node1/debug
18285 1726853401.71112: worker is 1 (out of 1 available)
18285 1726853401.71124: exiting _queue_task() for managed_node1/debug
18285 1726853401.71135: done queuing things up, now waiting for results queue to drain
18285 1726853401.71136: waiting for pending results...
18285 1726853401.71431: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
18285 1726853401.71537: in run() - task 02083763-bbaf-9200-7ca6-00000000006f
18285 1726853401.71562: variable 'ansible_search_path' from source: unknown
18285 1726853401.71569: variable 'ansible_search_path' from source: unknown
18285 1726853401.71616: calling self._execute()
18285 1726853401.71712: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853401.71725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853401.71741: variable 'omit' from source: magic vars
18285 1726853401.72178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853401.74943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853401.75018: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853401.75070: Loading FilterModule 'mathstuff' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.75147: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.75153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.75227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.75270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.75302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.75366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.75375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.75521: variable 'ansible_distribution' from source: facts 18285 1726853401.75580: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.75584: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.75586: when evaluation is False, skipping this task 18285 1726853401.75588: _execute() done 18285 1726853401.75590: dumping result to json 18285 1726853401.75592: done dumping result, returning 18285 
1726853401.75687: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-9200-7ca6-00000000006f] 18285 1726853401.75690: sending task result for task 02083763-bbaf-9200-7ca6-00000000006f 18285 1726853401.75763: done sending task result for task 02083763-bbaf-9200-7ca6-00000000006f 18285 1726853401.75765: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853401.75816: no more pending results, returning what we have 18285 1726853401.75820: results queue empty 18285 1726853401.75821: checking for any_errors_fatal 18285 1726853401.75826: done checking for any_errors_fatal 18285 1726853401.75827: checking for max_fail_percentage 18285 1726853401.75829: done checking for max_fail_percentage 18285 1726853401.75830: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.75831: done checking to see if all hosts have failed 18285 1726853401.75832: getting the remaining hosts for this loop 18285 1726853401.75833: done getting the remaining hosts for this loop 18285 1726853401.75837: getting the next task for host managed_node1 18285 1726853401.75843: done getting next task for host managed_node1 18285 1726853401.75847: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18285 1726853401.75852: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.75865: getting variables 18285 1726853401.75867: in VariableManager get_vars() 18285 1726853401.75906: Calling all_inventory to load vars for managed_node1 18285 1726853401.75908: Calling groups_inventory to load vars for managed_node1 18285 1726853401.75911: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.75922: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.75925: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.75928: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.76400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.76618: done with get_vars() 18285 1726853401.76631: done getting variables 18285 1726853401.76690: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:30:01 -0400 (0:00:00.059) 0:00:07.703 ****** 18285 1726853401.76719: entering _queue_task() for managed_node1/debug 18285 1726853401.77003: worker is 1 (out of 1 available) 18285 1726853401.77015: exiting _queue_task() for managed_node1/debug 18285 1726853401.77028: done queuing things up, now waiting for results queue to drain 18285 1726853401.77029: waiting for pending results... 
18285 1726853401.77394: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18285 1726853401.77476: in run() - task 02083763-bbaf-9200-7ca6-000000000070 18285 1726853401.77485: variable 'ansible_search_path' from source: unknown 18285 1726853401.77491: variable 'ansible_search_path' from source: unknown 18285 1726853401.77494: calling self._execute() 18285 1726853401.77583: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.77603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.77617: variable 'omit' from source: magic vars 18285 1726853401.78070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.80511: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.80642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.80647: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.80694: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.80730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.80828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.80870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.80902: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.80944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.81079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.81120: variable 'ansible_distribution' from source: facts 18285 1726853401.81130: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.81154: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.81162: when evaluation is False, skipping this task 18285 1726853401.81168: _execute() done 18285 1726853401.81177: dumping result to json 18285 1726853401.81192: done dumping result, returning 18285 1726853401.81204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-9200-7ca6-000000000070] 18285 1726853401.81213: sending task result for task 02083763-bbaf-9200-7ca6-000000000070 skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18285 1726853401.81457: no more pending results, returning what we have 18285 1726853401.81461: results queue empty 18285 1726853401.81462: checking for any_errors_fatal 18285 1726853401.81468: done checking for any_errors_fatal 18285 1726853401.81469: checking for max_fail_percentage 18285 1726853401.81473: done checking for max_fail_percentage 18285 1726853401.81473: checking to see if all hosts have failed 
and the running result is not ok 18285 1726853401.81474: done checking to see if all hosts have failed 18285 1726853401.81475: getting the remaining hosts for this loop 18285 1726853401.81477: done getting the remaining hosts for this loop 18285 1726853401.81481: getting the next task for host managed_node1 18285 1726853401.81487: done getting next task for host managed_node1 18285 1726853401.81491: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18285 1726853401.81493: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853401.81511: done sending task result for task 02083763-bbaf-9200-7ca6-000000000070 18285 1726853401.81514: WORKER PROCESS EXITING 18285 1726853401.81579: getting variables 18285 1726853401.81581: in VariableManager get_vars() 18285 1726853401.81624: Calling all_inventory to load vars for managed_node1 18285 1726853401.81627: Calling groups_inventory to load vars for managed_node1 18285 1726853401.81630: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.81641: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.81644: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.81647: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.82081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.82293: done with get_vars() 18285 1726853401.82304: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:30:01 -0400 (0:00:00.056) 
0:00:07.760 ****** 18285 1726853401.82399: entering _queue_task() for managed_node1/ping 18285 1726853401.82734: worker is 1 (out of 1 available) 18285 1726853401.82745: exiting _queue_task() for managed_node1/ping 18285 1726853401.82757: done queuing things up, now waiting for results queue to drain 18285 1726853401.82759: waiting for pending results... 18285 1726853401.82982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18285 1726853401.83140: in run() - task 02083763-bbaf-9200-7ca6-000000000071 18285 1726853401.83144: variable 'ansible_search_path' from source: unknown 18285 1726853401.83146: variable 'ansible_search_path' from source: unknown 18285 1726853401.83162: calling self._execute() 18285 1726853401.83255: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.83266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.83281: variable 'omit' from source: magic vars 18285 1726853401.83718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.86095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.86260: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.86263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.86266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.86292: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.86385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18285 1726853401.86420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.86453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.86507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.86527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.86674: variable 'ansible_distribution' from source: facts 18285 1726853401.86694: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.86876: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.86880: when evaluation is False, skipping this task 18285 1726853401.86882: _execute() done 18285 1726853401.86885: dumping result to json 18285 1726853401.86887: done dumping result, returning 18285 1726853401.86889: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-9200-7ca6-000000000071] 18285 1726853401.86891: sending task result for task 02083763-bbaf-9200-7ca6-000000000071 18285 1726853401.86958: done sending task result for task 02083763-bbaf-9200-7ca6-000000000071 18285 1726853401.86961: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.87014: no more pending results, returning what we have 18285 1726853401.87018: results queue empty 18285 1726853401.87019: checking for any_errors_fatal 18285 1726853401.87026: done checking for any_errors_fatal 18285 1726853401.87027: checking for max_fail_percentage 18285 1726853401.87029: done checking for max_fail_percentage 18285 1726853401.87030: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.87031: done checking to see if all hosts have failed 18285 1726853401.87032: getting the remaining hosts for this loop 18285 1726853401.87033: done getting the remaining hosts for this loop 18285 1726853401.87037: getting the next task for host managed_node1 18285 1726853401.87044: done getting next task for host managed_node1 18285 1726853401.87047: ^ task is: TASK: meta (role_complete) 18285 1726853401.87052: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.87067: getting variables 18285 1726853401.87069: in VariableManager get_vars() 18285 1726853401.87114: Calling all_inventory to load vars for managed_node1 18285 1726853401.87117: Calling groups_inventory to load vars for managed_node1 18285 1726853401.87119: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.87132: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.87135: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.87139: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.87530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.87743: done with get_vars() 18285 1726853401.87756: done getting variables 18285 1726853401.87838: done queuing things up, now waiting for results queue to drain 18285 1726853401.87841: results queue empty 18285 1726853401.87841: checking for any_errors_fatal 18285 1726853401.87844: done checking for any_errors_fatal 18285 1726853401.87844: checking for max_fail_percentage 18285 1726853401.87846: done checking for max_fail_percentage 18285 1726853401.87846: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.87847: done checking to see if all hosts have failed 18285 1726853401.87848: getting the remaining hosts for this loop 18285 1726853401.87851: done getting the remaining hosts for this loop 18285 1726853401.87854: getting the next task for host managed_node1 18285 1726853401.87857: done getting next task for host managed_node1 18285 1726853401.87859: ^ task is: TASK: meta (flush_handlers) 18285 1726853401.87860: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18285 1726853401.87862: getting variables 18285 1726853401.87863: in VariableManager get_vars() 18285 1726853401.87876: Calling all_inventory to load vars for managed_node1 18285 1726853401.87879: Calling groups_inventory to load vars for managed_node1 18285 1726853401.87881: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.87885: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.87888: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.87890: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.88044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.88279: done with get_vars() 18285 1726853401.88287: done getting variables 18285 1726853401.88330: in VariableManager get_vars() 18285 1726853401.88340: Calling all_inventory to load vars for managed_node1 18285 1726853401.88355: Calling groups_inventory to load vars for managed_node1 18285 1726853401.88357: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.88361: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.88364: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.88366: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.88510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.88709: done with get_vars() 18285 1726853401.88722: done queuing things up, now waiting for results queue to drain 18285 1726853401.88723: results queue empty 18285 1726853401.88724: checking for any_errors_fatal 18285 1726853401.88725: done checking for any_errors_fatal 18285 1726853401.88726: checking for max_fail_percentage 18285 1726853401.88727: done checking for max_fail_percentage 18285 1726853401.88728: checking to see if all hosts have failed and 
the running result is not ok 18285 1726853401.88728: done checking to see if all hosts have failed 18285 1726853401.88729: getting the remaining hosts for this loop 18285 1726853401.88730: done getting the remaining hosts for this loop 18285 1726853401.88732: getting the next task for host managed_node1 18285 1726853401.88735: done getting next task for host managed_node1 18285 1726853401.88736: ^ task is: TASK: meta (flush_handlers) 18285 1726853401.88738: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853401.88740: getting variables 18285 1726853401.88741: in VariableManager get_vars() 18285 1726853401.88752: Calling all_inventory to load vars for managed_node1 18285 1726853401.88754: Calling groups_inventory to load vars for managed_node1 18285 1726853401.88756: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.88760: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.88763: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.88765: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.88910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.89135: done with get_vars() 18285 1726853401.89142: done getting variables 18285 1726853401.89189: in VariableManager get_vars() 18285 1726853401.89199: Calling all_inventory to load vars for managed_node1 18285 1726853401.89201: Calling groups_inventory to load vars for managed_node1 18285 1726853401.89203: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.89207: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.89209: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853401.89217: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.89356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.89555: done with get_vars() 18285 1726853401.89567: done queuing things up, now waiting for results queue to drain 18285 1726853401.89569: results queue empty 18285 1726853401.89569: checking for any_errors_fatal 18285 1726853401.89572: done checking for any_errors_fatal 18285 1726853401.89573: checking for max_fail_percentage 18285 1726853401.89574: done checking for max_fail_percentage 18285 1726853401.89575: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.89575: done checking to see if all hosts have failed 18285 1726853401.89576: getting the remaining hosts for this loop 18285 1726853401.89577: done getting the remaining hosts for this loop 18285 1726853401.89579: getting the next task for host managed_node1 18285 1726853401.89582: done getting next task for host managed_node1 18285 1726853401.89583: ^ task is: None 18285 1726853401.89584: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.89585: done queuing things up, now waiting for results queue to drain 18285 1726853401.89586: results queue empty 18285 1726853401.89587: checking for any_errors_fatal 18285 1726853401.89587: done checking for any_errors_fatal 18285 1726853401.89588: checking for max_fail_percentage 18285 1726853401.89589: done checking for max_fail_percentage 18285 1726853401.89589: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.89590: done checking to see if all hosts have failed 18285 1726853401.89591: getting the next task for host managed_node1 18285 1726853401.89593: done getting next task for host managed_node1 18285 1726853401.89594: ^ task is: None 18285 1726853401.89595: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.89628: in VariableManager get_vars() 18285 1726853401.89641: done with get_vars() 18285 1726853401.89657: in VariableManager get_vars() 18285 1726853401.89667: done with get_vars() 18285 1726853401.89673: variable 'omit' from source: magic vars 18285 1726853401.89701: in VariableManager get_vars() 18285 1726853401.89710: done with get_vars() 18285 1726853401.89730: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 18285 1726853401.89926: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853401.89946: getting the remaining hosts for this loop 18285 1726853401.89947: done getting the remaining hosts for this loop 18285 1726853401.89951: getting the next task for host managed_node1 18285 1726853401.89954: done getting next task for host managed_node1 18285 1726853401.89955: ^ task is: TASK: Gathering Facts 18285 1726853401.89956: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.89958: getting variables 18285 1726853401.89959: in VariableManager get_vars() 18285 1726853401.89965: Calling all_inventory to load vars for managed_node1 18285 1726853401.89967: Calling groups_inventory to load vars for managed_node1 18285 1726853401.89969: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.89978: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.89980: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.89982: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.90097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.90401: done with get_vars() 18285 1726853401.90409: done getting variables 18285 1726853401.90448: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 13:30:01 -0400 (0:00:00.080) 0:00:07.841 ****** 18285 1726853401.90478: entering _queue_task() for managed_node1/gather_facts 18285 1726853401.90793: worker is 1 (out of 1 available) 18285 1726853401.90810: exiting _queue_task() for managed_node1/gather_facts 18285 1726853401.90823: done queuing things up, now waiting for results queue to drain 18285 1726853401.90825: waiting for pending results... 
18285 1726853401.91156: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18285 1726853401.91165: in run() - task 02083763-bbaf-9200-7ca6-0000000002cb 18285 1726853401.91187: variable 'ansible_search_path' from source: unknown 18285 1726853401.91226: calling self._execute() 18285 1726853401.91309: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.91321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.91336: variable 'omit' from source: magic vars 18285 1726853401.91795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.93739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.93788: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.93814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.93840: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.93865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.93923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.93943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.93962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18285 1726853401.93997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.94007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.94106: variable 'ansible_distribution' from source: facts 18285 1726853401.94110: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.94125: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.94128: when evaluation is False, skipping this task 18285 1726853401.94130: _execute() done 18285 1726853401.94133: dumping result to json 18285 1726853401.94137: done dumping result, returning 18285 1726853401.94143: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-0000000002cb] 18285 1726853401.94151: sending task result for task 02083763-bbaf-9200-7ca6-0000000002cb 18285 1726853401.94227: done sending task result for task 02083763-bbaf-9200-7ca6-0000000002cb 18285 1726853401.94230: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.94281: no more pending results, returning what we have 18285 1726853401.94285: results queue empty 18285 1726853401.94286: checking for any_errors_fatal 18285 1726853401.94287: done checking for any_errors_fatal 18285 1726853401.94288: checking for max_fail_percentage 18285 1726853401.94289: done checking for max_fail_percentage 18285 1726853401.94290: checking to see if all hosts have 
failed and the running result is not ok 18285 1726853401.94291: done checking to see if all hosts have failed 18285 1726853401.94291: getting the remaining hosts for this loop 18285 1726853401.94293: done getting the remaining hosts for this loop 18285 1726853401.94296: getting the next task for host managed_node1 18285 1726853401.94302: done getting next task for host managed_node1 18285 1726853401.94304: ^ task is: TASK: meta (flush_handlers) 18285 1726853401.94306: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853401.94309: getting variables 18285 1726853401.94311: in VariableManager get_vars() 18285 1726853401.94341: Calling all_inventory to load vars for managed_node1 18285 1726853401.94344: Calling groups_inventory to load vars for managed_node1 18285 1726853401.94347: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.94361: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.94363: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.94366: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.94524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.94639: done with get_vars() 18285 1726853401.94646: done getting variables 18285 1726853401.94697: in VariableManager get_vars() 18285 1726853401.94705: Calling all_inventory to load vars for managed_node1 18285 1726853401.94707: Calling groups_inventory to load vars for managed_node1 18285 1726853401.94708: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.94711: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.94713: Calling 
groups_plugins_inventory to load vars for managed_node1 18285 1726853401.94715: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.94829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.94988: done with get_vars() 18285 1726853401.95000: done queuing things up, now waiting for results queue to drain 18285 1726853401.95002: results queue empty 18285 1726853401.95003: checking for any_errors_fatal 18285 1726853401.95005: done checking for any_errors_fatal 18285 1726853401.95006: checking for max_fail_percentage 18285 1726853401.95008: done checking for max_fail_percentage 18285 1726853401.95008: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.95009: done checking to see if all hosts have failed 18285 1726853401.95010: getting the remaining hosts for this loop 18285 1726853401.95010: done getting the remaining hosts for this loop 18285 1726853401.95013: getting the next task for host managed_node1 18285 1726853401.95016: done getting next task for host managed_node1 18285 1726853401.95019: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 18285 1726853401.95020: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853401.95022: getting variables 18285 1726853401.95023: in VariableManager get_vars() 18285 1726853401.95032: Calling all_inventory to load vars for managed_node1 18285 1726853401.95034: Calling groups_inventory to load vars for managed_node1 18285 1726853401.95036: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.95045: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.95048: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.95054: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.95185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.95368: done with get_vars() 18285 1726853401.95378: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 13:30:01 -0400 (0:00:00.049) 0:00:07.890 ****** 18285 1726853401.95446: entering _queue_task() for managed_node1/include_tasks 18285 1726853401.95737: worker is 1 (out of 1 available) 18285 1726853401.95753: exiting _queue_task() for managed_node1/include_tasks 18285 1726853401.95764: done queuing things up, now waiting for results queue to drain 18285 1726853401.95766: waiting for pending results... 
18285 1726853401.96058: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 18285 1726853401.96108: in run() - task 02083763-bbaf-9200-7ca6-000000000074 18285 1726853401.96119: variable 'ansible_search_path' from source: unknown 18285 1726853401.96149: calling self._execute() 18285 1726853401.96215: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853401.96219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853401.96229: variable 'omit' from source: magic vars 18285 1726853401.96544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853401.98094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853401.98377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853401.98381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853401.98384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853401.98386: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853401.98390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853401.98393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853401.98410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853401.98458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853401.98478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853401.98616: variable 'ansible_distribution' from source: facts 18285 1726853401.98631: variable 'ansible_distribution_major_version' from source: facts 18285 1726853401.98661: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853401.98669: when evaluation is False, skipping this task 18285 1726853401.98680: _execute() done 18285 1726853401.98687: dumping result to json 18285 1726853401.98695: done dumping result, returning 18285 1726853401.98709: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [02083763-bbaf-9200-7ca6-000000000074] 18285 1726853401.98719: sending task result for task 02083763-bbaf-9200-7ca6-000000000074 18285 1726853401.98833: done sending task result for task 02083763-bbaf-9200-7ca6-000000000074 18285 1726853401.98835: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 1726853401.98886: no more pending results, returning what we have 18285 1726853401.98892: results queue empty 18285 1726853401.98894: checking for any_errors_fatal 18285 1726853401.98899: done checking for any_errors_fatal 18285 1726853401.98900: checking for max_fail_percentage 18285 
1726853401.98902: done checking for max_fail_percentage 18285 1726853401.98903: checking to see if all hosts have failed and the running result is not ok 18285 1726853401.98904: done checking to see if all hosts have failed 18285 1726853401.98905: getting the remaining hosts for this loop 18285 1726853401.98906: done getting the remaining hosts for this loop 18285 1726853401.98914: getting the next task for host managed_node1 18285 1726853401.98921: done getting next task for host managed_node1 18285 1726853401.98927: ^ task is: TASK: Include the task 'assert_device_absent.yml' 18285 1726853401.98929: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853401.98934: getting variables 18285 1726853401.98936: in VariableManager get_vars() 18285 1726853401.99079: Calling all_inventory to load vars for managed_node1 18285 1726853401.99083: Calling groups_inventory to load vars for managed_node1 18285 1726853401.99087: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853401.99101: Calling all_plugins_play to load vars for managed_node1 18285 1726853401.99104: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853401.99107: Calling groups_plugins_play to load vars for managed_node1 18285 1726853401.99516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853401.99732: done with get_vars() 18285 1726853401.99741: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 13:30:01 -0400 (0:00:00.043) 0:00:07.934 ****** 
18285 1726853401.99829: entering _queue_task() for managed_node1/include_tasks 18285 1726853402.00049: worker is 1 (out of 1 available) 18285 1726853402.00061: exiting _queue_task() for managed_node1/include_tasks 18285 1726853402.00076: done queuing things up, now waiting for results queue to drain 18285 1726853402.00078: waiting for pending results... 18285 1726853402.00243: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 18285 1726853402.00301: in run() - task 02083763-bbaf-9200-7ca6-000000000075 18285 1726853402.00315: variable 'ansible_search_path' from source: unknown 18285 1726853402.00340: calling self._execute() 18285 1726853402.00399: variable 'ansible_host' from source: host vars for 'managed_node1' 18285 1726853402.00404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18285 1726853402.00424: variable 'omit' from source: magic vars 18285 1726853402.00991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18285 1726853402.03363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18285 1726853402.03664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18285 1726853402.03693: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18285 1726853402.03724: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18285 1726853402.03751: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18285 1726853402.03821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18285 1726853402.03843: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18285 1726853402.03866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18285 1726853402.03906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18285 1726853402.04075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18285 1726853402.04079: variable 'ansible_distribution' from source: facts 18285 1726853402.04082: variable 'ansible_distribution_major_version' from source: facts 18285 1726853402.04084: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18285 1726853402.04085: when evaluation is False, skipping this task 18285 1726853402.04092: _execute() done 18285 1726853402.04101: dumping result to json 18285 1726853402.04109: done dumping result, returning 18285 1726853402.04120: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [02083763-bbaf-9200-7ca6-000000000075] 18285 1726853402.04130: sending task result for task 02083763-bbaf-9200-7ca6-000000000075 18285 1726853402.04341: done sending task result for task 02083763-bbaf-9200-7ca6-000000000075 18285 1726853402.04344: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18285 
1726853402.04421: no more pending results, returning what we have 18285 1726853402.04424: results queue empty 18285 1726853402.04425: checking for any_errors_fatal 18285 1726853402.04430: done checking for any_errors_fatal 18285 1726853402.04431: checking for max_fail_percentage 18285 1726853402.04432: done checking for max_fail_percentage 18285 1726853402.04433: checking to see if all hosts have failed and the running result is not ok 18285 1726853402.04434: done checking to see if all hosts have failed 18285 1726853402.04435: getting the remaining hosts for this loop 18285 1726853402.04436: done getting the remaining hosts for this loop 18285 1726853402.04440: getting the next task for host managed_node1 18285 1726853402.04446: done getting next task for host managed_node1 18285 1726853402.04451: ^ task is: TASK: meta (flush_handlers) 18285 1726853402.04453: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853402.04457: getting variables 18285 1726853402.04458: in VariableManager get_vars() 18285 1726853402.04494: Calling all_inventory to load vars for managed_node1 18285 1726853402.04497: Calling groups_inventory to load vars for managed_node1 18285 1726853402.04502: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853402.04513: Calling all_plugins_play to load vars for managed_node1 18285 1726853402.04516: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853402.04519: Calling groups_plugins_play to load vars for managed_node1 18285 1726853402.04896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853402.05077: done with get_vars() 18285 1726853402.05090: done getting variables 18285 1726853402.05140: in VariableManager get_vars() 18285 1726853402.05147: Calling all_inventory to load vars for managed_node1 18285 1726853402.05148: Calling groups_inventory to load vars for managed_node1 18285 1726853402.05152: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853402.05155: Calling all_plugins_play to load vars for managed_node1 18285 1726853402.05156: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853402.05158: Calling groups_plugins_play to load vars for managed_node1 18285 1726853402.05331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853402.05520: done with get_vars() 18285 1726853402.05531: done queuing things up, now waiting for results queue to drain 18285 1726853402.05533: results queue empty 18285 1726853402.05534: checking for any_errors_fatal 18285 1726853402.05536: done checking for any_errors_fatal 18285 1726853402.05537: checking for max_fail_percentage 18285 1726853402.05537: done checking for max_fail_percentage 18285 1726853402.05538: checking to see if all hosts have failed and the running result is not 
ok 18285 1726853402.05539: done checking to see if all hosts have failed 18285 1726853402.05540: getting the remaining hosts for this loop 18285 1726853402.05540: done getting the remaining hosts for this loop 18285 1726853402.05543: getting the next task for host managed_node1 18285 1726853402.05546: done getting next task for host managed_node1 18285 1726853402.05548: ^ task is: TASK: meta (flush_handlers) 18285 1726853402.05549: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18285 1726853402.05551: getting variables 18285 1726853402.05552: in VariableManager get_vars() 18285 1726853402.05560: Calling all_inventory to load vars for managed_node1 18285 1726853402.05561: Calling groups_inventory to load vars for managed_node1 18285 1726853402.05563: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853402.05575: Calling all_plugins_play to load vars for managed_node1 18285 1726853402.05577: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853402.05587: Calling groups_plugins_play to load vars for managed_node1 18285 1726853402.05751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853402.05945: done with get_vars() 18285 1726853402.05953: done getting variables 18285 1726853402.05997: in VariableManager get_vars() 18285 1726853402.06005: Calling all_inventory to load vars for managed_node1 18285 1726853402.06007: Calling groups_inventory to load vars for managed_node1 18285 1726853402.06009: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853402.06020: Calling all_plugins_play to load vars for managed_node1 18285 1726853402.06023: Calling groups_plugins_inventory to load vars for 
managed_node1 18285 1726853402.06026: Calling groups_plugins_play to load vars for managed_node1 18285 1726853402.06169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853402.06373: done with get_vars() 18285 1726853402.06384: done queuing things up, now waiting for results queue to drain 18285 1726853402.06386: results queue empty 18285 1726853402.06387: checking for any_errors_fatal 18285 1726853402.06388: done checking for any_errors_fatal 18285 1726853402.06389: checking for max_fail_percentage 18285 1726853402.06390: done checking for max_fail_percentage 18285 1726853402.06390: checking to see if all hosts have failed and the running result is not ok 18285 1726853402.06391: done checking to see if all hosts have failed 18285 1726853402.06392: getting the remaining hosts for this loop 18285 1726853402.06392: done getting the remaining hosts for this loop 18285 1726853402.06395: getting the next task for host managed_node1 18285 1726853402.06397: done getting next task for host managed_node1 18285 1726853402.06398: ^ task is: None 18285 1726853402.06399: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853402.06401: done queuing things up, now waiting for results queue to drain 18285 1726853402.06401: results queue empty 18285 1726853402.06402: checking for any_errors_fatal 18285 1726853402.06403: done checking for any_errors_fatal 18285 1726853402.06403: checking for max_fail_percentage 18285 1726853402.06404: done checking for max_fail_percentage 18285 1726853402.06405: checking to see if all hosts have failed and the running result is not ok 18285 1726853402.06406: done checking to see if all hosts have failed 18285 1726853402.06407: getting the next task for host managed_node1 18285 1726853402.06409: done getting next task for host managed_node1 18285 1726853402.06409: ^ task is: None 18285 1726853402.06411: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853402.06439: in VariableManager get_vars() 18285 1726853402.06475: done with get_vars() 18285 1726853402.06481: in VariableManager get_vars() 18285 1726853402.06489: done with get_vars() 18285 1726853402.06493: variable 'omit' from source: magic vars 18285 1726853402.06520: in VariableManager get_vars() 18285 1726853402.06528: done with get_vars() 18285 1726853402.06548: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18285 1726853402.06739: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18285 1726853402.06761: getting the remaining hosts for this loop 18285 1726853402.06762: done getting the remaining hosts for this loop 18285 1726853402.06764: getting the next task for host managed_node1 18285 1726853402.06767: done getting next task for host managed_node1 18285 1726853402.06768: ^ task is: TASK: Gathering Facts 18285 1726853402.06770: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18285 1726853402.06778: getting variables 18285 1726853402.06779: in VariableManager get_vars() 18285 1726853402.06787: Calling all_inventory to load vars for managed_node1 18285 1726853402.06789: Calling groups_inventory to load vars for managed_node1 18285 1726853402.06791: Calling all_plugins_inventory to load vars for managed_node1 18285 1726853402.06796: Calling all_plugins_play to load vars for managed_node1 18285 1726853402.06798: Calling groups_plugins_inventory to load vars for managed_node1 18285 1726853402.06801: Calling groups_plugins_play to load vars for managed_node1 18285 1726853402.07000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18285 1726853402.07460: done with get_vars() 18285 1726853402.07468: done getting variables 18285 1726853402.07556: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 13:30:02 -0400 (0:00:00.077) 0:00:08.012 ****** 18285 1726853402.07582: entering _queue_task() for managed_node1/gather_facts 18285 1726853402.08241: worker is 1 (out of 1 available) 18285 1726853402.08254: exiting _queue_task() for managed_node1/gather_facts 18285 1726853402.08266: done queuing things up, now waiting for results queue to drain 18285 1726853402.08267: waiting for pending results... 
18285 1726853402.08499: running TaskExecutor() for managed_node1/TASK: Gathering Facts
18285 1726853402.08560: in run() - task 02083763-bbaf-9200-7ca6-0000000002e3
18285 1726853402.08584: variable 'ansible_search_path' from source: unknown
18285 1726853402.08705: calling self._execute()
18285 1726853402.08710: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853402.08723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853402.08740: variable 'omit' from source: magic vars
18285 1726853402.09175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853402.11679: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853402.11746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853402.11789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853402.11824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853402.11949: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853402.11974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853402.12010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853402.12041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853402.12093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853402.12115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853402.12259: variable 'ansible_distribution' from source: facts
18285 1726853402.12275: variable 'ansible_distribution_major_version' from source: facts
18285 1726853402.12317: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853402.12326: when evaluation is False, skipping this task
18285 1726853402.12333: _execute() done
18285 1726853402.12340: dumping result to json
18285 1726853402.12348: done dumping result, returning
18285 1726853402.12359: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-9200-7ca6-0000000002e3]
18285 1726853402.12369: sending task result for task 02083763-bbaf-9200-7ca6-0000000002e3
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853402.12527: no more pending results, returning what we have
18285 1726853402.12531: results queue empty
18285 1726853402.12532: checking for any_errors_fatal
18285 1726853402.12533: done checking for any_errors_fatal
18285 1726853402.12534: checking for max_fail_percentage
18285 1726853402.12535: done checking for max_fail_percentage
18285 1726853402.12536: checking to see if all hosts have failed and the running result is not ok
18285 1726853402.12536: done checking to see if all hosts have failed
18285 1726853402.12537: getting the remaining hosts for this loop
18285 1726853402.12538: done getting the remaining hosts for this loop
18285 1726853402.12542: getting the next task for host managed_node1
18285 1726853402.12547: done getting next task for host managed_node1
18285 1726853402.12551: ^ task is: TASK: meta (flush_handlers)
18285 1726853402.12553: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853402.12556: getting variables
18285 1726853402.12557: in VariableManager get_vars()
18285 1726853402.12598: Calling all_inventory to load vars for managed_node1
18285 1726853402.12600: Calling groups_inventory to load vars for managed_node1
18285 1726853402.12605: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.12620: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.12623: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.12627: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.12928: done sending task result for task 02083763-bbaf-9200-7ca6-0000000002e3
18285 1726853402.12931: WORKER PROCESS EXITING
18285 1726853402.12957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.13418: done with get_vars()
18285 1726853402.13430: done getting variables
18285 1726853402.13505: in VariableManager get_vars()
18285 1726853402.13515: Calling all_inventory to load vars for managed_node1
18285 1726853402.13517: Calling groups_inventory to load vars for managed_node1
18285 1726853402.13520: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.13524: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.13526: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.13530: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.13667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.13880: done with get_vars()
18285 1726853402.13892: done queuing things up, now waiting for results queue to drain
18285 1726853402.13894: results queue empty
18285 1726853402.13899: checking for any_errors_fatal
18285 1726853402.13901: done checking for any_errors_fatal
18285 1726853402.13902: checking for max_fail_percentage
18285 1726853402.13903: done checking for max_fail_percentage
18285 1726853402.13903: checking to see if all hosts have failed and the running result is not ok
18285 1726853402.13904: done checking to see if all hosts have failed
18285 1726853402.13905: getting the remaining hosts for this loop
18285 1726853402.13906: done getting the remaining hosts for this loop
18285 1726853402.13908: getting the next task for host managed_node1
18285 1726853402.13912: done getting next task for host managed_node1
18285 1726853402.13914: ^ task is: TASK: Verify network state restored to default
18285 1726853402.13916: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853402.13918: getting variables
18285 1726853402.13919: in VariableManager get_vars()
18285 1726853402.13926: Calling all_inventory to load vars for managed_node1
18285 1726853402.13928: Calling groups_inventory to load vars for managed_node1
18285 1726853402.13930: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.13940: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.13943: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.13945: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.14088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.14316: done with get_vars()
18285 1726853402.14324: done getting variables

TASK [Verify network state restored to default] ********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80
Friday 20 September 2024  13:30:02 -0400 (0:00:00.068)       0:00:08.080 ******
18285 1726853402.14409: entering _queue_task() for managed_node1/include_tasks
18285 1726853402.14736: worker is 1 (out of 1 available)
18285 1726853402.14749: exiting _queue_task() for managed_node1/include_tasks
18285 1726853402.14760: done queuing things up, now waiting for results queue to drain
18285 1726853402.14762: waiting for pending results...
18285 1726853402.15073: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default
18285 1726853402.15227: in run() - task 02083763-bbaf-9200-7ca6-000000000078
18285 1726853402.15246: variable 'ansible_search_path' from source: unknown
18285 1726853402.15286: calling self._execute()
18285 1726853402.15446: variable 'ansible_host' from source: host vars for 'managed_node1'
18285 1726853402.15450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18285 1726853402.15453: variable 'omit' from source: magic vars
18285 1726853402.15937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18285 1726853402.17642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18285 1726853402.17686: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18285 1726853402.17716: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18285 1726853402.17742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18285 1726853402.17762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18285 1726853402.17822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18285 1726853402.17845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18285 1726853402.17864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18285 1726853402.17891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18285 1726853402.17902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18285 1726853402.17998: variable 'ansible_distribution' from source: facts
18285 1726853402.18004: variable 'ansible_distribution_major_version' from source: facts
18285 1726853402.18021: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
18285 1726853402.18025: when evaluation is False, skipping this task
18285 1726853402.18028: _execute() done
18285 1726853402.18030: dumping result to json
18285 1726853402.18032: done dumping result, returning
18285 1726853402.18038: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-9200-7ca6-000000000078]
18285 1726853402.18044: sending task result for task 02083763-bbaf-9200-7ca6-000000000078
18285 1726853402.18133: done sending task result for task 02083763-bbaf-9200-7ca6-000000000078
18285 1726853402.18135: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
18285 1726853402.18198: no more pending results, returning what we have
18285 1726853402.18202: results queue empty
18285 1726853402.18203: checking for any_errors_fatal
18285 1726853402.18205: done checking for any_errors_fatal
18285 1726853402.18205: checking for max_fail_percentage
18285 1726853402.18207: done checking for max_fail_percentage
18285 1726853402.18208: checking to see if all hosts have failed and the running result is not ok
18285 1726853402.18209: done checking to see if all hosts have failed
18285 1726853402.18209: getting the remaining hosts for this loop
18285 1726853402.18210: done getting the remaining hosts for this loop
18285 1726853402.18215: getting the next task for host managed_node1
18285 1726853402.18221: done getting next task for host managed_node1
18285 1726853402.18223: ^ task is: TASK: meta (flush_handlers)
18285 1726853402.18224: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853402.18229: getting variables
18285 1726853402.18230: in VariableManager get_vars()
18285 1726853402.18262: Calling all_inventory to load vars for managed_node1
18285 1726853402.18264: Calling groups_inventory to load vars for managed_node1
18285 1726853402.18267: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.18280: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.18282: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.18290: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.18475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.18663: done with get_vars()
18285 1726853402.18675: done getting variables
18285 1726853402.18746: in VariableManager get_vars()
18285 1726853402.18756: Calling all_inventory to load vars for managed_node1
18285 1726853402.18758: Calling groups_inventory to load vars for managed_node1
18285 1726853402.18761: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.18765: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.18768: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.18770: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.18948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.19143: done with get_vars()
18285 1726853402.19155: done queuing things up, now waiting for results queue to drain
18285 1726853402.19157: results queue empty
18285 1726853402.19157: checking for any_errors_fatal
18285 1726853402.19160: done checking for any_errors_fatal
18285 1726853402.19160: checking for max_fail_percentage
18285 1726853402.19162: done checking for max_fail_percentage
18285 1726853402.19162: checking to see if all hosts have failed and the running result is not ok
18285 1726853402.19163: done checking to see if all hosts have failed
18285 1726853402.19163: getting the remaining hosts for this loop
18285 1726853402.19164: done getting the remaining hosts for this loop
18285 1726853402.19167: getting the next task for host managed_node1
18285 1726853402.19170: done getting next task for host managed_node1
18285 1726853402.19174: ^ task is: TASK: meta (flush_handlers)
18285 1726853402.19176: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853402.19178: getting variables
18285 1726853402.19179: in VariableManager get_vars()
18285 1726853402.19187: Calling all_inventory to load vars for managed_node1
18285 1726853402.19189: Calling groups_inventory to load vars for managed_node1
18285 1726853402.19191: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.19200: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.19203: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.19205: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.19347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.19507: done with get_vars()
18285 1726853402.19513: done getting variables
18285 1726853402.19542: in VariableManager get_vars()
18285 1726853402.19547: Calling all_inventory to load vars for managed_node1
18285 1726853402.19549: Calling groups_inventory to load vars for managed_node1
18285 1726853402.19551: Calling all_plugins_inventory to load vars for managed_node1
18285 1726853402.19555: Calling all_plugins_play to load vars for managed_node1
18285 1726853402.19557: Calling groups_plugins_inventory to load vars for managed_node1
18285 1726853402.19559: Calling groups_plugins_play to load vars for managed_node1
18285 1726853402.19637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18285 1726853402.19761: done with get_vars()
18285 1726853402.19769: done queuing things up, now waiting for results queue to drain
18285 1726853402.19772: results queue empty
18285 1726853402.19773: checking for any_errors_fatal
18285 1726853402.19774: done checking for any_errors_fatal
18285 1726853402.19775: checking for max_fail_percentage
18285 1726853402.19776: done checking for max_fail_percentage
18285 1726853402.19777: checking to see if all hosts have failed and the running result is not ok
18285 1726853402.19777: done checking to see if all hosts have failed
18285 1726853402.19777: getting the remaining hosts for this loop
18285 1726853402.19778: done getting the remaining hosts for this loop
18285 1726853402.19780: getting the next task for host managed_node1
18285 1726853402.19782: done getting next task for host managed_node1
18285 1726853402.19782: ^ task is: None
18285 1726853402.19783: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18285 1726853402.19784: done queuing things up, now waiting for results queue to drain
18285 1726853402.19784: results queue empty
18285 1726853402.19785: checking for any_errors_fatal
18285 1726853402.19785: done checking for any_errors_fatal
18285 1726853402.19786: checking for max_fail_percentage
18285 1726853402.19786: done checking for max_fail_percentage
18285 1726853402.19787: checking to see if all hosts have failed and the running result is not ok
18285 1726853402.19787: done checking to see if all hosts have failed
18285 1726853402.19788: getting the next task for host managed_node1
18285 1726853402.19789: done getting next task for host managed_node1
18285 1726853402.19789: ^ task is: None
18285 1726853402.19790: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=6    changed=0    unreachable=0    failed=0    skipped=94   rescued=0    ignored=0

Friday 20 September 2024  13:30:02 -0400 (0:00:00.054)       0:00:08.134 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.60s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:5
Check if system is ostree ----------------------------------------------- 0.84s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gathering Facts --------------------------------------------------------- 0.14s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later --- 0.11s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 --- 0.11s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.10s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.10s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.10s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.10s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces --- 0.10s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Show inside ethernet tests ---------------------------------------------- 0.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.09s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Include the task 'enable_epel.yml' -------------------------------------- 0.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Include the task 'assert_output_in_stderr_without_warnings.yml' --------- 0.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 0.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Show network_provider --------------------------------------------------- 0.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
18285 1726853402.19853: RUNNING CLEANUP