6585 1727196498.63547: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-gLm executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 6585 1727196498.63809: Added group all to inventory 6585 1727196498.63811: Added group ungrouped to inventory 6585 1727196498.63813: Group all now contains ungrouped 6585 1727196498.63815: Examining possible inventory source: /tmp/ad_integration-csD/inventory-kwd.yml 6585 1727196498.72285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 6585 1727196498.72325: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 6585 1727196498.72343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 6585 1727196498.72379: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 6585 1727196498.72427: Loaded config def from plugin (inventory/script) 6585 1727196498.72429: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 6585 1727196498.72458: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 6585 1727196498.72511: Loaded config def from plugin (inventory/yaml) 6585 1727196498.72512: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 6585 1727196498.72573: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 6585 1727196498.72841: 
Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 6585 1727196498.72844: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 6585 1727196498.72846: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 6585 1727196498.72850: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 6585 1727196498.72853: Loading data from /tmp/ad_integration-csD/inventory-kwd.yml 6585 1727196498.72897: /tmp/ad_integration-csD/inventory-kwd.yml was not parsable by auto 6585 1727196498.72940: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 6585 1727196498.72966: Loading data from /tmp/ad_integration-csD/inventory-kwd.yml 6585 1727196498.73017: group all already in inventory 6585 1727196498.73025: set inventory_file for managed-node1 6585 1727196498.73028: set inventory_dir for managed-node1 6585 1727196498.73029: Added host managed-node1 to inventory 6585 1727196498.73031: Added host managed-node1 to group all 6585 1727196498.73031: set ansible_host for managed-node1 6585 1727196498.73032: set ansible_ssh_extra_args for managed-node1 6585 1727196498.73034: set inventory_file for managed-node2 6585 1727196498.73036: set inventory_dir for managed-node2 6585 1727196498.73036: Added host managed-node2 to inventory 6585 1727196498.73037: Added host managed-node2 to group all 6585 1727196498.73038: set ansible_host for managed-node2 6585 1727196498.73038: set ansible_ssh_extra_args for managed-node2 6585 1727196498.73040: set inventory_file for managed-node3 6585 1727196498.73041: set inventory_dir for managed-node3 6585 1727196498.73041: Added host managed-node3 to inventory 6585 1727196498.73042: Added host managed-node3 to group all 6585 1727196498.73043: set ansible_host for managed-node3 6585 
1727196498.73043: set ansible_ssh_extra_args for managed-node3 6585 1727196498.73045: Reconcile groups and hosts in inventory. 6585 1727196498.73047: Group ungrouped now contains managed-node1 6585 1727196498.73048: Group ungrouped now contains managed-node2 6585 1727196498.73049: Group ungrouped now contains managed-node3 6585 1727196498.73099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 6585 1727196498.73177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 6585 1727196498.73207: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 6585 1727196498.73227: Loaded config def from plugin (vars/host_group_vars) 6585 1727196498.73229: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 6585 1727196498.73234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 6585 1727196498.73239: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 6585 1727196498.73267: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 6585 1727196498.73494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.73563: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 6585 1727196498.73585: Loaded config def from plugin (connection/local) 6585 1727196498.73587: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 6585 1727196498.73960: Loaded 
config def from plugin (connection/paramiko_ssh) 6585 1727196498.73963: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 6585 1727196498.74503: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 6585 1727196498.74531: Loaded config def from plugin (connection/psrp) 6585 1727196498.74533: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 6585 1727196498.74932: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 6585 1727196498.74957: Loaded config def from plugin (connection/ssh) 6585 1727196498.74959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 6585 1727196498.76177: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 6585 1727196498.76199: Loaded config def from plugin (connection/winrm) 6585 1727196498.76201: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 6585 1727196498.76220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 6585 1727196498.76267: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 6585 1727196498.76305: Loaded config def from plugin (shell/cmd) 6585 1727196498.76306: Loading ShellModule 'cmd' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) [WARNING]: Could not match supplied host pattern, ignoring: ad 6585 1727196498.76327: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 6585 1727196498.76363: Loaded config def from plugin (shell/powershell) 6585 1727196498.76365: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 6585 1727196498.76401: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 6585 1727196498.76504: Loaded config def from plugin (shell/sh) 6585 1727196498.76506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 6585 1727196498.76531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 6585 1727196498.76601: Loaded config def from plugin (become/runas) 6585 1727196498.76603: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 6585 1727196498.76710: Loaded config def from plugin (become/su) 6585 1727196498.76711: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 6585 1727196498.76805: Loaded config def from plugin (become/sudo) 6585 1727196498.76807: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 6585 1727196498.76831: Loading data from 
/tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml 6585 1727196498.77098: trying /usr/local/lib/python3.12/site-packages/ansible/modules 6585 1727196498.78929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 6585 1727196498.78998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 6585 1727196498.79007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 6585 1727196498.79159: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 6585 1727196498.79249: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 6585 1727196498.79251: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-gLm/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 6585 1727196498.79272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 6585 1727196498.79287: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 6585 1727196498.79384: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 6585 1727196498.79418: Loaded config def from plugin (callback/default) 6585 1727196498.79420: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 6585 1727196498.80126: Loaded config def from plugin (callback/junit) 6585 1727196498.80128: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 6585 1727196498.80156: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 6585 1727196498.80194: Loaded config def from plugin (callback/minimal) 6585 1727196498.80195: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 6585 1727196498.80220: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 6585 1727196498.80260: Loaded config def from plugin (callback/tree) 6585 1727196498.80261: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) 
ansible.builtin.profile_tasks to ansible.posix.profile_tasks 6585 1727196498.80341: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 6585 1727196498.80343: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-gLm/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_default.yml **************************************************** 1 plays in /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml 6585 1727196498.80359: in VariableManager get_vars() 6585 1727196498.80377: Could not match supplied host pattern, ignoring: ad 6585 1727196498.80388: done with get_vars() 6585 1727196498.80392: in VariableManager get_vars() 6585 1727196498.80398: done with get_vars() 6585 1727196498.80401: variable 'omit' from source: magic vars 6585 1727196498.80428: in VariableManager get_vars() 6585 1727196498.80437: done with get_vars() 6585 1727196498.80450: variable 'omit' from source: magic vars PLAY [Ensure role behaviour with default parameters] *************************** 6585 1727196498.80786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 6585 1727196498.80834: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 6585 1727196498.80859: getting the remaining hosts for this loop 6585 1727196498.80861: done getting the remaining hosts for this loop 6585 1727196498.80863: getting the next task for host 
managed-node2 6585 1727196498.80865: done getting next task for host managed-node2 6585 1727196498.80867: ^ task is: TASK: meta (flush_handlers) 6585 1727196498.80868: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 6585 1727196498.80872: getting variables 6585 1727196498.80873: in VariableManager get_vars() 6585 1727196498.80879: Calling all_inventory to load vars for managed-node2 6585 1727196498.80880: Calling groups_inventory to load vars for managed-node2 6585 1727196498.80882: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.80890: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.80896: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.80898: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.80918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.80955: done with get_vars() 6585 1727196498.80960: done getting variables 6585 1727196498.80985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 6585 1727196498.81014: in VariableManager get_vars() 6585 1727196498.81020: Calling all_inventory to load vars for managed-node2 6585 1727196498.81027: Calling groups_inventory to load vars for managed-node2 6585 1727196498.81028: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.81031: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.81033: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.81034: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.81051: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.81060: done with get_vars() 6585 1727196498.81068: done queuing things up, now waiting for results queue to drain 6585 1727196498.81069: results queue empty 6585 1727196498.81069: checking for any_errors_fatal 6585 1727196498.81070: done checking for any_errors_fatal 6585 1727196498.81071: checking for max_fail_percentage 6585 1727196498.81071: done checking for max_fail_percentage 6585 1727196498.81072: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.81072: done checking to see if all hosts have failed 6585 1727196498.81072: getting the remaining hosts for this loop 6585 1727196498.81073: done getting the remaining hosts for this loop 6585 1727196498.81075: getting the next task for host managed-node2 6585 1727196498.81077: done getting next task for host managed-node2 6585 1727196498.81078: ^ task is: TASK: Include the role 6585 1727196498.81079: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 6585 1727196498.81080: getting variables 6585 1727196498.81081: in VariableManager get_vars() 6585 1727196498.81096: Calling all_inventory to load vars for managed-node2 6585 1727196498.81097: Calling groups_inventory to load vars for managed-node2 6585 1727196498.81099: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.81102: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.81103: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.81105: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.81120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.81133: done with get_vars() 6585 1727196498.81137: done getting variables TASK [Include the role] ******************************************************** task path: /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:11 Tuesday 24 September 2024 12:48:18 -0400 (0:00:00.008) 0:00:00.008 ***** 6585 1727196498.81179: entering _queue_task() for managed-node2/include_role 6585 1727196498.81180: Creating lock for include_role 6585 1727196498.81390: worker is 1 (out of 1 available) 6585 1727196498.81404: exiting _queue_task() for managed-node2/include_role 6585 1727196498.81415: done queuing things up, now waiting for results queue to drain 6585 1727196498.81418: waiting for pending results... 
6585 1727196498.81541: running TaskExecutor() for managed-node2/TASK: Include the role 6585 1727196498.81593: in run() - task 0e890950-22ad-228f-c704-000000000006 6585 1727196498.81602: variable 'ansible_search_path' from source: unknown 6585 1727196498.81635: calling self._execute() 6585 1727196498.81678: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.81681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 6585 1727196498.81690: variable 'omit' from source: magic vars 6585 1727196498.81761: _execute() done 6585 1727196498.81766: dumping result to json 6585 1727196498.81769: done dumping result, returning 6585 1727196498.81772: done running TaskExecutor() for managed-node2/TASK: Include the role [0e890950-22ad-228f-c704-000000000006] 6585 1727196498.81780: sending task result for task 0e890950-22ad-228f-c704-000000000006 6585 1727196498.81868: done sending task result for task 0e890950-22ad-228f-c704-000000000006 6585 1727196498.81871: WORKER PROCESS EXITING 6585 1727196498.81917: no more pending results, returning what we have 6585 1727196498.81925: in VariableManager get_vars() 6585 1727196498.81947: Calling all_inventory to load vars for managed-node2 6585 1727196498.81950: Calling groups_inventory to load vars for managed-node2 6585 1727196498.81952: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.81959: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.81961: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.81963: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.81986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.81998: done with get_vars() 6585 1727196498.82001: variable 'ansible_search_path' from source: unknown 6585 1727196498.82048: variable 'omit' from source: magic vars 6585 1727196498.82064: variable 'omit' 
from source: magic vars 6585 1727196498.82073: variable 'omit' from source: magic vars 6585 1727196498.82075: we have included files to process 6585 1727196498.82076: generating all_blocks data 6585 1727196498.82076: done generating all_blocks data 6585 1727196498.82077: processing included file: fedora.linux_system_roles.ad_integration 6585 1727196498.82089: in VariableManager get_vars() 6585 1727196498.82097: done with get_vars() 6585 1727196498.82142: in VariableManager get_vars() 6585 1727196498.82150: done with get_vars() 6585 1727196498.82176: Loading data from /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/vars/main.yml 6585 1727196498.82349: Loading data from /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/defaults/main.yml 6585 1727196498.82419: Loading data from /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/meta/main.yml 6585 1727196498.82502: Loading data from /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml 6585 1727196498.84635: in VariableManager get_vars() 6585 1727196498.84648: done with get_vars() 6585 1727196498.86243: Loading data from /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/handlers/main.yml 6585 1727196498.86651: iterating over new_blocks loaded from include file 6585 1727196498.86653: in VariableManager get_vars() 6585 1727196498.86662: done with get_vars() 6585 1727196498.86663: filtering new block on tags 6585 1727196498.86692: done filtering new block on tags 6585 1727196498.86694: in VariableManager get_vars() 6585 1727196498.86701: done with get_vars() 6585 1727196498.86702: filtering new block on tags 6585 1727196498.86711: done filtering new block on tags 6585 1727196498.86712: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.ad_integration for managed-node2 6585 
1727196498.86714: extending task lists for all hosts with included blocks 6585 1727196498.86746: done extending task lists 6585 1727196498.86747: done processing included files 6585 1727196498.86747: results queue empty 6585 1727196498.86747: checking for any_errors_fatal 6585 1727196498.86748: done checking for any_errors_fatal 6585 1727196498.86749: checking for max_fail_percentage 6585 1727196498.86750: done checking for max_fail_percentage 6585 1727196498.86750: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.86751: done checking to see if all hosts have failed 6585 1727196498.86751: getting the remaining hosts for this loop 6585 1727196498.86752: done getting the remaining hosts for this loop 6585 1727196498.86753: getting the next task for host managed-node2 6585 1727196498.86756: done getting next task for host managed-node2 6585 1727196498.86757: ^ task is: TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available 6585 1727196498.86758: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 6585 1727196498.86764: getting variables 6585 1727196498.86765: in VariableManager get_vars() 6585 1727196498.86773: Calling all_inventory to load vars for managed-node2 6585 1727196498.86775: Calling groups_inventory to load vars for managed-node2 6585 1727196498.86786: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.86789: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.86790: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.86792: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.86809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.86825: done with get_vars() 6585 1727196498.86830: done getting variables 6585 1727196498.86871: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available] *** task path: /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3 Tuesday 24 September 2024 12:48:18 -0400 (0:00:00.057) 0:00:00.066 ***** 6585 1727196498.86890: entering _queue_task() for managed-node2/fail 6585 1727196498.86891: Creating lock for fail 6585 1727196498.87072: worker is 1 (out of 1 available) 6585 1727196498.87085: exiting _queue_task() for managed-node2/fail 6585 1727196498.87098: done queuing things up, now waiting for results queue to drain 6585 1727196498.87099: waiting for pending results... 
6585 1727196498.87233: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available 6585 1727196498.87301: in run() - task 0e890950-22ad-228f-c704-000000000023 6585 1727196498.87312: variable 'ansible_search_path' from source: unknown 6585 1727196498.87315: variable 'ansible_search_path' from source: unknown 6585 1727196498.87354: calling self._execute() 6585 1727196498.87396: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.87400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 6585 1727196498.87408: variable 'omit' from source: magic vars 6585 1727196498.87710: variable 'ad_integration_realm' from source: role '' defaults 6585 1727196498.87717: Evaluated conditional (not ad_integration_realm): True 6585 1727196498.87727: variable 'omit' from source: magic vars 6585 1727196498.87753: variable 'omit' from source: magic vars 6585 1727196498.87780: variable 'omit' from source: magic vars 6585 1727196498.87809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 6585 1727196498.87836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 6585 1727196498.87851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 6585 1727196498.87863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 6585 1727196498.87875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 6585 1727196498.87894: variable 'inventory_hostname' from source: host vars for 'managed-node2' 6585 1727196498.87900: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.87904: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node2' 6585 1727196498.87971: Set connection var ansible_shell_type to sh 6585 1727196498.87974: Set connection var ansible_timeout to 10 6585 1727196498.87984: Set connection var ansible_module_compression to ZIP_DEFLATED 6585 1727196498.87989: Set connection var ansible_connection to ssh 6585 1727196498.87994: Set connection var ansible_pipelining to False 6585 1727196498.87999: Set connection var ansible_shell_executable to /bin/sh 6585 1727196498.88016: variable 'ansible_shell_executable' from source: unknown 6585 1727196498.88019: variable 'ansible_connection' from source: unknown 6585 1727196498.88023: variable 'ansible_module_compression' from source: unknown 6585 1727196498.88028: variable 'ansible_shell_type' from source: unknown 6585 1727196498.88030: variable 'ansible_shell_executable' from source: unknown 6585 1727196498.88034: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.88038: variable 'ansible_pipelining' from source: unknown 6585 1727196498.88041: variable 'ansible_timeout' from source: unknown 6585 1727196498.88045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 6585 1727196498.88149: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 6585 1727196498.88157: variable 'omit' from source: magic vars 6585 1727196498.88162: starting attempt loop 6585 1727196498.88164: running the handler 6585 1727196498.88172: handler run complete 6585 1727196498.88195: attempt loop complete, returning result 6585 1727196498.88198: _execute() done 6585 1727196498.88201: dumping result to json 6585 1727196498.88203: done dumping result, returning 6585 1727196498.88209: done running TaskExecutor() for managed-node2/TASK: 
fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available [0e890950-22ad-228f-c704-000000000023] 6585 1727196498.88212: sending task result for task 0e890950-22ad-228f-c704-000000000023 6585 1727196498.88291: done sending task result for task 0e890950-22ad-228f-c704-000000000023 6585 1727196498.88294: WORKER PROCESS EXITING 6585 1727196498.88327: marking managed-node2 as failed 6585 1727196498.88332: marking host managed-node2 failed, current state: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 6585 1727196498.88338: ^ failed state is now: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=5, fail_state=2, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 6585 1727196498.88341: getting the next task for host managed-node2 6585 1727196498.88344: done getting next task for host managed-node2 6585 1727196498.88347: ^ task is: TASK: Assert that user is notified about missing variables 6585 1727196498.88348: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
fatal: [managed-node2]: FAILED! => {
    "changed": false
}

MSG:

Variable ad_integration_realm must be provided!

6585 1727196498.88452: no more pending results, returning what we have 6585 1727196498.88455: results queue empty 6585 1727196498.88456: checking for any_errors_fatal 6585 1727196498.88456: done checking for any_errors_fatal 6585 1727196498.88457: checking for max_fail_percentage 6585 1727196498.88458: done checking for max_fail_percentage 6585 1727196498.88458: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.88459: done checking to see if all hosts have failed 6585 1727196498.88459: getting the remaining hosts for this loop 6585 1727196498.88460: done getting the remaining hosts for this loop 6585 1727196498.88462: getting the next task for host managed-node2 6585 1727196498.88464: done getting next task for host managed-node2 6585 1727196498.88465: ^ task is: TASK: Assert that user is notified about missing variables 6585 1727196498.88466: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 6585 1727196498.88469: getting variables 6585 1727196498.88470: in VariableManager get_vars() 6585 1727196498.88484: Calling all_inventory to load vars for managed-node2 6585 1727196498.88485: Calling groups_inventory to load vars for managed-node2 6585 1727196498.88487: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.88496: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.88498: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.88499: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.88525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.88537: done with get_vars() 6585 1727196498.88542: done getting variables 6585 1727196498.88601: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Assert that user is notified about missing variables] ********************
task path: /tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:16
Tuesday 24 September 2024  12:48:18 -0400 (0:00:00.017)       0:00:00.083 *****
6585 1727196498.88617: entering _queue_task() for managed-node2/assert 6585 1727196498.88618: Creating lock for assert 6585 1727196498.88780: worker is 1 (out of 1 available) 6585 1727196498.88793: exiting _queue_task() for managed-node2/assert 6585 1727196498.88803: done queuing things up, now waiting for results queue to drain 6585 1727196498.88804: waiting for pending results... 
6585 1727196498.88924: running TaskExecutor() for managed-node2/TASK: Assert that user is notified about missing variables 6585 1727196498.88966: in run() - task 0e890950-22ad-228f-c704-000000000007 6585 1727196498.88976: variable 'ansible_search_path' from source: unknown 6585 1727196498.89001: calling self._execute() 6585 1727196498.89046: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.89050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 6585 1727196498.89058: variable 'omit' from source: magic vars 6585 1727196498.89153: variable 'omit' from source: magic vars 6585 1727196498.89173: variable 'omit' from source: magic vars 6585 1727196498.89195: variable 'omit' from source: magic vars 6585 1727196498.89225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 6585 1727196498.89250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 6585 1727196498.89267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 6585 1727196498.89279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 6585 1727196498.89288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 6585 1727196498.89307: variable 'inventory_hostname' from source: host vars for 'managed-node2' 6585 1727196498.89311: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.89313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 6585 1727196498.89378: Set connection var ansible_shell_type to sh 6585 1727196498.89384: Set connection var ansible_timeout to 10 6585 1727196498.89390: Set connection var ansible_module_compression to ZIP_DEFLATED 6585 1727196498.89395: Set connection var 
ansible_connection to ssh 6585 1727196498.89400: Set connection var ansible_pipelining to False 6585 1727196498.89405: Set connection var ansible_shell_executable to /bin/sh 6585 1727196498.89420: variable 'ansible_shell_executable' from source: unknown 6585 1727196498.89424: variable 'ansible_connection' from source: unknown 6585 1727196498.89429: variable 'ansible_module_compression' from source: unknown 6585 1727196498.89431: variable 'ansible_shell_type' from source: unknown 6585 1727196498.89433: variable 'ansible_shell_executable' from source: unknown 6585 1727196498.89437: variable 'ansible_host' from source: host vars for 'managed-node2' 6585 1727196498.89441: variable 'ansible_pipelining' from source: unknown 6585 1727196498.89443: variable 'ansible_timeout' from source: unknown 6585 1727196498.89447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 6585 1727196498.89546: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 6585 1727196498.89553: variable 'omit' from source: magic vars 6585 1727196498.89558: starting attempt loop 6585 1727196498.89560: running the handler 6585 1727196498.89789: variable 'ansible_failed_result' from source: set_fact 6585 1727196498.89802: Evaluated conditional ("Variable ad_integration_realm" in ansible_failed_result.msg): True 6585 1727196498.89805: handler run complete 6585 1727196498.89817: attempt loop complete, returning result 6585 1727196498.89820: _execute() done 6585 1727196498.89824: dumping result to json 6585 1727196498.89827: done dumping result, returning 6585 1727196498.89834: done running TaskExecutor() for managed-node2/TASK: Assert that user is notified about missing variables [0e890950-22ad-228f-c704-000000000007] 6585 
1727196498.89838: sending task result for task 0e890950-22ad-228f-c704-000000000007 6585 1727196498.89904: done sending task result for task 0e890950-22ad-228f-c704-000000000007 6585 1727196498.89907: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

6585 1727196498.89957: no more pending results, returning what we have 6585 1727196498.89959: results queue empty 6585 1727196498.89960: checking for any_errors_fatal 6585 1727196498.89964: done checking for any_errors_fatal 6585 1727196498.89965: checking for max_fail_percentage 6585 1727196498.89966: done checking for max_fail_percentage 6585 1727196498.89967: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.89967: done checking to see if all hosts have failed 6585 1727196498.89968: getting the remaining hosts for this loop 6585 1727196498.89969: done getting the remaining hosts for this loop 6585 1727196498.89972: getting the next task for host managed-node2 6585 1727196498.89978: done getting next task for host managed-node2 6585 1727196498.89980: ^ task is: TASK: meta (flush_handlers) 6585 1727196498.89981: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 6585 1727196498.89984: getting variables 6585 1727196498.89985: in VariableManager get_vars() 6585 1727196498.90007: Calling all_inventory to load vars for managed-node2 6585 1727196498.90009: Calling groups_inventory to load vars for managed-node2 6585 1727196498.90011: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.90018: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.90020: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.90025: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.90049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.90062: done with get_vars() 6585 1727196498.90067: done getting variables 6585 1727196498.90107: in VariableManager get_vars() 6585 1727196498.90134: Calling all_inventory to load vars for managed-node2 6585 1727196498.90136: Calling groups_inventory to load vars for managed-node2 6585 1727196498.90138: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.90141: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.90142: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.90144: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.90160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.90169: done with get_vars() 6585 1727196498.90176: done queuing things up, now waiting for results queue to drain 6585 1727196498.90177: results queue empty 6585 1727196498.90177: checking for any_errors_fatal 6585 1727196498.90178: done checking for any_errors_fatal 6585 1727196498.90179: checking for max_fail_percentage 6585 1727196498.90179: done checking for max_fail_percentage 6585 1727196498.90180: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.90180: 
done checking to see if all hosts have failed 6585 1727196498.90180: getting the remaining hosts for this loop 6585 1727196498.90181: done getting the remaining hosts for this loop 6585 1727196498.90182: getting the next task for host managed-node2 6585 1727196498.90185: done getting next task for host managed-node2 6585 1727196498.90189: ^ task is: TASK: meta (flush_handlers) 6585 1727196498.90190: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 6585 1727196498.90191: getting variables 6585 1727196498.90192: in VariableManager get_vars() 6585 1727196498.90197: Calling all_inventory to load vars for managed-node2 6585 1727196498.90198: Calling groups_inventory to load vars for managed-node2 6585 1727196498.90199: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.90202: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.90203: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.90205: Calling groups_plugins_play to load vars for managed-node2 6585 1727196498.90221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.90232: done with get_vars() 6585 1727196498.90237: done getting variables 6585 1727196498.90265: in VariableManager get_vars() 6585 1727196498.90270: Calling all_inventory to load vars for managed-node2 6585 1727196498.90272: Calling groups_inventory to load vars for managed-node2 6585 1727196498.90273: Calling all_plugins_inventory to load vars for managed-node2 6585 1727196498.90275: Calling all_plugins_play to load vars for managed-node2 6585 1727196498.90277: Calling groups_plugins_inventory to load vars for managed-node2 6585 1727196498.90278: Calling 
groups_plugins_play to load vars for managed-node2 6585 1727196498.90293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 6585 1727196498.90301: done with get_vars() 6585 1727196498.90307: done queuing things up, now waiting for results queue to drain 6585 1727196498.90308: results queue empty 6585 1727196498.90308: checking for any_errors_fatal 6585 1727196498.90309: done checking for any_errors_fatal 6585 1727196498.90309: checking for max_fail_percentage 6585 1727196498.90310: done checking for max_fail_percentage 6585 1727196498.90310: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.90311: done checking to see if all hosts have failed 6585 1727196498.90311: getting the remaining hosts for this loop 6585 1727196498.90312: done getting the remaining hosts for this loop 6585 1727196498.90313: getting the next task for host managed-node2 6585 1727196498.90315: done getting next task for host managed-node2 6585 1727196498.90315: ^ task is: None 6585 1727196498.90316: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 6585 1727196498.90316: done queuing things up, now waiting for results queue to drain 6585 1727196498.90317: results queue empty 6585 1727196498.90317: checking for any_errors_fatal 6585 1727196498.90318: done checking for any_errors_fatal 6585 1727196498.90318: checking for max_fail_percentage 6585 1727196498.90319: done checking for max_fail_percentage 6585 1727196498.90319: checking to see if all hosts have failed and the running result is not ok 6585 1727196498.90320: done checking to see if all hosts have failed 6585 1727196498.90320: getting the remaining hosts for this loop 6585 1727196498.90321: done getting the remaining hosts for this loop 6585 1727196498.90322: getting the next task for host managed-node2 6585 1727196498.90325: done getting next task for host managed-node2 6585 1727196498.90325: ^ task is: None 6585 1727196498.90326: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=2    changed=0    unreachable=0    failed=0    skipped=0    rescued=1    ignored=0

Tuesday 24 September 2024  12:48:18 -0400 (0:00:00.017)       0:00:00.100 *****
===============================================================================
Include the role -------------------------------------------------------- 0.06s
/tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:11
Assert that user is notified about missing variables -------------------- 0.02s
/tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/tests/ad_integration/tests_default.yml:16
fedora.linux_system_roles.ad_integration : Ensure that mandatory variable ad_integration_realm is available --- 0.02s
/tmp/collections-gLm/ansible_collections/fedora/linux_system_roles/roles/ad_integration/tasks/main.yml:3
6585 1727196498.90371: RUNNING CLEANUP
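The sequence in this log (fatal on the role's mandatory-variable check, then a passing assert and `rescued=1` in the recap) follows the usual block/rescue negative-test pattern: run the role without `ad_integration_realm`, let its `fail` task abort into a `rescue` section, and assert on the message captured in `ansible_failed_result`. A minimal sketch of that pattern follows; the task names match the log, but the exact contents of `tests_default.yml` are an assumption, not a copy of the real file.

```yaml
# Hypothetical sketch of the test pattern exercised above; the actual
# tests_default.yml in fedora.linux_system_roles may differ in detail.
- name: Verify role behavior when mandatory variables are missing
  hosts: all
  gather_facts: false
  tasks:
    - name: Include the role
      block:
        - name: Include the role
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.ad_integration
      rescue:
        # The role's own fail task produced the message seen in the log:
        #   "Variable ad_integration_realm must be provided!"
        # ansible_failed_result is set automatically inside rescue; re-saving
        # it as a fact keeps it available to later tasks, matching the
        # "from source: set_fact" entry in the log.
        - name: Record the failure for the assertion below
          ansible.builtin.set_fact:
            ansible_failed_result: "{{ ansible_failed_result }}"

    - name: Assert that user is notified about missing variables
      ansible.builtin.assert:
        that:
          - '"Variable ad_integration_realm" in ansible_failed_result.msg'
```

Because the failure is caught by `rescue`, the play ends with `ok=2 failed=0 rescued=1`, which is exactly what the PLAY RECAP above reports despite the `fatal:` entry.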