[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming
standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature
will be removed from ansible-core in version 2.19. Deprecation warnings can be
disabled by setting deprecation_warnings=False in ansible.cfg.
ansible-playbook [core 2.17.2]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/tmp.KuQPUKsjNP
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.4 (main, Jun 7 2024, 00:00:00) [GCC 14.1.1 20240607 (Red Hat 14.1.1-5)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_quadlet_demo.yml ***********************************************
1 plays in /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml

PLAY [Deploy the quadlet demo app] *********************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:3
Saturday 27 July 2024 12:38:08 -0400 (0:00:00.020) 0:00:00.020 *********
[WARNING]: Platform linux on host managed_node1 is using the discovered Python
interpreter at /usr/bin/python3.12, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed_node1]

TASK [Generate certificates] ***************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:33
Saturday 27 July 2024 12:38:10 -0400 (0:00:02.592) 0:00:02.612 *********
included: fedora.linux_system_roles.certificate for managed_node1

TASK [fedora.linux_system_roles.certificate : Set version specific variables] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:2
Saturday 27 July 2024 12:38:10 -0400 (0:00:00.043) 0:00:02.656 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml for managed_node1

TASK [fedora.linux_system_roles.certificate : Ensure ansible_facts used by role] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:2
Saturday 27 July 2024 12:38:10 -0400 (0:00:00.023) 0:00:02.679 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__certificate_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.certificate : Check if system is ostree] *******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:10
Saturday 27 July 2024 12:38:10 -0400 (0:00:00.024) 0:00:02.704 *********
ok: [managed_node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.certificate : Set flag to indicate system is ostree] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:15
Saturday 27 July 2024 12:38:11 -0400 (0:00:00.519) 0:00:03.223 *********
ok: [managed_node1] => { "ansible_facts": { "__certificate_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.certificate : Set platform/version specific variables] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:19
Saturday 27 July 2024 12:38:11 -0400 (0:00:00.025) 0:00:03.249 *********
skipping: [managed_node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node1] => (item=Fedora.yml) => { "ansible_facts": { "__certificate_certmonger_packages": [ "certmonger", "python3-packaging" ] }, "ansible_included_var_files": [ "/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" }
skipping: [managed_node1] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node1] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.certificate : Ensure certificate role dependencies are installed] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5
Saturday 27 July 2024 12:38:11 -0400 (0:00:00.042) 0:00:03.291 *********
changed: [managed_node1] => { "changed": true, "rc": 0, "results": [ "Installed: python3-ply-3.11-23.fc40.noarch", "Installed: python3-cryptography-41.0.7-1.fc40.x86_64", "Installed: python3-pyasn1-0.5.1-3.fc40.noarch", "Installed: python3-pycparser-2.20-14.fc40.noarch", "Installed: python3-cffi-1.16.0-4.fc40.x86_64" ] }
lsrpackages: python3-cryptography python3-dbus python3-pyasn1

TASK [fedora.linux_system_roles.certificate : Ensure provider packages are installed] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:23
Saturday 27 July 2024 12:38:14 -0400 (0:00:03.404) 0:00:06.696 *********
changed: [managed_node1] => (item=certmonger) => { "__certificate_provider": "certmonger", "ansible_loop_var": "__certificate_provider", "changed": true, "rc": 0, "results": [ "Installed: dbus-tools-1:1.14.10-3.fc40.x86_64", "Installed: certmonger-0.79.19-5.fc40.x86_64", "Installed: python3-packaging-23.2-4.fc40.noarch" ] }
lsrpackages: certmonger python3-packaging

TASK [fedora.linux_system_roles.certificate : Ensure pre-scripts hooks directory exists] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:35
Saturday 27 July 2024 12:38:19 -0400 (0:00:04.718) 0:00:11.415 *********
changed: [managed_node1] => (item=certmonger) => { "__certificate_provider": "certmonger", "ansible_loop_var": "__certificate_provider", "changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/etc/certmonger//pre-scripts", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 4096, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.certificate : Ensure post-scripts hooks directory exists] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:61
Saturday 27 July 2024 12:38:20 -0400 (0:00:00.615) 0:00:12.030 *********
changed: [managed_node1] => (item=certmonger) => { "__certificate_provider": "certmonger", "ansible_loop_var": "__certificate_provider", "changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/etc/certmonger//post-scripts", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 4096, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.certificate : Ensure provider service is running] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:90
Saturday 27 July 2024 12:38:20 -0400 (0:00:00.458) 0:00:12.489 *********
changed: [managed_node1] => (item=certmonger) => { "__certificate_provider": "certmonger", "ansible_loop_var": "__certificate_provider", "changed": true, "enabled": true, "name": "certmonger", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:certmonger_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "network.target dbus-broker.service sysinit.target dbus.socket systemd-journald.socket syslog.target system.slice basic.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedorahosted.certmonger", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]",
"CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "Certificate monitoring and PKI enrollment", "DevicePolicy": "auto", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/certmonger (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/certmonger ; argv[]=/usr/sbin/certmonger -S -p /run/certmonger.pid -n $OPTS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/certmonger ; argv[]=/usr/sbin/certmonger -S -p /run/certmonger.pid -n $OPTS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/certmonger.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "certmonger.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14725", "LimitNPROCSoft": "14725", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14725", "LimitSIGPENDINGSoft": "14725", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3340156928", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "certmonger.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "PIDFile": "/run/certmonger.pid", "PartOf": "dbus-broker.service", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.certificate : Ensure certificate requests] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:101 Saturday 27 July 2024 12:38:22 -0400 (0:00:01.375) 0:00:13.865 ********* changed: [managed_node1] => (item={'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}) => { "ansible_loop_var": "item", "changed": true, "item": { "ca": "self-sign", "dns": [ "localhost" ], "name": "quadlet_demo" } } MSG: Certificate requested (new). 
TASK [fedora.linux_system_roles.certificate : Slurp the contents of the files] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:152 Saturday 27 July 2024 12:38:23 -0400 (0:00:00.995) 0:00:14.861 ********* ok: [managed_node1] => (item=['cert', {'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}]) => { "ansible_loop_var": "item", "changed": false, "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURnakNDQW1xZ0F3SUJBZ0lRTHNnRVd6TVVSZEdiODFKcWE5akFZakFOQmdrcWhraUc5dzBCQVFzRkFEQlEKTVNBd0hnWURWUVFEREJkTWIyTmhiQ0JUYVdkdWFXNW5JRUYxZEdodmNtbDBlVEVzTUNvR0ExVUVBd3dqTW1WagpPREEwTldJdE16TXhORFExWkRFdE9XSm1NelV5Tm1FdE5tSmtPR013TmpFd0hoY05NalF3TnpJM01UWXpPREl5CldoY05NalV3TnpJM01UWXpPREl5V2pBVU1SSXdFQVlEVlFRREV3bHNiMk5oYkdodmMzUXdnZ0VpTUEwR0NTcUcKU0liM0RRRUJBUVVBQTRJQkR3QXdnZ0VLQW9JQkFRQzVndCtheVp0d3ZIWDA4TUZrWkdpMUlndThSSmRHb1RlYgpuWHBYclc2Sk5INERSaTVkeFlOV1lreS9TZmZrS0FUdE9lcjV6aDdMN2x1YkhiVFZYQng5ekVMNW1OTFRnbVVlCmh0K3h2VzdlZklrcGJvOTlodTg5cUxqM2pLQ1lISXl3SjVVZDV3cW9EMXlsYXZZTWg0dFpoMGJ5eUQ5azB0aVkKVFdhUjRDWnZpVkZFTldadEdRaDA3cStJaUJWaytQU1k0TjVOVXpwRGNhRG1WM0l1VzJiclZVK01MaDJ3bDRMTQpXeUZueThRQzh6SjF4WHIzN2p1eldjY1FHR1lYWUpOaVd0THBvcVg2UmIvTHBYdVlRSE1WS0M3RVowL1VMZXp4CkQ2VmllamVWM2habW9YNkxZM1RmOHlNeDErVG1pcWpYUXFnUWxiZ3NsTlc3MVVxWHdtK3BBZ01CQUFHamdaTXcKZ1pBd0N3WURWUjBQQkFRREFnV2dNQlFHQTFVZEVRUU5NQXVDQ1d4dlkyRnNhRzl6ZERBZEJnTlZIU1VFRmpBVQpCZ2dyQmdFRkJRY0RBUVlJS3dZQkJRVUhBd0l3REFZRFZSMFRBUUgvQkFJd0FEQWRCZ05WSFE0RUZnUVVlRkJUCm9xUXpmNDA0Qk1pVHBORUIxOHN4VmFZd0h3WURWUjBqQkJnd0ZvQVVYbG1tc09GdE0vcndVYXBHcTY1OGZKL0EKSURJd0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFCUjVJS1dTdlNZbVVZTEJxMjZDRnFhYlBTclI1ZUdHQmZCQQppRVMzQ1I5bU5DV1RQN01Jbnh5eG1tVGFoNHNYdjVOWTZLZnQrMXR4N0VQT1B3ckk0MUx2elNYdGN0Q2lUQkRDCnR3UG0vOHdmRkEvM2U4a1pUbndWcjlIMVo5MFVvVy8vd09VWGRqN0VMLzNmOVQ4N1ZxKy9zM0ZabXN4RlVCNVcKeWNET3lMRktXQWRaUzZ3czV1VEVWVlMxcXdHVGV1Nkd0S0kvaVAwbnRtVk15R3BUSytmci9DSkxIcU55RWtvZQpieDdsRFIzcStLaVdWRTFIZzgzZHVSVWFaYWVETG9GcnZncTZUMWNnaWNtblRWMVNnWjM2R1NzcU9XNFA0TlByCkZUNnRkZHlraXB5QWZZSFNEaWpWOTlvOUFUVERvZlNXWWljTFZBbmZ0QXA3KzlFMWE0VT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", "encoding": "base64", "item": [ "cert", { "ca": "self-sign", "dns": [ "localhost" ], "name": "quadlet_demo" } ], "source": "/etc/pki/tls/certs/quadlet_demo.crt" } ok: [managed_node1] => (item=['key', {'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}]) => { "ansible_loop_var": "item", "changed": false, "content": 
"LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2UUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktjd2dnU2pBZ0VBQW9JQkFRQzVndCtheVp0d3ZIWDAKOE1Ga1pHaTFJZ3U4UkpkR29UZWJuWHBYclc2Sk5INERSaTVkeFlOV1lreS9TZmZrS0FUdE9lcjV6aDdMN2x1YgpIYlRWWEJ4OXpFTDVtTkxUZ21VZWh0K3h2VzdlZklrcGJvOTlodTg5cUxqM2pLQ1lISXl3SjVVZDV3cW9EMXlsCmF2WU1oNHRaaDBieXlEOWswdGlZVFdhUjRDWnZpVkZFTldadEdRaDA3cStJaUJWaytQU1k0TjVOVXpwRGNhRG0KVjNJdVcyYnJWVStNTGgyd2w0TE1XeUZueThRQzh6SjF4WHIzN2p1eldjY1FHR1lYWUpOaVd0THBvcVg2UmIvTApwWHVZUUhNVktDN0VaMC9VTGV6eEQ2VmllamVWM2habW9YNkxZM1RmOHlNeDErVG1pcWpYUXFnUWxiZ3NsTlc3CjFVcVh3bStwQWdNQkFBRUNnZ0VBUnZ3MnZ0NlZYYktudWtYajRwdnZXeHcvZkZlTXdVaVFaRG9DcGdrbHFsZk4KUGtoOUZvR3RLNEZxMTZtZ3N3dkRNdGsrT2o5dWxsOUxhMVFYTGF0VTlhZ1RHcm9wTmlrSHM0SXRsN0FpZS8wUQphU3UydVhSSUdJSjI0TGJOZnRjeDJIZ1UvYis2V3F1aUttUCs2YlYyRTVpSmcvNHZEOUFFZEdidHNEVWl6L3VICk56aC9FTU93WklTRE9nOGRhRGhCYUhUMVc3MDdUVkZrZHVjeVZ6TENyNm5GUWU5VlowMm9xeVhHeXE4K2xEWHIKNnRKdnFCYlc1eElCbDF5cnl3OE5TVHpsTzRPVDI3LzJsdGszODUzdkd5RG5lMW5wdFFSajN6UnpvaTFLQlJ0MgpmLzJNNlZtNkdiUGlCUmlGcHRNcm9BcVJ2Y1BWREJleENtSUFrcTFDVHdLQmdRRGF2ZytMYndZUFJBZFBaWldTCnEwMlh6RG5DaVZ2UE4yTW15akZRRTBCalpTRnlXVW14TmYzQjVlQTJjQ0FaTzRBY256Y1pyQWkwT3I5WUJ2NGMKNkt3cnl2VHBqZGR6c2NoV0hWTVVFL0dtWUowYnhTMUw4anBYY3FNM2VEMHBtdzhZcnZvdXZrUWpmTlpYWnFzUwpJUm1Mai9idFRiYUdFcWdQNnJFY3ZZQklEd0tCZ1FEWkc5REVFU1FHWE9DVVVpV3lGLzR2eWVNK0hxUXAyeVYyCm5FN2ZUYkVtTXB0WTBlTHZXbzBEaHZEVEwvLzVwNGNzamVlaHhhMWM1emVBdjVieTNsTmhoUUxRYzN2eDlrdEcKc0VuNWErRmhEV3JHR3VSd0hNQXBGeUhTd0hZckdGSXlyY2JHWGdERHl2dVBmTzdIanhiYmswdUhFQTBtaVRWMQp2em1vYnNMVXh3S0JnR2tqS1QyUG80MzYyTGlrenZ1c01xTmZtZk9US2ZteldZanZianhheEh3Qnc4MitkTmtrClprK29PZGh3bEQwTWlFczVpN1pmSmQvYXpOVjJwdlVtTUhyc0ZrT2IxWTdhU0x4N1k4OG10dU9OVkhaZ2s2RUoKVUZsdjFGdDVBNHpYNXEycWpMMmkzZ1ZnbjNWcTk1YkRLaEFXcGt6eEtXWFAyYytzc214cCtScXBBb0dCQU1JUgpnUjJpRE1lN1FleVRPYXJtazRwNE5xOFNpTC81YXBXSngxOElmYStkVVF1bUllcSswSW8wbHhUUU5Vb2VuRkFSClVOcGtiMEU2Vlh5NnhkMjNLbVZqbGs1cXpJSlBISjdGZW5xQUdtaThxNU1GK3VqVWFsalFtcVZlOE1JNDdWRHIKMFdFRUtqN2FOTi9nVnpqL0NQbWh3c09xYjNiZzgyNmJUcGptMlRXRkFvR0FIbE4wSUxqdTRJcmZEUEtxb3IwdAo1SEU1alRXRE5LTWF3V3l2eG5Dc3VMMm00WmNOZ01DV0ZiWG9xTGlQYlBCc0xtOTBsbWU1M3BTTFVyRDJkVU9HCkVwVlNyVGZTcnBkQXF0emVJQUNHemZneWRzSm10MUZmT1FuUFVSYU1hKzdRd3JKTzU5bEtjVmtDQXE5dWE3ZW4KZTdXK0J2Y04rNTUyamU2Nk1KanMxeW89Ci0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K", "encoding": "base64", "item": [ "key", { "ca": "self-sign", "dns": [ "localhost" ], "name": "quadlet_demo" } ], "source": "/etc/pki/tls/private/quadlet_demo.key" } ok: [managed_node1] => (item=['ca', {'name': 'quadlet_demo', 'dns': ['localhost'], 'ca': 'self-sign'}]) => { "ansible_loop_var": "item", "changed": false, "content": 
"LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURnakNDQW1xZ0F3SUJBZ0lRTHNnRVd6TVVSZEdiODFKcWE5akFZakFOQmdrcWhraUc5dzBCQVFzRkFEQlEKTVNBd0hnWURWUVFEREJkTWIyTmhiQ0JUYVdkdWFXNW5JRUYxZEdodmNtbDBlVEVzTUNvR0ExVUVBd3dqTW1WagpPREEwTldJdE16TXhORFExWkRFdE9XSm1NelV5Tm1FdE5tSmtPR013TmpFd0hoY05NalF3TnpJM01UWXpPREl5CldoY05NalV3TnpJM01UWXpPREl5V2pBVU1SSXdFQVlEVlFRREV3bHNiMk5oYkdodmMzUXdnZ0VpTUEwR0NTcUcKU0liM0RRRUJBUVVBQTRJQkR3QXdnZ0VLQW9JQkFRQzVndCtheVp0d3ZIWDA4TUZrWkdpMUlndThSSmRHb1RlYgpuWHBYclc2Sk5INERSaTVkeFlOV1lreS9TZmZrS0FUdE9lcjV6aDdMN2x1YkhiVFZYQng5ekVMNW1OTFRnbVVlCmh0K3h2VzdlZklrcGJvOTlodTg5cUxqM2pLQ1lISXl3SjVVZDV3cW9EMXlsYXZZTWg0dFpoMGJ5eUQ5azB0aVkKVFdhUjRDWnZpVkZFTldadEdRaDA3cStJaUJWaytQU1k0TjVOVXpwRGNhRG1WM0l1VzJiclZVK01MaDJ3bDRMTQpXeUZueThRQzh6SjF4WHIzN2p1eldjY1FHR1lYWUpOaVd0THBvcVg2UmIvTHBYdVlRSE1WS0M3RVowL1VMZXp4CkQ2VmllamVWM2habW9YNkxZM1RmOHlNeDErVG1pcWpYUXFnUWxiZ3NsTlc3MVVxWHdtK3BBZ01CQUFHamdaTXcKZ1pBd0N3WURWUjBQQkFRREFnV2dNQlFHQTFVZEVRUU5NQXVDQ1d4dlkyRnNhRzl6ZERBZEJnTlZIU1VFRmpBVQpCZ2dyQmdFRkJRY0RBUVlJS3dZQkJRVUhBd0l3REFZRFZSMFRBUUgvQkFJd0FEQWRCZ05WSFE0RUZnUVVlRkJUCm9xUXpmNDA0Qk1pVHBORUIxOHN4VmFZd0h3WURWUjBqQkJnd0ZvQVVYbG1tc09GdE0vcndVYXBHcTY1OGZKL0EKSURJd0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFCUjVJS1dTdlNZbVVZTEJxMjZDRnFhYlBTclI1ZUdHQmZCQQppRVMzQ1I5bU5DV1RQN01Jbnh5eG1tVGFoNHNYdjVOWTZLZnQrMXR4N0VQT1B3ckk0MUx2elNYdGN0Q2lUQkRDCnR3UG0vOHdmRkEvM2U4a1pUbndWcjlIMVo5MFVvVy8vd09VWGRqN0VMLzNmOVQ4N1ZxKy9zM0ZabXN4RlVCNVcKeWNET3lMRktXQWRaUzZ3czV1VEVWVlMxcXdHVGV1Nkd0S0kvaVAwbnRtVk15R3BUSytmci9DSkxIcU55RWtvZQpieDdsRFIzcStLaVdWRTFIZzgzZHVSVWFaYWVETG9GcnZncTZUMWNnaWNtblRWMVNnWjM2R1NzcU9XNFA0TlByCkZUNnRkZHlraXB5QWZZSFNEaWpWOTlvOUFUVERvZlNXWWljTFZBbmZ0QXA3KzlFMWE0VT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", "encoding": "base64", "item": [ "ca", { "ca": "self-sign", "dns": [ "localhost" ], "name": "quadlet_demo" } ], "source": "/etc/pki/tls/certs/quadlet_demo.crt" } TASK [fedora.linux_system_roles.certificate : Create return data] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:160 Saturday 27 July 2024 12:38:24 -0400 (0:00:01.341) 0:00:16.202 ********* ok: [managed_node1] => { "ansible_facts": { "certificate_test_certs": { "quadlet_demo": { "ca": "/etc/pki/tls/certs/quadlet_demo.crt", "ca_content": "-----BEGIN 
CERTIFICATE-----\nMIIDgjCCAmqgAwIBAgIQLsgEWzMURdGb81Jqa9jAYjANBgkqhkiG9w0BAQsFADBQ\nMSAwHgYDVQQDDBdMb2NhbCBTaWduaW5nIEF1dGhvcml0eTEsMCoGA1UEAwwjMmVj\nODA0NWItMzMxNDQ1ZDEtOWJmMzUyNmEtNmJkOGMwNjEwHhcNMjQwNzI3MTYzODIy\nWhcNMjUwNzI3MTYzODIyWjAUMRIwEAYDVQQDEwlsb2NhbGhvc3QwggEiMA0GCSqG\nSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5gt+ayZtwvHX08MFkZGi1Igu8RJdGoTeb\nnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lubHbTVXBx9zEL5mNLTgmUe\nht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1ylavYMh4tZh0byyD9k0tiY\nTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDmV3IuW2brVU+MLh2wl4LM\nWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/LpXuYQHMVKC7EZ0/ULezx\nD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW71UqXwm+pAgMBAAGjgZMw\ngZAwCwYDVR0PBAQDAgWgMBQGA1UdEQQNMAuCCWxvY2FsaG9zdDAdBgNVHSUEFjAU\nBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUeFBT\noqQzf404BMiTpNEB18sxVaYwHwYDVR0jBBgwFoAUXlmmsOFtM/rwUapGq658fJ/A\nIDIwDQYJKoZIhvcNAQELBQADggEBABR5IKWSvSYmUYLBq26CFqabPSrR5eGGBfBA\niES3CR9mNCWTP7MInxyxmmTah4sXv5NY6Kft+1tx7EPOPwrI41LvzSXtctCiTBDC\ntwPm/8wfFA/3e8kZTnwVr9H1Z90UoW//wOUXdj7EL/3f9T87Vq+/s3FZmsxFUB5W\nycDOyLFKWAdZS6ws5uTEVVS1qwGTeu6GtKI/iP0ntmVMyGpTK+fr/CJLHqNyEkoe\nbx7lDR3q+KiWVE1Hg83duRUaZaeDLoFrvgq6T1cgicmnTV1SgZ36GSsqOW4P4NPr\nFT6tddykipyAfYHSDijV99o9ATTDofSWYicLVAnftAp7+9E1a4U=\n-----END CERTIFICATE-----\n", "cert": "/etc/pki/tls/certs/quadlet_demo.crt", "cert_content": "-----BEGIN CERTIFICATE-----\nMIIDgjCCAmqgAwIBAgIQLsgEWzMURdGb81Jqa9jAYjANBgkqhkiG9w0BAQsFADBQ\nMSAwHgYDVQQDDBdMb2NhbCBTaWduaW5nIEF1dGhvcml0eTEsMCoGA1UEAwwjMmVj\nODA0NWItMzMxNDQ1ZDEtOWJmMzUyNmEtNmJkOGMwNjEwHhcNMjQwNzI3MTYzODIy\nWhcNMjUwNzI3MTYzODIyWjAUMRIwEAYDVQQDEwlsb2NhbGhvc3QwggEiMA0GCSqG\nSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5gt+ayZtwvHX08MFkZGi1Igu8RJdGoTeb\nnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lubHbTVXBx9zEL5mNLTgmUe\nht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1ylavYMh4tZh0byyD9k0tiY\nTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDmV3IuW2brVU+MLh2wl4LM\nWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/LpXuYQHMVKC7EZ0/ULezx\nD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW71UqXwm+pAgMBAAGjgZMw\ngZAwCwYDVR0PBAQDAgWgMBQGA1UdEQQNMAuCCWxvY2FsaG9zdDAdBgNVHSUEFjAU\nBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUeFBT\noqQzf404BMiTpNEB18sxVaYwHwYDVR0jBBgwFoAUXlmmsOFtM/rwUapGq658fJ/A\nIDIwDQYJKoZIhvcNAQELBQADggEBABR5IKWSvSYmUYLBq26CFqabPSrR5eGGBfBA\niES3CR9mNCWTP7MInxyxmmTah4sXv5NY6Kft+1tx7EPOPwrI41LvzSXtctCiTBDC\ntwPm/8wfFA/3e8kZTnwVr9H1Z90UoW//wOUXdj7EL/3f9T87Vq+/s3FZmsxFUB5W\nycDOyLFKWAdZS6ws5uTEVVS1qwGTeu6GtKI/iP0ntmVMyGpTK+fr/CJLHqNyEkoe\nbx7lDR3q+KiWVE1Hg83duRUaZaeDLoFrvgq6T1cgicmnTV1SgZ36GSsqOW4P4NPr\nFT6tddykipyAfYHSDijV99o9ATTDofSWYicLVAnftAp7+9E1a4U=\n-----END CERTIFICATE-----\n", "key": "/etc/pki/tls/private/quadlet_demo.key", "key_content": "-----BEGIN PRIVATE 
KEY-----\nMIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQC5gt+ayZtwvHX0\n8MFkZGi1Igu8RJdGoTebnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lub\nHbTVXBx9zEL5mNLTgmUeht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1yl\navYMh4tZh0byyD9k0tiYTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDm\nV3IuW2brVU+MLh2wl4LMWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/L\npXuYQHMVKC7EZ0/ULezxD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW7\n1UqXwm+pAgMBAAECggEARvw2vt6VXbKnukXj4pvvWxw/fFeMwUiQZDoCpgklqlfN\nPkh9FoGtK4Fq16mgswvDMtk+Oj9ull9La1QXLatU9agTGropNikHs4Itl7Aie/0Q\naSu2uXRIGIJ24LbNftcx2HgU/b+6WquiKmP+6bV2E5iJg/4vD9AEdGbtsDUiz/uH\nNzh/EMOwZISDOg8daDhBaHT1W707TVFkducyVzLCr6nFQe9VZ02oqyXGyq8+lDXr\n6tJvqBbW5xIBl1yryw8NSTzlO4OT27/2ltk3853vGyDne1nptQRj3zRzoi1KBRt2\nf/2M6Vm6GbPiBRiFptMroAqRvcPVDBexCmIAkq1CTwKBgQDavg+LbwYPRAdPZZWS\nq02XzDnCiVvPN2MmyjFQE0BjZSFyWUmxNf3B5eA2cCAZO4AcnzcZrAi0Or9YBv4c\n6KwryvTpjddzschWHVMUE/GmYJ0bxS1L8jpXcqM3eD0pmw8YrvouvkQjfNZXZqsS\nIRmLj/btTbaGEqgP6rEcvYBIDwKBgQDZG9DEESQGXOCUUiWyF/4vyeM+HqQp2yV2\nnE7fTbEmMptY0eLvWo0DhvDTL//5p4csjeehxa1c5zeAv5by3lNhhQLQc3vx9ktG\nsEn5a+FhDWrGGuRwHMApFyHSwHYrGFIyrcbGXgDDyvuPfO7Hjxbbk0uHEA0miTV1\nvzmobsLUxwKBgGkjKT2Po4362LikzvusMqNfmfOTKfmzWYjvbjxaxHwBw82+dNkk\nZk+oOdhwlD0MiEs5i7ZfJd/azNV2pvUmMHrsFkOb1Y7aSLx7Y88mtuONVHZgk6EJ\nUFlv1Ft5A4zX5q2qjL2i3gVgn3Vq95bDKhAWpkzxKWXP2c+ssmxp+RqpAoGBAMIR\ngR2iDMe7QeyTOarmk4p4Nq8SiL/5apWJx18Ifa+dUQumIeq+0Io0lxTQNUoenFAR\nUNpkb0E6VXy6xd23KmVjlk5qzIJPHJ7FenqAGmi8q5MF+ujUaljQmqVe8MI47VDr\n0WEEKj7aNN/gVzj/CPmhwsOqb3bg826bTpjm2TWFAoGAHlN0ILju4IrfDPKqor0t\n5HE5jTWDNKMawWyvxnCsuL2m4ZcNgMCWFbXoqLiPbPBsLm90lme53pSLUrD2dUOG\nEpVSrTfSrpdAqtzeIACGzfgydsJmt1FfOQnPURaMa+7QwrJO59lKcVkCAq9ua7en\ne7W+BvcN+552je66MJjs1yo=\n-----END PRIVATE KEY-----\n" } } }, "changed": false } TASK [fedora.linux_system_roles.certificate : Stop tracking certificates] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:176 Saturday 27 July 2024 12:38:24 -0400 (0:00:00.033) 0:00:16.235 ********* ok: [managed_node1] => (item={'cert': '/etc/pki/tls/certs/quadlet_demo.crt', 'cert_content': '-----BEGIN CERTIFICATE-----\nMIIDgjCCAmqgAwIBAgIQLsgEWzMURdGb81Jqa9jAYjANBgkqhkiG9w0BAQsFADBQ\nMSAwHgYDVQQDDBdMb2NhbCBTaWduaW5nIEF1dGhvcml0eTEsMCoGA1UEAwwjMmVj\nODA0NWItMzMxNDQ1ZDEtOWJmMzUyNmEtNmJkOGMwNjEwHhcNMjQwNzI3MTYzODIy\nWhcNMjUwNzI3MTYzODIyWjAUMRIwEAYDVQQDEwlsb2NhbGhvc3QwggEiMA0GCSqG\nSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5gt+ayZtwvHX08MFkZGi1Igu8RJdGoTeb\nnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lubHbTVXBx9zEL5mNLTgmUe\nht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1ylavYMh4tZh0byyD9k0tiY\nTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDmV3IuW2brVU+MLh2wl4LM\nWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/LpXuYQHMVKC7EZ0/ULezx\nD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW71UqXwm+pAgMBAAGjgZMw\ngZAwCwYDVR0PBAQDAgWgMBQGA1UdEQQNMAuCCWxvY2FsaG9zdDAdBgNVHSUEFjAU\nBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUeFBT\noqQzf404BMiTpNEB18sxVaYwHwYDVR0jBBgwFoAUXlmmsOFtM/rwUapGq658fJ/A\nIDIwDQYJKoZIhvcNAQELBQADggEBABR5IKWSvSYmUYLBq26CFqabPSrR5eGGBfBA\niES3CR9mNCWTP7MInxyxmmTah4sXv5NY6Kft+1tx7EPOPwrI41LvzSXtctCiTBDC\ntwPm/8wfFA/3e8kZTnwVr9H1Z90UoW//wOUXdj7EL/3f9T87Vq+/s3FZmsxFUB5W\nycDOyLFKWAdZS6ws5uTEVVS1qwGTeu6GtKI/iP0ntmVMyGpTK+fr/CJLHqNyEkoe\nbx7lDR3q+KiWVE1Hg83duRUaZaeDLoFrvgq6T1cgicmnTV1SgZ36GSsqOW4P4NPr\nFT6tddykipyAfYHSDijV99o9ATTDofSWYicLVAnftAp7+9E1a4U=\n-----END CERTIFICATE-----\n', 'key': '/etc/pki/tls/private/quadlet_demo.key', 'key_content': '-----BEGIN PRIVATE 
KEY-----\nMIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQC5gt+ayZtwvHX0\n8MFkZGi1Igu8RJdGoTebnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lub\nHbTVXBx9zEL5mNLTgmUeht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1yl\navYMh4tZh0byyD9k0tiYTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDm\nV3IuW2brVU+MLh2wl4LMWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/L\npXuYQHMVKC7EZ0/ULezxD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW7\n1UqXwm+pAgMBAAECggEARvw2vt6VXbKnukXj4pvvWxw/fFeMwUiQZDoCpgklqlfN\nPkh9FoGtK4Fq16mgswvDMtk+Oj9ull9La1QXLatU9agTGropNikHs4Itl7Aie/0Q\naSu2uXRIGIJ24LbNftcx2HgU/b+6WquiKmP+6bV2E5iJg/4vD9AEdGbtsDUiz/uH\nNzh/EMOwZISDOg8daDhBaHT1W707TVFkducyVzLCr6nFQe9VZ02oqyXGyq8+lDXr\n6tJvqBbW5xIBl1yryw8NSTzlO4OT27/2ltk3853vGyDne1nptQRj3zRzoi1KBRt2\nf/2M6Vm6GbPiBRiFptMroAqRvcPVDBexCmIAkq1CTwKBgQDavg+LbwYPRAdPZZWS\nq02XzDnCiVvPN2MmyjFQE0BjZSFyWUmxNf3B5eA2cCAZO4AcnzcZrAi0Or9YBv4c\n6KwryvTpjddzschWHVMUE/GmYJ0bxS1L8jpXcqM3eD0pmw8YrvouvkQjfNZXZqsS\nIRmLj/btTbaGEqgP6rEcvYBIDwKBgQDZG9DEESQGXOCUUiWyF/4vyeM+HqQp2yV2\nnE7fTbEmMptY0eLvWo0DhvDTL//5p4csjeehxa1c5zeAv5by3lNhhQLQc3vx9ktG\nsEn5a+FhDWrGGuRwHMApFyHSwHYrGFIyrcbGXgDDyvuPfO7Hjxbbk0uHEA0miTV1\nvzmobsLUxwKBgGkjKT2Po4362LikzvusMqNfmfOTKfmzWYjvbjxaxHwBw82+dNkk\nZk+oOdhwlD0MiEs5i7ZfJd/azNV2pvUmMHrsFkOb1Y7aSLx7Y88mtuONVHZgk6EJ\nUFlv1Ft5A4zX5q2qjL2i3gVgn3Vq95bDKhAWpkzxKWXP2c+ssmxp+RqpAoGBAMIR\ngR2iDMe7QeyTOarmk4p4Nq8SiL/5apWJx18Ifa+dUQumIeq+0Io0lxTQNUoenFAR\nUNpkb0E6VXy6xd23KmVjlk5qzIJPHJ7FenqAGmi8q5MF+ujUaljQmqVe8MI47VDr\n0WEEKj7aNN/gVzj/CPmhwsOqb3bg826bTpjm2TWFAoGAHlN0ILju4IrfDPKqor0t\n5HE5jTWDNKMawWyvxnCsuL2m4ZcNgMCWFbXoqLiPbPBsLm90lme53pSLUrD2dUOG\nEpVSrTfSrpdAqtzeIACGzfgydsJmt1FfOQnPURaMa+7QwrJO59lKcVkCAq9ua7en\ne7W+BvcN+552je66MJjs1yo=\n-----END PRIVATE KEY-----\n', 'ca': '/etc/pki/tls/certs/quadlet_demo.crt', 'ca_content': '-----BEGIN CERTIFICATE-----\nMIIDgjCCAmqgAwIBAgIQLsgEWzMURdGb81Jqa9jAYjANBgkqhkiG9w0BAQsFADBQ\nMSAwHgYDVQQDDBdMb2NhbCBTaWduaW5nIEF1dGhvcml0eTEsMCoGA1UEAwwjMmVj\nODA0NWItMzMxNDQ1ZDEtOWJmMzUyNmEtNmJkOGMwNjEwHhcNMjQwNzI3MTYzODIy\nWhcNMjUwNzI3MTYzODIyWjAUMRIwEAYDVQQDEwlsb2NhbGhvc3QwggEiMA0GCSqG\nSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5gt+ayZtwvHX08MFkZGi1Igu8RJdGoTeb\nnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lubHbTVXBx9zEL5mNLTgmUe\nht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1ylavYMh4tZh0byyD9k0tiY\nTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDmV3IuW2brVU+MLh2wl4LM\nWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/LpXuYQHMVKC7EZ0/ULezx\nD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW71UqXwm+pAgMBAAGjgZMw\ngZAwCwYDVR0PBAQDAgWgMBQGA1UdEQQNMAuCCWxvY2FsaG9zdDAdBgNVHSUEFjAU\nBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUeFBT\noqQzf404BMiTpNEB18sxVaYwHwYDVR0jBBgwFoAUXlmmsOFtM/rwUapGq658fJ/A\nIDIwDQYJKoZIhvcNAQELBQADggEBABR5IKWSvSYmUYLBq26CFqabPSrR5eGGBfBA\niES3CR9mNCWTP7MInxyxmmTah4sXv5NY6Kft+1tx7EPOPwrI41LvzSXtctCiTBDC\ntwPm/8wfFA/3e8kZTnwVr9H1Z90UoW//wOUXdj7EL/3f9T87Vq+/s3FZmsxFUB5W\nycDOyLFKWAdZS6ws5uTEVVS1qwGTeu6GtKI/iP0ntmVMyGpTK+fr/CJLHqNyEkoe\nbx7lDR3q+KiWVE1Hg83duRUaZaeDLoFrvgq6T1cgicmnTV1SgZ36GSsqOW4P4NPr\nFT6tddykipyAfYHSDijV99o9ATTDofSWYicLVAnftAp7+9E1a4U=\n-----END CERTIFICATE-----\n'}) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "getcert", "stop-tracking", "-f", "/etc/pki/tls/certs/quadlet_demo.crt" ], "delta": "0:00:00.033804", "end": "2024-07-27 12:38:24.902550", "item": { "ca": "/etc/pki/tls/certs/quadlet_demo.crt", "ca_content": "-----BEGIN 
CERTIFICATE-----\nMIIDgjCCAmqgAwIBAgIQLsgEWzMURdGb81Jqa9jAYjANBgkqhkiG9w0BAQsFADBQ\nMSAwHgYDVQQDDBdMb2NhbCBTaWduaW5nIEF1dGhvcml0eTEsMCoGA1UEAwwjMmVj\nODA0NWItMzMxNDQ1ZDEtOWJmMzUyNmEtNmJkOGMwNjEwHhcNMjQwNzI3MTYzODIy\nWhcNMjUwNzI3MTYzODIyWjAUMRIwEAYDVQQDEwlsb2NhbGhvc3QwggEiMA0GCSqG\nSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5gt+ayZtwvHX08MFkZGi1Igu8RJdGoTeb\nnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lubHbTVXBx9zEL5mNLTgmUe\nht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1ylavYMh4tZh0byyD9k0tiY\nTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDmV3IuW2brVU+MLh2wl4LM\nWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/LpXuYQHMVKC7EZ0/ULezx\nD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW71UqXwm+pAgMBAAGjgZMw\ngZAwCwYDVR0PBAQDAgWgMBQGA1UdEQQNMAuCCWxvY2FsaG9zdDAdBgNVHSUEFjAU\nBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUeFBT\noqQzf404BMiTpNEB18sxVaYwHwYDVR0jBBgwFoAUXlmmsOFtM/rwUapGq658fJ/A\nIDIwDQYJKoZIhvcNAQELBQADggEBABR5IKWSvSYmUYLBq26CFqabPSrR5eGGBfBA\niES3CR9mNCWTP7MInxyxmmTah4sXv5NY6Kft+1tx7EPOPwrI41LvzSXtctCiTBDC\ntwPm/8wfFA/3e8kZTnwVr9H1Z90UoW//wOUXdj7EL/3f9T87Vq+/s3FZmsxFUB5W\nycDOyLFKWAdZS6ws5uTEVVS1qwGTeu6GtKI/iP0ntmVMyGpTK+fr/CJLHqNyEkoe\nbx7lDR3q+KiWVE1Hg83duRUaZaeDLoFrvgq6T1cgicmnTV1SgZ36GSsqOW4P4NPr\nFT6tddykipyAfYHSDijV99o9ATTDofSWYicLVAnftAp7+9E1a4U=\n-----END CERTIFICATE-----\n", "cert": "/etc/pki/tls/certs/quadlet_demo.crt", "cert_content": "-----BEGIN CERTIFICATE-----\nMIIDgjCCAmqgAwIBAgIQLsgEWzMURdGb81Jqa9jAYjANBgkqhkiG9w0BAQsFADBQ\nMSAwHgYDVQQDDBdMb2NhbCBTaWduaW5nIEF1dGhvcml0eTEsMCoGA1UEAwwjMmVj\nODA0NWItMzMxNDQ1ZDEtOWJmMzUyNmEtNmJkOGMwNjEwHhcNMjQwNzI3MTYzODIy\nWhcNMjUwNzI3MTYzODIyWjAUMRIwEAYDVQQDEwlsb2NhbGhvc3QwggEiMA0GCSqG\nSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5gt+ayZtwvHX08MFkZGi1Igu8RJdGoTeb\nnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lubHbTVXBx9zEL5mNLTgmUe\nht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1ylavYMh4tZh0byyD9k0tiY\nTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDmV3IuW2brVU+MLh2wl4LM\nWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/LpXuYQHMVKC7EZ0/ULezx\nD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW71UqXwm+pAgMBAAGjgZMw\ngZAwCwYDVR0PBAQDAgWgMBQGA1UdEQQNMAuCCWxvY2FsaG9zdDAdBgNVHSUEFjAU\nBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUeFBT\noqQzf404BMiTpNEB18sxVaYwHwYDVR0jBBgwFoAUXlmmsOFtM/rwUapGq658fJ/A\nIDIwDQYJKoZIhvcNAQELBQADggEBABR5IKWSvSYmUYLBq26CFqabPSrR5eGGBfBA\niES3CR9mNCWTP7MInxyxmmTah4sXv5NY6Kft+1tx7EPOPwrI41LvzSXtctCiTBDC\ntwPm/8wfFA/3e8kZTnwVr9H1Z90UoW//wOUXdj7EL/3f9T87Vq+/s3FZmsxFUB5W\nycDOyLFKWAdZS6ws5uTEVVS1qwGTeu6GtKI/iP0ntmVMyGpTK+fr/CJLHqNyEkoe\nbx7lDR3q+KiWVE1Hg83duRUaZaeDLoFrvgq6T1cgicmnTV1SgZ36GSsqOW4P4NPr\nFT6tddykipyAfYHSDijV99o9ATTDofSWYicLVAnftAp7+9E1a4U=\n-----END CERTIFICATE-----\n", "key": "/etc/pki/tls/private/quadlet_demo.key", "key_content": "-----BEGIN PRIVATE 
KEY-----\nMIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQC5gt+ayZtwvHX0\n8MFkZGi1Igu8RJdGoTebnXpXrW6JNH4DRi5dxYNWYky/SffkKATtOer5zh7L7lub\nHbTVXBx9zEL5mNLTgmUeht+xvW7efIkpbo99hu89qLj3jKCYHIywJ5Ud5wqoD1yl\navYMh4tZh0byyD9k0tiYTWaR4CZviVFENWZtGQh07q+IiBVk+PSY4N5NUzpDcaDm\nV3IuW2brVU+MLh2wl4LMWyFny8QC8zJ1xXr37juzWccQGGYXYJNiWtLpoqX6Rb/L\npXuYQHMVKC7EZ0/ULezxD6ViejeV3hZmoX6LY3Tf8yMx1+TmiqjXQqgQlbgslNW7\n1UqXwm+pAgMBAAECggEARvw2vt6VXbKnukXj4pvvWxw/fFeMwUiQZDoCpgklqlfN\nPkh9FoGtK4Fq16mgswvDMtk+Oj9ull9La1QXLatU9agTGropNikHs4Itl7Aie/0Q\naSu2uXRIGIJ24LbNftcx2HgU/b+6WquiKmP+6bV2E5iJg/4vD9AEdGbtsDUiz/uH\nNzh/EMOwZISDOg8daDhBaHT1W707TVFkducyVzLCr6nFQe9VZ02oqyXGyq8+lDXr\n6tJvqBbW5xIBl1yryw8NSTzlO4OT27/2ltk3853vGyDne1nptQRj3zRzoi1KBRt2\nf/2M6Vm6GbPiBRiFptMroAqRvcPVDBexCmIAkq1CTwKBgQDavg+LbwYPRAdPZZWS\nq02XzDnCiVvPN2MmyjFQE0BjZSFyWUmxNf3B5eA2cCAZO4AcnzcZrAi0Or9YBv4c\n6KwryvTpjddzschWHVMUE/GmYJ0bxS1L8jpXcqM3eD0pmw8YrvouvkQjfNZXZqsS\nIRmLj/btTbaGEqgP6rEcvYBIDwKBgQDZG9DEESQGXOCUUiWyF/4vyeM+HqQp2yV2\nnE7fTbEmMptY0eLvWo0DhvDTL//5p4csjeehxa1c5zeAv5by3lNhhQLQc3vx9ktG\nsEn5a+FhDWrGGuRwHMApFyHSwHYrGFIyrcbGXgDDyvuPfO7Hjxbbk0uHEA0miTV1\nvzmobsLUxwKBgGkjKT2Po4362LikzvusMqNfmfOTKfmzWYjvbjxaxHwBw82+dNkk\nZk+oOdhwlD0MiEs5i7ZfJd/azNV2pvUmMHrsFkOb1Y7aSLx7Y88mtuONVHZgk6EJ\nUFlv1Ft5A4zX5q2qjL2i3gVgn3Vq95bDKhAWpkzxKWXP2c+ssmxp+RqpAoGBAMIR\ngR2iDMe7QeyTOarmk4p4Nq8SiL/5apWJx18Ifa+dUQumIeq+0Io0lxTQNUoenFAR\nUNpkb0E6VXy6xd23KmVjlk5qzIJPHJ7FenqAGmi8q5MF+ujUaljQmqVe8MI47VDr\n0WEEKj7aNN/gVzj/CPmhwsOqb3bg826bTpjm2TWFAoGAHlN0ILju4IrfDPKqor0t\n5HE5jTWDNKMawWyvxnCsuL2m4ZcNgMCWFbXoqLiPbPBsLm90lme53pSLUrD2dUOG\nEpVSrTfSrpdAqtzeIACGzfgydsJmt1FfOQnPURaMa+7QwrJO59lKcVkCAq9ua7en\ne7W+BvcN+552je66MJjs1yo=\n-----END PRIVATE KEY-----\n" }, "rc": 0, "start": "2024-07-27 12:38:24.868746" } STDOUT: Request "20240727163822" removed. 
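The stop-tracking step above shells out to certmonger's CLI. Written as a standalone task it would look roughly like this (a sketch assembled from the cmd array and rc in the result above; the task name and use of ansible.builtin.command are assumptions, not the role's actual source):

    - name: Stop tracking certificate with certmonger  # hypothetical task, mirrors the cmd shown above
      ansible.builtin.command:
        argv:
          - getcert
          - stop-tracking
          - -f
          - /etc/pki/tls/certs/quadlet_demo.crt
      changed_when: false  # the log reports "changed": false for this command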
TASK [fedora.linux_system_roles.certificate : Remove files] ******************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:181 Saturday 27 July 2024 12:38:24 -0400 (0:00:00.572) 0:00:16.807 ********* changed: [managed_node1] => (item=/etc/pki/tls/certs/quadlet_demo.crt) => { "ansible_loop_var": "item", "changed": true, "item": "/etc/pki/tls/certs/quadlet_demo.crt", "path": "/etc/pki/tls/certs/quadlet_demo.crt", "state": "absent" } changed: [managed_node1] => (item=/etc/pki/tls/private/quadlet_demo.key) => { "ansible_loop_var": "item", "changed": true, "item": "/etc/pki/tls/private/quadlet_demo.key", "path": "/etc/pki/tls/private/quadlet_demo.key", "state": "absent" } ok: [managed_node1] => (item=/etc/pki/tls/certs/quadlet_demo.crt) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/pki/tls/certs/quadlet_demo.crt", "path": "/etc/pki/tls/certs/quadlet_demo.crt", "state": "absent" } TASK [Run the role] ************************************************************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:44 Saturday 27 July 2024 12:38:26 -0400 (0:00:01.285) 0:00:18.093 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:38:26 -0400 (0:00:00.065) 0:00:18.158 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:38:26 -0400 (0:00:00.030) 0:00:18.188 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:38:26 -0400 (0:00:00.027) 0:00:18.216 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:38:26 -0400 (0:00:00.424) 0:00:18.641 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:38:26 -0400 (0:00:00.025) 0:00:18.666 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } ok: [managed_node1] => (item=Fedora.yml) => { "ansible_facts": { "__podman_packages": [ 
"iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node1] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:38:26 -0400 (0:00:00.046) 0:00:18.712 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:38:28 -0400 (0:00:01.315) 0:00:20.028 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.040) 0:00:20.069 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.044) 0:00:20.113 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.034162", "end": "2024-07-27 12:38:28.667232", "rc": 0, "start": "2024-07-27 12:38:28.633070" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.475) 0:00:20.588 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.037) 0:00:20.626 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.037) 0:00:20.664 ********* skipping: [managed_node1] => { "changed": 
false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.046) 0:00:20.711 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:38:28 -0400 (0:00:00.052) 0:00:20.764 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:38:29 -0400 (0:00:00.096) 0:00:20.861 ********* ok: [managed_node1] => { "ansible_facts": { "getent_passwd": { "root": [ "x", "0", "0", "Super User", "/root", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:38:29 -0400 (0:00:00.553) 0:00:21.414 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:38:29 -0400 (0:00:00.040) 0:00:21.455 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:38:29 -0400 (0:00:00.050) 0:00:21.505 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.449) 0:00:21.955 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.045) 0:00:22.001 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": 
"binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.448) 0:00:22.449 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.035) 0:00:22.485 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.036) 0:00:22.521 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.035) 0:00:22.556 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.033) 0:00:22.590 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.034) 0:00:22.625 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:38:30 -0400 
(0:00:00.034) 0:00:22.659 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.034) 0:00:22.694 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.034) 0:00:22.728 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:38:30 -0400 (0:00:00.068) 0:00:22.796 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.072) 0:00:22.869 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.064) 0:00:22.933 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.035) 0:00:22.969 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.072) 0:00:23.041 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: 
/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.035) 0:00:23.077 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.036) 0:00:23.113 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.074) 0:00:23.187 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.036) 0:00:23.223 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.035) 0:00:23.259 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.077) 0:00:23.336 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.035) 0:00:23.372 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.035) 0:00:23.408 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.035) 0:00:23.444 
********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.036) 0:00:23.480 ********* included: fedora.linux_system_roles.firewall for managed_node1 TASK [fedora.linux_system_roles.firewall : Setup firewalld] ******************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:2 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.156) 0:00:23.636 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for managed_node1 TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.068) 0:00:23.704 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Check if system is ostree] ********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:10 Saturday 27 July 2024 12:38:31 -0400 (0:00:00.044) 0:00:23.748 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.firewall : Set flag to indicate system is ostree] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:15 Saturday 27 July 2024 12:38:32 -0400 (0:00:00.433) 0:00:24.182 ********* ok: [managed_node1] => { "ansible_facts": { "__firewall_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.firewall : Check if transactional-update exists in /sbin] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:22 Saturday 27 July 2024 12:38:32 -0400 (0:00:00.042) 0:00:24.224 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.firewall : Set flag if transactional-update exists] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:27 Saturday 27 July 2024 12:38:32 -0400 (0:00:00.434) 0:00:24.659 ********* ok: [managed_node1] => { "ansible_facts": { "__firewall_is_transactional": false }, "changed": false } TASK [fedora.linux_system_roles.firewall : Install firewalld] ****************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31 Saturday 27 July 2024 12:38:32 -0400 (0:00:00.040) 0:00:24.700 ********* ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: firewalld TASK [fedora.linux_system_roles.firewall : Notify user that reboot is needed to apply changes] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:43 Saturday 27 July 2024 12:38:34 -0400 (0:00:01.551) 0:00:26.251 ********* skipping: [managed_node1] => { "false_condition": "__firewall_is_transactional | d(false)" } 
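The two transactional-update tasks above follow the same probe-then-flag pattern the role used earlier for ostree: stat a marker path, then derive a boolean fact from the result. A minimal sketch of that pattern in task YAML, assuming a register name of __firewall_transactional_update_stat (the log only shows the resulting fact, __firewall_is_transactional):

    # Probe-then-flag sketch; task names match the log, the register name is an assumption.
    - name: Check if transactional-update exists in /sbin
      ansible.builtin.stat:
        path: /sbin/transactional-update
      register: __firewall_transactional_update_stat

    - name: Set flag if transactional-update exists
      ansible.builtin.set_fact:
        __firewall_is_transactional: "{{ __firewall_transactional_update_stat.stat.exists }}"

Downstream tasks can then gate on the flag with when: __firewall_is_transactional | d(false), which is exactly the false_condition reported for the skipped reboot tasks that follow.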
TASK [fedora.linux_system_roles.firewall : Reboot transactional update systems] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:48 Saturday 27 July 2024 12:38:34 -0400 (0:00:00.037) 0:00:26.289 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_is_transactional | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Fail if reboot is needed and not set] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:53 Saturday 27 July 2024 12:38:34 -0400 (0:00:00.036) 0:00:26.325 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_is_transactional | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Collect service facts] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:5 Saturday 27 July 2024 12:38:34 -0400 (0:00:00.036) 0:00:26.361 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Attempt to stop and disable conflicting services] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9 Saturday 27 July 2024 12:38:34 -0400 (0:00:00.034) 0:00:26.396 ********* skipping: [managed_node1] => (item=nftables) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "nftables", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=iptables) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "iptables", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=ufw) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "ufw", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.firewall : Unmask firewalld service] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:22 Saturday 27 July 2024 12:38:34 -0400 (0:00:00.043) 0:00:26.439 ********* ok: [managed_node1] => { "changed": false, "name": "firewalld", "status": { "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestamp": "Sat 2024-07-27 12:32:07 EDT", "ActiveEnterTimestampMonotonic": "465722898", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket sysinit.target polkit.service dbus-broker.service basic.target system.slice", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Sat 2024-07-27 12:32:06 EDT", "AssertTimestampMonotonic": "464443775", "Before": "network-pre.target multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": 
"686061000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ConditionTimestampMonotonic": "464443771", "ConfigurationDirectoryMode": "0755", "Conflicts": "ip6tables.service shutdown.target iptables.service ipset.service nftables.service ebtables.service", "ControlGroup": "/system.slice/firewalld.service", "ControlGroupId": "5823", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "24057", "ExecMainStartTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ExecMainStartTimestampMonotonic": "464452173", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": 
"yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sat 2024-07-27 12:32:06 EDT", "InactiveExitTimestampMonotonic": "464452493", "InvocationID": "4454cf7d3a6b4bae82a01fc96c89405e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14725", "LimitNPROCSoft": "14725", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14725", "LimitSIGPENDINGSoft": "14725", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "24057", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3359748096", "MemoryCurrent": "33288192", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "33816576", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "0", "MemorySwapMax": "infinity", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", 
"RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestamp": "Sat 2024-07-27 12:38:21 EDT", "StateChangeTimestampMonotonic": "839753683", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "2", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:28 Saturday 27 July 2024 12:38:35 -0400 (0:00:00.645) 0:00:27.084 ********* ok: [managed_node1] => { "changed": false, "enabled": true, "name": "firewalld", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestamp": "Sat 2024-07-27 12:32:07 EDT", "ActiveEnterTimestampMonotonic": "465722898", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket sysinit.target polkit.service dbus-broker.service basic.target system.slice", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Sat 2024-07-27 12:32:06 EDT", "AssertTimestampMonotonic": "464443775", "Before": "network-pre.target multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "686061000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner 
cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ConditionTimestampMonotonic": "464443771", "ConfigurationDirectoryMode": "0755", "Conflicts": "ip6tables.service shutdown.target iptables.service ipset.service nftables.service ebtables.service", "ControlGroup": "/system.slice/firewalld.service", "ControlGroupId": "5823", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "24057", "ExecMainStartTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ExecMainStartTimestampMonotonic": "464452173", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sat 2024-07-27 12:32:06 EDT", "InactiveExitTimestampMonotonic": "464452493", "InvocationID": "4454cf7d3a6b4bae82a01fc96c89405e", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14725", "LimitNPROCSoft": "14725", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14725", "LimitSIGPENDINGSoft": "14725", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "24057", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3358425088", "MemoryCurrent": "33288192", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "33816576", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "0", "MemorySwapMax": "infinity", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestamp": "Sat 2024-07-27 12:38:21 EDT", "StateChangeTimestampMonotonic": "839753683", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "2", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:34 Saturday 27 July 2024 12:38:35 -0400 (0:00:00.646) 0:00:27.731 ********* ok: [managed_node1] => { "ansible_facts": { "__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/bin/python3.12", "__firewall_report_changed": true }, "changed": false } TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:43 Saturday 27 July 2024 12:38:35 -0400 (0:00:00.048) 0:00:27.779 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Tell firewall module it is able to report changed] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:55 Saturday 27 July 2024 12:38:35 -0400 (0:00:00.033) 0:00:27.813 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Configure firewall] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71 Saturday 27 July 2024 12:38:36 -0400 
(0:00:00.064) 0:00:27.878 ********* changed: [managed_node1] => (item={'port': '8000/tcp', 'state': 'enabled'}) => { "__firewall_changed": true, "ansible_loop_var": "item", "changed": true, "item": { "port": "8000/tcp", "state": "enabled" } } changed: [managed_node1] => (item={'port': '9000/tcp', 'state': 'enabled'}) => { "__firewall_changed": true, "ansible_loop_var": "item", "changed": true, "item": { "port": "9000/tcp", "state": "enabled" } } TASK [fedora.linux_system_roles.firewall : Gather firewall config information] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:120 Saturday 27 July 2024 12:38:37 -0400 (0:00:01.267) 0:00:29.145 ********* skipping: [managed_node1] => (item={'port': '8000/tcp', 'state': 'enabled'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall | length == 1", "item": { "port": "8000/tcp", "state": "enabled" }, "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item={'port': '9000/tcp', 'state': 'enabled'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall | length == 1", "item": { "port": "9000/tcp", "state": "enabled" }, "skip_reason": "Conditional result was False" } skipping: [managed_node1] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] ******* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:130 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.057) 0:00:29.202 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall | length == 1", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Gather firewall config if no arguments] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:139 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.039) 0:00:29.241 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall == None or firewall | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] ******* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:144 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.036) 0:00:29.278 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall == None or firewall | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:153 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.037) 0:00:29.315 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Calculate what has changed] ********* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:163 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.033) 0:00:29.349 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Show diffs] ************************* task path: 
/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:169 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.035) 0:00:29.384 ********* skipping: [managed_node1] => { "false_condition": "__firewall_previous_replaced | bool" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.055) 0:00:29.440 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.037) 0:00:29.477 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.035) 0:00:29.513 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.033) 0:00:29.547 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.031) 0:00:29.579 ********* fatal: [managed_node1]: FAILED! => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result" } TASK [Dump journal] ************************************************************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:124 Saturday 27 July 2024 12:38:37 -0400 (0:00:00.034) 0:00:29.613 ********* fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": [ "journalctl", "-ex" ], "delta": "0:00:00.052310", "end": "2024-07-27 12:38:38.178868", "failed_when_result": true, "rc": 0, "start": "2024-07-27 12:38:38.126558" } STDOUT: Jul 27 12:38:11 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:11 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:11 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58723 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:12 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[58741]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58753 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58770 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58786 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58808 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:14 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58824 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58841 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:15 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[58859]: ansible-ansible.legacy.dnf Invoked with name=['certmonger', 'python3-packaging'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com dbus-broker-launch[611]: Noticed file-system modification, trigger reload. 
░░ Subject: A configuration directory was written to ░░ Defined-By: dbus-broker ░░ Support: https://groups.google.com/forum/#!forum/bus1-devel ░░ ░░ A write was detected to one of the directories containing D-Bus configuration ░░ files, triggering a configuration reload. ░░ ░░ This functionality exists for backwards compatibility to pick up changes to ░░ D-Bus configuration without an explicit reload request. Typically, installing ░░ or removing third-party software causes D-Bus configuration files to be added ░░ or removed. ░░ ░░ It is worth noting that this may cause partial configuration to be loaded in ░░ case dispatching this notification races with the writing of the configuration ░░ files. However, a future notification will then cause the configuration to be ░░ reloaded again. Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com dbus-broker-launch[611]: Noticed file-system modification, trigger reload. Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com dbus-broker-launch[611]: Noticed file-system modification, trigger reload. Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading requested from client PID 58870 ('systemctl') (unit session-5.scope)... Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com systemd-getty-generator[58903]: Failed to parse $SYSTEMD_GETTY_AUTO environment variable, ignoring: Permission denied Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 255 ms.
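The audit: BPF prog-id LOAD/UNLOAD pairs that follow appear to be a side effect of the daemon-reload just logged: systemd reattaches the per-unit cgroup BPF programs it manages, and each swap is audited. When reading a dump like this, one way to cut that noise is to scope journalctl to the unit under test; a hypothetical narrowed variant of the test's Dump journal task (which the log shows running journalctl -ex):

    # Illustrative only; the real task in tests_quadlet_demo.yml dumps the full journal.
    - name: Dump journal for firewalld only
      ansible.builtin.command:
        cmd: journalctl -ex -u firewalld
      changed_when: false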
Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=309 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=290 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=310 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=291 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=311 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=292 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=312 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=313 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=293 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=294 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=314 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=315 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=295 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=296 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=316 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=297 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=317 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=318 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=298 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=299 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=319 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=300 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=320 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=321 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=301 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=302 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=322 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=303 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=323 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=304 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=324 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=305 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=325 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=306 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=326 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=327 op=LOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=307 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=308 op=UNLOAD Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Starting dnf-makecache.service - dnf makecache... 
░░ Subject: A start job for unit dnf-makecache.service has begun execution ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit dnf-makecache.service has begun execution. ░░ ░░ The job identifier is 2542. Jul 27 12:38:17 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Failed determining last makecache time. Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Started run-r27ddfa06f22144fc806fbd9118b2679a.service - /usr/bin/systemctl start man-db-cache-update. ░░ Subject: A start job for unit run-r27ddfa06f22144fc806fbd9118b2679a.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit run-r27ddfa06f22144fc806fbd9118b2679a.service has finished successfully. ░░ ░░ The job identifier is 2632. Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=run-r27ddfa06f22144fc806fbd9118b2679a comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Starting man-db-cache-update.service... ░░ Subject: A start job for unit man-db-cache-update.service has begun execution ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit man-db-cache-update.service has begun execution. ░░ ░░ The job identifier is 2722. Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading requested from client PID 58929 ('systemctl') (unit session-5.scope)... Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Beaker Client - Fedora40 9.6 kB/s | 1.5 kB 00:00 Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Beaker harness 87 kB/s | 1.3 kB 00:00 Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd-getty-generator[58964]: Failed to parse $SYSTEMD_GETTY_AUTO environment variable, ignoring: Permission denied Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Fedora 40 - x86_64 260 kB/s | 28 kB 00:00 Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 457 ms. 
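The run-r27ddfa06f22144fc806fbd9118b2679a.service unit above is a transient unit: the run-<id>.service naming is what systemd-run produces, here wrapping systemctl start man-db-cache-update after the package transaction. A sketch of triggering the same rebuild by hand, under the assumption that systemd-run is indeed how it was launched:

    # Assumption: the run-r... unit seen above was created with systemd-run.
    - name: Kick off a man-db cache rebuild without waiting for it
      ansible.builtin.command:
        cmd: systemd-run --no-block systemctl start man-db-cache-update
      changed_when: true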
Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=328 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=309 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=329 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=310 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=330 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=311 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=331 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=332 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=312 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=313 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=333 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=334 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=314 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=315 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=335 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=316 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=336 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=337 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=317 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=318 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=338 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=319 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=339 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=340 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=320 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=321 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=341 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=322 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=342 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=323 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=343 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=324 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=344 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=325 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=345 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=346 op=LOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=326 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=327 op=UNLOAD Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Queuing reload/restart jobs for marked units… Jul 27 12:38:18 ip-10-31-41-232.us-east-1.aws.redhat.com 
dnf[58918]: Fedora 40 openh264 (From Cisco) - x86_64 8.4 kB/s | 989 B 00:00 Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Fedora 40 - x86_64 - Updates 94 kB/s | 27 kB 00:00 Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=58991 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59008 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59024 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: man-db-cache-update.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ The unit man-db-cache-update.service has successfully entered the 'dead' state. Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Finished man-db-cache-update.service. ░░ Subject: A start job for unit man-db-cache-update.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit man-db-cache-update.service has finished successfully. ░░ ░░ The job identifier is 2722. Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=man-db-cache-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=man-db-cache-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=run-r27ddfa06f22144fc806fbd9118b2679a comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: run-r27ddfa06f22144fc806fbd9118b2679a.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ The unit run-r27ddfa06f22144fc806fbd9118b2679a.service has successfully entered the 'dead' state. 
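Both man-db-cache-update.service and its transient run-r27ddfa06f22144fc806fbd9118b2679a.service wrapper enter the 'dead' state here, meaning the man page cache rebuild kicked off by the package install has completed. A play that needed to wait for such a unit could poll it explicitly; a hedged sketch (the task name, retries, and delay are invented for illustration):

  - name: Wait for man-db-cache-update to finish (hypothetical probe)
    ansible.builtin.command: systemctl is-active man-db-cache-update.service
    register: mandb_state
    until: mandb_state.stdout in ['inactive', 'failed']
    retries: 30
    delay: 2
    changed_when: false
    failed_when: false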
Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59051 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Copr repo for qa-tools owned by lpol 33 kB/s | 1.8 kB 00:00 Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com dnf[58918]: Metadata cache created. Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59067 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: dnf-makecache.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ The unit dnf-makecache.service has successfully entered the 'dead' state. Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Finished dnf-makecache.service - dnf makecache. 
░░ Subject: A start job for unit dnf-makecache.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit dnf-makecache.service has finished successfully. ░░ ░░ The job identifier is 2542. Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=dnf-makecache comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=dnf-makecache comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: dnf-makecache.service: Consumed 1.404s CPU time, 30.8M memory peak, 0B memory swap peak. ░░ Subject: Resources consumed by unit runtime ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ The unit dnf-makecache.service completed and consumed the indicated resources. Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:19 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59085 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59103]: ansible-file Invoked with name=/etc/certmonger//pre-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//pre-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59104 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59121 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59137 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? 
res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59159 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59175 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59192 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59210]: ansible-file Invoked with name=/etc/certmonger//post-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//post-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59211 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59228 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59244 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:20 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59266 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59282 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59299 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59317]: ansible-ansible.legacy.systemd Invoked with name=certmonger state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading requested from client PID 59320 ('systemctl') (unit session-5.scope)... Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com systemd-getty-generator[59351]: Failed to parse $SYSTEMD_GETTY_AUTO environment variable, ignoring: Permission denied Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 254 ms. 
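The ansible.legacy.systemd invocation just above (name=certmonger state=started enabled=True, system scope) is what requests the daemon reload and the certmonger.service start recorded next. Reconstructed as a playbook task, using only the parameters visible in the log:

  - name: Ensure certmonger is enabled and running
    ansible.builtin.systemd:
      name: certmonger
      state: started
      enabled: true

A few entries further on, the certificate_request module call (name=quadlet_demo, dns=['localhost'], ca=self-sign) corresponds to certificate role input along the lines of:

  certificate_requests:
    - name: quadlet_demo
      dns: ['localhost']
      ca: self-sign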
Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=347 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=328 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=348 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=329 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=349 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=330 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=350 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=351 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=331 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=332 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=352 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=353 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=333 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=334 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=354 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=335 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=355 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=356 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=336 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=337 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=357 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=338 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=358 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=359 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=339 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=340 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=360 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=341 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=361 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=342 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=362 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=343 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=363 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=344 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=364 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=365 op=LOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=345 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit: BPF prog-id=346 op=UNLOAD Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Starting certmonger.service - Certificate monitoring and PKI enrollment... 
░░ Subject: A start job for unit certmonger.service has begun execution ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit certmonger.service has begun execution. ░░ ░░ The job identifier is 2812. Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com (rtmonger)[59370]: certmonger.service: Referenced but unset environment variable evaluates to an empty string: OPTS Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com systemd[1]: Started certmonger.service - Certificate monitoring and PKI enrollment. ░░ Subject: A start job for unit certmonger.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel ░░ ░░ A start job for unit certmonger.service has finished successfully. ░░ ░░ The job identifier is 2812. Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=certmonger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:21 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59383 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59412 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59428 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59453 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59469 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59486 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? 
res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59504]: ansible-fedora.linux_system_roles.certificate_request Invoked with name=quadlet_demo dns=['localhost'] directory=/etc/pki/tls wait=True ca=self-sign __header=# # Ansible managed # # system_role:certificate provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None key_size=None owner=None group=None mode=None principal=None run_before=None run_after=None Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com certmonger[59370]: 2024-07-27 12:38:22 [59370] Wrote to /var/lib/certmonger/requests/20240727163822 Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com certmonger[59519]: Certificate in file "/etc/pki/tls/certs/quadlet_demo.crt" issued by CA and saved. Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com certmonger[59370]: 2024-07-27 12:38:22 [59370] Wrote to /var/lib/certmonger/requests/20240727163822 Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:22 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59520 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=?
res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59537 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59553 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59575 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59591 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=59608 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? 
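The certmonger activity above corresponds to the test's "Generate certificates" step: certmonger tracks the request, issues the certificate, and saves it to /etc/pki/tls/certs/quadlet_demo.crt. A minimal sketch of the kind of role invocation that produces this tracking (the dns value and CA choice are assumptions, not shown in this log):

    - name: Generate certificates
      hosts: managed_node1
      vars:
        certificate_requests:
          - name: quadlet_demo    # yields /etc/pki/tls/certs/quadlet_demo.crt and the matching key
            dns: [localhost]      # assumed subject alternative name
            ca: self-sign         # assumed local self-signing CA, matching "issued by CA" above
      roles:
        - fedora.linux_system_roles.certificate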
Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59626]: ansible-slurp Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt src=/etc/pki/tls/certs/quadlet_demo.crt
[repeated sshd audit entries omitted]
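The ansible-slurp invocation above (and the one for the private key that follows) reads the issued files back base64-encoded so the test can verify their contents. A hedged sketch of an equivalent task, with an assumed register name:

    - name: Read back the issued certificate for verification
      ansible.builtin.slurp:
        path: /etc/pki/tls/certs/quadlet_demo.crt
      register: __slurped_cert   # hypothetical name; decode with b64decode before comparing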
Jul 27 12:38:23 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59733]: ansible-slurp Invoked with path=/etc/pki/tls/private/quadlet_demo.key src=/etc/pki/tls/private/quadlet_demo.key
[repeated sshd audit entries omitted]
Jul 27 12:38:24 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59840]: ansible-slurp Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt src=/etc/pki/tls/certs/quadlet_demo.crt
[repeated sshd audit entries omitted]
Jul 27 12:38:24 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[59947]: ansible-ansible.legacy.command Invoked with _raw_params=getcert stop-tracking -f /etc/pki/tls/certs/quadlet_demo.crt _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:38:24 ip-10-31-41-232.us-east-1.aws.redhat.com certmonger[59370]: 2024-07-27 12:38:24 [59370] Wrote to /var/lib/certmonger/requests/20240727163822
[repeated sshd audit entries omitted]
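The getcert stop-tracking call above removes the certmonger tracking entry for the test certificate before the files themselves are deleted; the log shows it executed through ansible.legacy.command with _uses_shell=False. A minimal sketch of an equivalent task:

    - name: Stop tracking the test certificate with certmonger
      ansible.builtin.command:
        cmd: getcert stop-tracking -f /etc/pki/tls/certs/quadlet_demo.crt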
Jul 27 12:38:25 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60055]: ansible-file Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
[repeated sshd audit entries omitted]
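The ansible-file invocation above with state=absent deletes the generated certificate as part of test cleanup; the matching invocation for the private key follows. A minimal equivalent covering both files:

    - name: Remove generated certificate artifacts
      ansible.builtin.file:
        path: "{{ item }}"
        state: absent
      loop:
        - /etc/pki/tls/certs/quadlet_demo.crt
        - /etc/pki/tls/private/quadlet_demo.key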
Jul 27 12:38:25 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60162]: ansible-file Invoked with path=/etc/pki/tls/private/quadlet_demo.key state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
[repeated sshd audit entries omitted]
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:25 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:25 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:25 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:25 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=60251 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60269]: ansible-file Invoked with path=/etc/pki/tls/certs/quadlet_demo.crt state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=60270 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
Jul 27 12:38:26 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60376]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
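The ansible-stat event above probes /run/ostree-booted, the conventional marker file for an rpm-ostree (image-based) system; roles typically use it to skip or adjust package-management steps. A minimal sketch of such a check, with an assumed register name:

    - name: Check if system is ostree (illustrative)
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_stat

The registered result's stat.exists field can then drive a boolean fact for later conditionals.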
Jul 27 12:38:28 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60590]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
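The command event above runs podman --version, the usual probe for gating version-dependent behavior. A sketch of such a task (register name illustrative):

    - name: Get podman version (illustrative)
      ansible.builtin.command: podman --version
      register: __podman_version
      changed_when: false

Marking a read-only probe changed_when: false keeps it from showing up as a change in the play recap.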
Jul 27 12:38:29 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60703]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
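The ansible-getent event above resolves the passwd entry for root. A sketch using the module options recorded in the log (task name is an assumption):

    - name: Look up the target user (illustrative)
      ansible.builtin.getent:
        database: passwd
        key: root
        fail_key: false

With fail_key: false a missing key does not fail the task; the results land in the getent_passwd fact for later use.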
Jul 27 12:38:30 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60811]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
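The companion lookup resolves GID 0 (root's primary group) from the group database, following the same pattern:

    - name: Look up the target user's group (illustrative)
      ansible.builtin.getent:
        database: group
        key: "0"
        fail_key: false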
Jul 27 12:38:30 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[60919]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
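The ansible-stat probe of /usr/bin/getsubids checks whether the shadow-utils subid tooling is present; getsubids lists a user's subordinate UID ranges, which matters when validating rootless Podman setups. A minimal sketch (register name illustrative):

    - name: Check if the getsubids tool is available (illustrative)
      ansible.builtin.stat:
        path: /usr/bin/getsubids
      register: __stat_getsubids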
res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61028]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61029 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61046 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61062 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61084 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61100 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61117 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61135]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=/dev/pts/0 res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61136 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? 
res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61153 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61169 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? 
addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61191 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:32 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61207 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? res=success' Jul 27 12:38:33 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_END pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:33 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGOUT pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=ssh res=success' Jul 27 12:38:33 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_LOGIN pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:33 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: USER_START pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login id=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=/dev/pts/0 res=success' Jul 27 12:38:33 ip-10-31-41-232.us-east-1.aws.redhat.com audit[1454]: CRYPTO_KEY_USER pid=1454 uid=0 auid=0 ses=5 subj=system_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=destroy kind=server fp=SHA256:2c:e0:8c:d0:3c:b8:58:59:e3:d5:f1:58:32:8b:7e:96:30:2a:7c:1f:27:32:d8:c8:ca:1a:a8:dd:9c:8a:3e:dd direction=? spid=61224 suid=0 exe="/usr/sbin/sshd" hostname=? addr=10.31.12.130 terminal=? 
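The two stat probes above are how the role detects ostree-based and transactional-update-based systems before choosing a package-management path. A minimal sketch of tasks that would produce these invocations (task and register names are illustrative, not the role's actual source):

    - name: Check if system is ostree          # probes /run/ostree-booted, as logged above
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat           # illustrative name

    - name: Check if system uses transactional-update
      ansible.builtin.stat:
        path: /sbin/transactional-update
      register: __transactional_update_stat    # illustrative name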
Jul 27 12:38:33 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61242]: ansible-ansible.legacy.dnf Invoked with name=['firewalld'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
[repeated sshd audit cycles omitted]
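The dnf invocation above (name=['firewalld'] state=present) is what a plain package task produces; a sketch:

    - name: Ensure firewalld is installed
      ansible.builtin.dnf:
        name: firewalld
        state: present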
Jul 27 12:38:35 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61350]: ansible-systemd Invoked with name=firewalld masked=False daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
[repeated sshd audit cycles omitted]
Jul 27 12:38:35 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61459]: ansible-ansible.legacy.systemd Invoked with name=firewalld state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
[repeated sshd audit cycles omitted]
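The two systemd calls above map to an unmask step (masked=False) followed by an enable-and-start step (state=started enabled=True); roughly, as a sketch:

    - name: Unmask firewalld
      ansible.builtin.systemd:
        name: firewalld
        masked: false

    - name: Enable and start firewalld
      ansible.builtin.systemd:
        name: firewalld
        state: started
        enabled: true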
Jul 27 12:38:36 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61568]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['8000/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Jul 27 12:38:36 ip-10-31-41-232.us-east-1.aws.redhat.com audit[24057]: NETFILTER_CFG table=firewalld:95 family=1 entries=1 op=nft_register_rule pid=24057 subj=system_u:system_r:firewalld_t:s0 comm="firewalld"
[repeated sshd audit cycles omitted]
Jul 27 12:38:37 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61675]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['9000/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Jul 27 12:38:37 ip-10-31-41-232.us-east-1.aws.redhat.com audit[24057]: NETFILTER_CFG table=firewalld:96 family=1 entries=1 op=nft_register_rule pid=24057 subj=system_u:system_r:firewalld_t:s0 comm="firewalld"
[repeated sshd audit cycles omitted]
res=success'
Jul 27 12:38:38 ip-10-31-41-232.us-east-1.aws.redhat.com python3.12[61782]: ansible-ansible.legacy.command Invoked with _raw_params=journalctl -ex _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None

TASK [Check] *******************************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:130
Saturday 27 July 2024 12:38:38 -0400 (0:00:00.508) 0:00:30.122 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "-a" ], "delta": "0:00:00.041625", "end": "2024-07-27 12:38:38.678727", "rc": 0, "start": "2024-07-27 12:38:38.637102" }

STDOUT:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES

TASK [Check pods] **************************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:134
Saturday 27 July 2024 12:38:38 -0400 (0:00:00.476) 0:00:30.599 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "pod", "ps", "--ctr-ids", "--ctr-names", "--ctr-status" ], "delta": "0:00:00.040443", "end": "2024-07-27 12:38:39.153915", "failed_when_result": false, "rc": 0, "start": "2024-07-27 12:38:39.113472" }

STDOUT:
POD ID NAME STATUS CREATED INFRA ID IDS NAMES STATUS

TASK [Check systemd] ***********************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:139
Saturday 27 July 2024 12:38:39 -0400 (0:00:00.475) 0:00:31.075 *********
ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail; systemctl list-units --all | grep quadlet", "delta": "0:00:00.015739", "end": "2024-07-27 12:38:39.606967", "failed_when_result": false, "rc": 1, "start": "2024-07-27 12:38:39.591228" }

MSG:
non-zero return code

TASK [LS] **********************************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:147
Saturday 27 July 2024 12:38:39 -0400 (0:00:00.451) 0:00:31.526 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-alrtF", "/etc/systemd/system" ], "delta": "0:00:00.005119", "end": "2024-07-27 12:38:40.043130", "failed_when_result": false, "rc": 0, "start": "2024-07-27 12:38:40.038011" }

STDOUT:
total 56
drwxr-xr-x. 5 root root 4096 Jul 24 05:01 ../
drwxr-xr-x. 2 root root 4096 Jul 24 05:01 systemd-journald.service.wants/
lrwxrwxrwx. 1 root root 48 Jul 24 05:01 dbus-org.freedesktop.resolve1.service -> /usr/lib/systemd/system/systemd-resolved.service
drwxr-xr-x. 2 root root 4096 Jul 24 05:01 getty.target.wants/
lrwxrwxrwx. 1 root root 43 Jul 24 05:01 dbus.service -> /usr/lib/systemd/system/dbus-broker.service
lrwxrwxrwx. 1 root root 37 Jul 24 05:01 ctrl-alt-del.target -> /usr/lib/systemd/system/reboot.target
drwxr-xr-x. 2 root root 4096 Jul 24 05:01 systemd-homed.service.wants/
lrwxrwxrwx. 1 root root 44 Jul 24 05:01 dbus-org.freedesktop.oom1.service -> /usr/lib/systemd/system/systemd-oomd.service
lrwxrwxrwx. 1 root root 45 Jul 24 05:01 dbus-org.freedesktop.home1.service -> /usr/lib/systemd/system/systemd-homed.service
lrwxrwxrwx. 1 root root 41 Jul 24 05:02 dbus-org.bluez.service -> /usr/lib/systemd/system/bluetooth.service
drwxr-xr-x. 2 root root 4096 Jul 24 05:02 bluetooth.target.wants/
drwxr-xr-x. 2 root root 4096 Jul 24 05:02 graphical.target.wants/
drwxr-xr-x. 2 root root 4096 Jul 24 05:02 timers.target.wants/
drwxr-xr-x. 2 root root 4096 Jul 24 05:02 network-online.target.wants/
lrwxrwxrwx. 1 root root 57 Jul 24 05:02 dbus-org.freedesktop.nm-dispatcher.service -> /usr/lib/systemd/system/NetworkManager-dispatcher.service
lrwxrwxrwx. 1 root root 41 Jul 24 05:06 default.target -> /usr/lib/systemd/system/multi-user.target
drwxr-xr-x. 2 root root 4096 Jul 24 05:17 remote-fs.target.wants/
drwxr-xr-x. 2 root root 4096 Jul 24 05:18 cloud-init.target.wants/
drwxr-xr-x. 2 root root 4096 Jul 24 05:18 sockets.target.wants/
drwxr-xr-x. 2 root root 4096 Jul 24 05:18 sysinit.target.wants/
lrwxrwxrwx. 1 root root 41 Jul 27 12:32 dbus-org.fedoraproject.FirewallD1.service -> /usr/lib/systemd/system/firewalld.service
drwxr-xr-x. 14 root root 4096 Jul 27 12:37 ./
drwxr-xr-x. 2 root root 4096 Jul 27 12:38 multi-user.target.wants/

TASK [Cleanup] *****************************************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:154
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.469) 0:00:31.996 *********
included: fedora.linux_system_roles.podman for managed_node1

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.086) 0:00:32.083 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.067) 0:00:32.151 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.045) 0:00:32.196 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.037) 0:00:32.232 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.037) 0:00:32.269 *********
ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" }
ok: [managed_node1] => (item=Fedora.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" }
skipping: [managed_node1] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node1] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Gather the package facts] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Saturday 27 July 2024 12:38:40 -0400 (0:00:00.079) 0:00:32.349 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Enable copr if requested] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10
Saturday 27 July 2024 12:38:41 -0400 (0:00:01.105) 0:00:33.454 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Saturday 27 July 2024 12:38:41 -0400 (0:00:00.036) 0:00:33.490 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get podman version] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22
Saturday 27 July 2024 12:38:41 -0400 (0:00:00.041) 0:00:33.531 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.032326", "end": "2024-07-27 12:38:42.085040", "rc": 0, "start": "2024-07-27 12:38:42.052714" }

STDOUT:
podman version 5.1.2

TASK [fedora.linux_system_roles.podman : Set podman version] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.475) 0:00:34.007 *********
ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.038) 0:00:34.045 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.066) 0:00:34.112 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.044) 0:00:34.156 *********
META: end_host conditional evaluated to False, continuing execution for managed_node1
skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" }

MSG:
end_host conditional evaluated to false, continuing execution for managed_node1

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.049) 0:00:34.206 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.079) 0:00:34.285 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.040) 0:00:34.326 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.042) 0:00:34.369 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:38:42 -0400 (0:00:00.049) 0:00:34.418 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.450) 0:00:34.869 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.047) 0:00:34.917 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.447) 0:00:35.364 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.035) 0:00:35.400 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.033) 0:00:35.434 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.035) 0:00:35.469 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.034) 0:00:35.503 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.033) 0:00:35.536 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.034) 0:00:35.570 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.034) 0:00:35.604 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.064) 0:00:35.669 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.068) 0:00:35.737 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 27 July 2024 12:38:43 -0400 (0:00:00.074) 0:00:35.812 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.036) 0:00:35.848 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.036) 0:00:35.884 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.075) 0:00:35.960 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.037) 0:00:35.998 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.035) 0:00:36.033 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.079) 0:00:36.113 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.036) 0:00:36.149 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.036) 0:00:36.186 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.079) 0:00:36.266 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.037) 0:00:36.304 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.036) 0:00:36.340 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.067) 0:00:36.408 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.038) 0:00:36.446 *********
included: fedora.linux_system_roles.firewall for managed_node1

TASK [fedora.linux_system_roles.firewall : Setup firewalld] ********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:2
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.130) 0:00:36.577 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for managed_node1

TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.068) 0:00:36.646 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Check if system is ostree] **********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:10
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.044) 0:00:36.690 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __firewall_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Set flag to indicate system is ostree] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:15
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.037) 0:00:36.727 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __firewall_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Check if transactional-update exists in /sbin] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:22
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.036) 0:00:36.764 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __firewall_is_transactional is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Set flag if transactional-update exists] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:27
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.037) 0:00:36.801 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __firewall_is_transactional is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Install firewalld] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31
Saturday 27 July 2024 12:38:44 -0400 (0:00:00.037) 0:00:36.839 *********
ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] }

MSG:
Nothing to do
lsrpackages: firewalld

TASK [fedora.linux_system_roles.firewall : Notify user that reboot is needed to apply changes] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:43
Saturday 27 July 2024 12:38:46 -0400 (0:00:01.526) 0:00:38.366 *********
skipping: [managed_node1] => { "false_condition": "__firewall_is_transactional | d(false)" }

TASK [fedora.linux_system_roles.firewall : Reboot transactional update systems] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:48
Saturday 27 July 2024 12:38:46 -0400 (0:00:00.038) 0:00:38.404 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_is_transactional | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Fail if reboot is needed and not set] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:53
Saturday 27 July 2024 12:38:46 -0400 (0:00:00.036) 0:00:38.441 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_is_transactional | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Collect service facts] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:5
Saturday 27 July 2024 12:38:46 -0400 (0:00:00.034) 0:00:38.478 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Attempt to stop and disable conflicting services] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9
Saturday 27 July 2024 12:38:46 -0400 (0:00:00.034) 0:00:38.512 *********
skipping: [managed_node1] => (item=nftables) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "nftables", "skip_reason": "Conditional result was False" }
skipping: [managed_node1] => (item=iptables) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "iptables", "skip_reason": "Conditional result was False" }
skipping: [managed_node1] => (item=ufw) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "ufw", "skip_reason": "Conditional result was False" }
skipping: [managed_node1] => { "changed": false }

MSG:
All items skipped

TASK [fedora.linux_system_roles.firewall : Unmask firewalld service] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:22
Saturday 27 July 2024 12:38:46 -0400 (0:00:00.043) 0:00:38.556 *********
ok: [managed_node1] => { "changed": false, "name": "firewalld", "status": { "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestamp": "Sat 2024-07-27 12:32:07 EDT", "ActiveEnterTimestampMonotonic": "465722898", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket sysinit.target polkit.service dbus-broker.service basic.target system.slice", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Sat 2024-07-27 12:32:06
EDT", "AssertTimestampMonotonic": "464443775", "Before": "network-pre.target multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "778474000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ConditionTimestampMonotonic": "464443771", "ConfigurationDirectoryMode": "0755", "Conflicts": "ip6tables.service shutdown.target iptables.service ipset.service nftables.service ebtables.service", "ControlGroup": "/system.slice/firewalld.service", "ControlGroupId": "5823", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "24057", "ExecMainStartTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ExecMainStartTimestampMonotonic": "464452173", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", 
"GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sat 2024-07-27 12:32:06 EDT", "InactiveExitTimestampMonotonic": "464452493", "InvocationID": "4454cf7d3a6b4bae82a01fc96c89405e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14725", "LimitNPROCSoft": "14725", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14725", "LimitSIGPENDINGSoft": "14725", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "24057", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3350880256", "MemoryCurrent": "33296384", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "33816576", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "0", "MemorySwapMax": "infinity", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": 
"no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestamp": "Sat 2024-07-27 12:38:21 EDT", "StateChangeTimestampMonotonic": "839753683", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "2", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:28 Saturday 27 July 2024 12:38:47 -0400 (0:00:00.677) 0:00:39.233 ********* ok: [managed_node1] => { "changed": false, "enabled": true, "name": "firewalld", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestamp": "Sat 2024-07-27 12:32:07 EDT", "ActiveEnterTimestampMonotonic": "465722898", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket sysinit.target polkit.service dbus-broker.service basic.target system.slice", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Sat 2024-07-27 12:32:06 EDT", "AssertTimestampMonotonic": "464443775", "Before": "network-pre.target multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", 
"CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "778474000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ConditionTimestampMonotonic": "464443771", "ConfigurationDirectoryMode": "0755", "Conflicts": "ip6tables.service shutdown.target iptables.service ipset.service nftables.service ebtables.service", "ControlGroup": "/system.slice/firewalld.service", "ControlGroupId": "5823", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "24057", "ExecMainStartTimestamp": "Sat 2024-07-27 12:32:06 EDT", "ExecMainStartTimestampMonotonic": "464452173", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": 
"[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Sat 2024-07-27 12:32:06 EDT", "InactiveExitTimestampMonotonic": "464452493", "InvocationID": "4454cf7d3a6b4bae82a01fc96c89405e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14725", "LimitNPROCSoft": "14725", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14725", "LimitSIGPENDINGSoft": "14725", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "24057", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3360010240", "MemoryCurrent": "33296384", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "33816576", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "0", "MemorySwapMax": "infinity", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", 
"RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestamp": "Sat 2024-07-27 12:38:21 EDT", "StateChangeTimestampMonotonic": "839753683", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "2", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:34 Saturday 27 July 2024 12:38:48 -0400 (0:00:00.647) 0:00:39.881 ********* ok: [managed_node1] => { "ansible_facts": { "__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/bin/python3.12", "__firewall_report_changed": true }, "changed": false } TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:43 Saturday 27 July 2024 12:38:48 -0400 (0:00:00.048) 0:00:39.929 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Tell firewall module it is able to report changed] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:55 Saturday 27 July 2024 12:38:48 -0400 (0:00:00.035) 0:00:39.965 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Configure firewall] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71 Saturday 27 July 2024 12:38:48 -0400 (0:00:00.034) 0:00:39.999 ********* ok: [managed_node1] => (item={'port': '8000/tcp', 'state': 'enabled'}) => { "__firewall_changed": false, "ansible_loop_var": "item", "changed": false, "item": { "port": "8000/tcp", "state": "enabled" } } ok: [managed_node1] => (item={'port': '9000/tcp', 'state': 'enabled'}) => { "__firewall_changed": false, "ansible_loop_var": "item", "changed": false, "item": { "port": "9000/tcp", "state": "enabled" } } TASK [fedora.linux_system_roles.firewall : Gather firewall config information] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:120 Saturday 27 July 2024 12:38:49 -0400 (0:00:01.107) 0:00:41.107 ********* skipping: [managed_node1] => (item={'port': '8000/tcp', 'state': 'enabled'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall | length == 1", "item": { "port": "8000/tcp", "state": "enabled" }, "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item={'port': '9000/tcp', 'state': 'enabled'}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall | length == 1", "item": { "port": "9000/tcp", "state": "enabled" }, "skip_reason": "Conditional result was False" } skipping: [managed_node1] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] ******* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:130 Saturday 27 July 2024 12:38:49 -0400 (0:00:00.056) 0:00:41.163 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall | length == 1", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Gather firewall config if no arguments] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:139 Saturday 27 July 2024 12:38:49 -0400 (0:00:00.039) 0:00:41.203 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall == None or firewall | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] ******* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:144 Saturday 27 July 2024 12:38:49 -0400 (0:00:00.038) 0:00:41.241 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "firewall == None or firewall | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:153 Saturday 27 July 2024 12:38:49 -0400 (0:00:00.038) 0:00:41.280 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Calculate what has changed] ********* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:163 Saturday 27 July 2024 12:38:49 -0400 (0:00:00.036) 0:00:41.316 
TASK [fedora.linux_system_roles.firewall : Calculate what has changed] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:163
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.036) 0:00:41.316 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Show diffs] *************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:169
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.034) 0:00:41.351 *********
skipping: [managed_node1] => { "false_condition": "__firewall_previous_replaced | bool" }

TASK [Manage selinux for specified ports] **************************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.056) 0:00:41.407 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.035) 0:00:41.443 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.037) 0:00:41.480 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.032) 0:00:41.513 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.032) 0:00:41.545 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml for managed_node1 => (item=(censored due to no_log))

TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.146) 0:00:41.692 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_user": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.042) 0:00:41.734 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_rootless": false, "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:13
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.045) 0:00:41.780 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:38:49 -0400 (0:00:00.060) 0:00:41.840 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:38:50 -0400 (0:00:00.035) 0:00:41.876 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:38:50 -0400 (0:00:00.034) 0:00:41.911 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:18
Saturday 27 July 2024 12:38:50 -0400 (0:00:00.036) 0:00:41.947 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:34
Saturday 27 July 2024 12:38:50 -0400 (0:00:00.033) 0:00:41.981 *********
[WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe)
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Set variables part 1] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3
Saturday 27 July 2024 12:38:50 -0400 (0:00:00.611) 0:00:42.593 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_user": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set variables part 2] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7
Saturday 27 July 2024 12:38:50 -0400 (0:00:00.040) 0:00:42.634 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_rootless": false, "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }
"changed": false } TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:13 Saturday 27 July 2024 12:38:50 -0400 (0:00:00.048) 0:00:42.682 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Enable linger if needed] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12 Saturday 27 July 2024 12:38:50 -0400 (0:00:00.059) 0:00:42.742 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:38:50 -0400 (0:00:00.034) 0:00:42.777 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:38:50 -0400 (0:00:00.035) 0:00:42.813 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:18 Saturday 27 July 2024 12:38:51 -0400 (0:00:00.035) 0:00:42.848 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Manage each secret] ******************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:34 Saturday 27 July 2024 12:38:51 -0400 (0:00:00.034) 0:00:42.882 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Set variables part 1] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:3 Saturday 27 July 2024 12:38:51 -0400 (0:00:00.534) 0:00:43.417 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set variables part 2] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:7 Saturday 27 July 2024 12:38:51 -0400 (0:00:00.043) 0:00:43.460 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_rootless": false, "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:13 Saturday 27 July 2024 12:38:51 -0400 (0:00:00.046) 0:00:43.506 ********* 
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:13
Saturday 27 July 2024 12:38:51 -0400 (0:00:00.046) 0:00:43.506 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:38:51 -0400 (0:00:00.062) 0:00:43.569 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:38:51 -0400 (0:00:00.036) 0:00:43.605 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:38:51 -0400 (0:00:00.036) 0:00:43.642 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:18
Saturday 27 July 2024 12:38:51 -0400 (0:00:00.036) 0:00:43.678 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Manage each secret] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_secret.yml:34
Saturday 27 July 2024 12:38:51 -0400 (0:00:00.036) 0:00:43.714 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129
Saturday 27 July 2024 12:38:52 -0400 (0:00:00.498) 0:00:44.213 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136
Saturday 27 July 2024 12:38:52 -0400 (0:00:00.033) 0:00:44.246 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))

TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 27 July 2024 12:38:52 -0400 (0:00:00.185) 0:00:44.432 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "quadlet-demo.kube", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Install]\nWantedBy=default.target\n\n[Unit]\nRequires=quadlet-demo-mysql.service\nAfter=quadlet-demo-mysql.service\n\n[Kube]\n# Point to the yaml file in the same directory\nYaml=quadlet-demo.yml\n# Use the quadlet-demo network\nNetwork=quadlet-demo.network\n# Publish the envoy proxy data port\nPublishPort=8000:8080\n# Publish the envoy proxy admin port\nPublishPort=9000:9901\n# Use the envoy proxy config map in the same directory\nConfigMap=envoy-proxy-configmap.yml", "__podman_quadlet_template_src": "" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 27 July 2024 12:38:52 -0400 (0:00:00.052) 0:00:44.484 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 27 July 2024 12:38:52 -0400 (0:00:00.046) 0:00:44.530 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_quadlet_file_src", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 27 July 2024 12:38:52 -0400 (0:00:00.037) 0:00:44.568 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "quadlet-demo", "__podman_quadlet_type": "kube", "__podman_rootless": false }, "changed": false }
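[NOTE]: Rendered with real newlines, the "__podman_quadlet_str" fact above is the quadlet-demo.kube unit file the role is processing:

    [Install]
    WantedBy=default.target

    [Unit]
    Requires=quadlet-demo-mysql.service
    After=quadlet-demo-mysql.service

    [Kube]
    # Point to the yaml file in the same directory
    Yaml=quadlet-demo.yml
    # Use the quadlet-demo network
    Network=quadlet-demo.network
    # Publish the envoy proxy data port
    PublishPort=8000:8080
    # Publish the envoy proxy admin port
    PublishPort=9000:9901
    # Use the envoy proxy config map in the same directory
    ConfigMap=envoy-proxy-configmap.yml

Because "__podman_state" is "absent", the role is tearing this unit down rather than installing it: the tasks that follow stop the generated service, remove /etc/containers/systemd/quadlet-demo.kube, and prune the images it used.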
ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:38:52 -0400 (0:00:00.041) 0:00:44.772 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:38:52 -0400 (0:00:00.042) 0:00:44.814 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:38:53 -0400 (0:00:00.051) 0:00:44.866 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:38:53 -0400 (0:00:00.451) 0:00:45.318 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:38:53 -0400 (0:00:00.049) 0:00:45.368 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:38:53 -0400 (0:00:00.449) 0:00:45.817 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.035) 0:00:45.853 ********* skipping: [managed_node1] => { 
"changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.035) 0:00:45.888 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.034) 0:00:45.923 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.035) 0:00:45.958 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.034) 0:00:45.993 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.036) 0:00:46.029 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.035) 0:00:46.064 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:38:54 -0400 (0:00:00.033) 0:00:46.098 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": [ "quadlet-demo.yml" ], "__podman_service_name": "quadlet-demo.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: 
/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.062) 0:00:46.160 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.038) 0:00:46.199 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.070) 0:00:46.270 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.kube", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.088) 0:00:46.358 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.046) 0:00:46.405 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.092) 0:00:46.497 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:38:54 -0400 (0:00:00.036) 0:00:46.534 *********
ok: [managed_node1] => { "changed": false, "failed_when_result": false }
MSG: Could not find the requested service quadlet-demo.service: host

TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:38:55 -0400 (0:00:00.633) 0:00:47.167 *********
ok: [managed_node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:38:55 -0400 (0:00:00.436) 0:00:47.604 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_stat.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:38:55 -0400 (0:00:00.035) 0:00:47.640 *********
ok: [managed_node1] => { "changed": false, "path": "/etc/containers/systemd/quadlet-demo.kube", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:38:56 -0400 (0:00:00.445) 0:00:48.085 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_file_removed is changed", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:38:56 -0400 (0:00:00.039) 0:00:48.125 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:38:56 -0400 (0:00:00.038) 0:00:48.164 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:38:56 -0400 (0:00:00.053) 0:00:48.217 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:38:56 -0400 (0:00:00.037) 0:00:48.254 *********
changed: [managed_node1] => { "changed": true, "cmd": [ "podman", "image", "prune", "--all", "-f" ], "delta": "0:00:00.069583", "end": "2024-07-27 12:38:56.846733", "rc": 0, "start": "2024-07-27 12:38:56.777150" }
STDOUT:
3e8810767c96155f67e76e7b2c8786073155e002d60d6743e1a9acf758026347
9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f
"false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:38:57 -0400 (0:00:00.035) 0:00:48.870 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:38:57 -0400 (0:00:00.071) 0:00:48.942 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - images] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137 Saturday 27 July 2024 12:38:57 -0400 (0:00:00.036) 0:00:48.978 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "images", "-n" ], "delta": "0:00:00.039127", "end": "2024-07-27 12:38:57.540844", "rc": 0, "start": "2024-07-27 12:38:57.501717" } TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146 Saturday 27 July 2024 12:38:57 -0400 (0:00:00.485) 0:00:49.464 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "volume", "ls", "-n" ], "delta": "0:00:00.039685", "end": "2024-07-27 12:38:58.029544", "rc": 0, "start": "2024-07-27 12:38:57.989859" } TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155 Saturday 27 July 2024 12:38:58 -0400 (0:00:00.485) 0:00:49.949 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "--noheading" ], "delta": "0:00:00.041361", "end": "2024-07-27 12:38:58.517536", "rc": 0, "start": "2024-07-27 12:38:58.476175" } TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164 Saturday 27 July 2024 12:38:58 -0400 (0:00:00.490) 0:00:50.440 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "network", "ls", "-n", "-q" ], "delta": "0:00:00.039907", "end": "2024-07-27 12:38:59.008669", "rc": 0, "start": "2024-07-27 12:38:58.968762" } STDOUT: podman podman-default-kube-network TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173 Saturday 27 July 2024 12:38:59 -0400 (0:00:00.492) 0:00:50.932 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : For testing and debugging - services] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 Saturday 27 
July 2024 12:38:59 -0400 (0:00:00.487) 0:00:51.420 ********* ok: [managed_node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "audit-rules.service": { "name": "audit-rules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "certmonger.service": { "name": "certmonger.service", "source": "systemd", "state": "running", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.fedoraproject.FirewallD1.service": { "name": "dbus-org.fedoraproject.FirewallD1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", 
"source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd.service": { "name": "dhcpcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd@.service": { "name": "dhcpcd@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ipset.service": { "name": "ipset.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "netavark-dhcp-proxy.service": { "name": "netavark-dhcp-proxy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "netavark-firewalld-reload.service": { "name": "netavark-firewalld-reload.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "passim.service": { "name": "passim.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "podman-auto-update.service": { "name": "podman-auto-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-clean-transient.service": { "name": "podman-clean-transient.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-kube@.service": { "name": "podman-kube@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "podman-restart.service": { "name": "podman-restart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman.service": { "name": "podman.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-battery-check.service": { "name": "systemd-battery-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": 
"systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-bsod.service": { "name": "systemd-bsod.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-confext.service": { "name": "systemd-confext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume.service": { "name": "systemd-hibernate-resume.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrextend@.service": { "name": "systemd-pcrextend@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrlock-file-system.service": { "name": "systemd-pcrlock-file-system.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-code.service": { "name": "systemd-pcrlock-firmware-code.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-config.service": { "name": "systemd-pcrlock-firmware-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-machine-id.service": { "name": "systemd-pcrlock-machine-id.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-make-policy.service": { "name": "systemd-pcrlock-make-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-authority.service": { "name": "systemd-pcrlock-secureboot-authority.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-policy.service": { "name": "systemd-pcrlock-secureboot-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-soft-reboot.service": { "name": "systemd-soft-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-storagetm.service": { "name": "systemd-storagetm.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-sysext@.service": { "name": "systemd-sysext@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev-early.service": { "name": "systemd-tmpfiles-setup-dev-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup-early.service": { "name": "systemd-tpm2-setup-early.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-tpm2-setup.service": { "name": "systemd-tpm2-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:39:02 -0400 (0:00:03.205) 0:00:54.626 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:39:02 -0400 (0:00:00.040) 0:00:54.666 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "---\napiVersion: v1\nkind: PersistentVolumeClaim\nmetadata:\n name: wp-pv-claim\n labels:\n app: wordpress\nspec:\n accessModes:\n - ReadWriteOnce\n resources:\n requests:\n 
storage: 20Gi\n---\napiVersion: v1\nkind: Pod\nmetadata:\n name: quadlet-demo\nspec:\n containers:\n - name: wordpress\n image: quay.io/linux-system-roles/wordpress:4.8-apache\n env:\n - name: WORDPRESS_DB_HOST\n value: quadlet-demo-mysql\n - name: WORDPRESS_DB_PASSWORD\n valueFrom:\n secretKeyRef:\n name: mysql-root-password-kube\n key: password\n volumeMounts:\n - name: wordpress-persistent-storage\n mountPath: /var/www/html\n resources:\n requests:\n memory: \"64Mi\"\n cpu: \"250m\"\n limits:\n memory: \"128Mi\"\n cpu: \"500m\"\n - name: envoy\n image: quay.io/linux-system-roles/envoyproxy:v1.25.0\n volumeMounts:\n - name: config-volume\n mountPath: /etc/envoy\n - name: certificates\n mountPath: /etc/envoy-certificates\n env:\n - name: ENVOY_UID\n value: \"0\"\n resources:\n requests:\n memory: \"64Mi\"\n cpu: \"250m\"\n limits:\n memory: \"128Mi\"\n cpu: \"500m\"\n volumes:\n - name: config-volume\n configMap:\n name: envoy-proxy-config\n - name: certificates\n secret:\n secretName: envoy-certificates\n - name: wordpress-persistent-storage\n persistentVolumeClaim:\n claimName: wp-pv-claim\n - name: www # not used - for testing hostpath\n hostPath:\n path: /tmp/httpd3\n - name: create # not used - for testing hostpath\n hostPath:\n path: /tmp/httpd3-create\n", "__podman_quadlet_template_src": "quadlet-demo.yml.j2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:39:02 -0400 (0:00:00.114) 0:00:54.781 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:39:02 -0400 (0:00:00.053) 0:00:54.835 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_quadlet_str", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.045) 0:00:54.880 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "quadlet-demo", "__podman_quadlet_type": "yml", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.062) 0:00:54.943 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.085) 0:00:55.028 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in 
ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.095) 0:00:55.124 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.047) 0:00:55.172 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.059) 0:00:55.231 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.461) 0:00:55.693 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:39:03 -0400 (0:00:00.057) 0:00:55.750 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.455) 0:00:56.206 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.040) 0:00:56.246 ********* skipping: [managed_node1] => { 
"changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.040) 0:00:56.286 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.039) 0:00:56.326 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.039) 0:00:56.366 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.040) 0:00:56.406 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.039) 0:00:56.446 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.039) 0:00:56.486 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.038) 0:00:56.524 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:39:04 
-0400 (0:00:00.070) 0:00:56.595 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.040) 0:00:56.682 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.095) 0:00:56.722 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.yml", "__podman_volumes": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103 Saturday 27 July 2024 12:39:04 -0400 (0:00:00.051) 0:00:56.818 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Cleanup quadlets] ********************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110 Saturday 27 July 2024 12:39:05 -0400 (0:00:00.098) 0:00:56.870 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4 Saturday 27 July 2024 12:39:05 -0400 (0:00:00.039) 0:00:56.968 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stop and disable service] ************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12 Saturday 27 July 2024 12:39:05 -0400 (0:00:00.039) 0:00:57.007 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_service_name | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : See if quadlet file exists] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33 Saturday 27 July 2024 12:39:05 -0400 (0:00:00.044) 0:00:57.051 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Parse quadlet file] ******************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38 Saturday 27 July 2024 12:39:05 -0400 (0:00:00.441) 0:00:57.493 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_stat.stat.exists", "skip_reason": "Conditional result was False" }
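Note: __podman_quadlet_path above is the system-wide quadlet directory; root quadlets are read from /etc/containers/systemd, while rootless ones live under ~/.config/containers/systemd. A Kube YAML such as quadlet-demo.yml only becomes a service through a companion *.kube unit in the same directory (not shown in this log) whose [Kube] section points at it with Yaml=quadlet-demo.yml. A hypothetical post-cleanup verification task, assuming the generated unit would be named quadlet-demo.service, could be:

- name: Verify the generated quadlet service is gone (illustrative)
  # For a unit that no longer exists, systemctl reports
  # LoadState=not-found and still exits 0, so we assert on stdout.
  ansible.builtin.command: systemctl show --property=LoadState quadlet-demo.service
  register: __load_state
  changed_when: false
  failed_when: "'not-found' not in __load_state.stdout"

TASK 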
[fedora.linux_system_roles.podman : Remove quadlet file] ****************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42 Saturday 27 July 2024 12:39:05 -0400 (0:00:00.450) 0:00:57.533 ********* ok: [managed_node1] => { "changed": false, "path": "/etc/containers/systemd/quadlet-demo.yml", "state": "absent" } TASK [fedora.linux_system_roles.podman : Refresh systemd] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.450) 0:00:57.984 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_file_removed is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Remove managed resource] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.042) 0:00:58.027 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Remove volumes] *********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.042) 0:00:58.070 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] ********* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.055) 0:00:58.125 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false } TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ******** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.040) 0:00:58.166 ********* changed: [managed_node1] => { "changed": true, "cmd": [ "podman", "image", "prune", "--all", "-f" ], "delta": "0:00:00.039678", "end": "2024-07-27 12:39:06.730872", "rc": 0, "start": "2024-07-27 12:39:06.691194" } TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.529) 0:00:58.696 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Enable linger if needed] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.040) 0:00:58.772 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }
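Note: the linger tasks are skipped because __podman_rootless is false in this run. For a rootless user, lingering keeps that user's systemd instance, and therefore the quadlet-generated user services, running after logout. A minimal illustrative equivalent (a hypothetical task, not the role's exact code; "someuser" is a placeholder):

- name: Enable linger for a rootless podman user (illustrative)
  # loginctl enable-linger records the flag as a file under
  # /var/lib/systemd/linger/, so "creates:" makes this idempotent.
  ansible.builtin.command: loginctl enable-linger someuser
  args:
    creates: /var/lib/systemd/linger/someuser

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: 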
/tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:39:06 -0400 (0:00:00.040) 0:00:58.813 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:39:07 -0400 (0:00:00.040) 0:00:58.853 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - images] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137 Saturday 27 July 2024 12:39:07 -0400 (0:00:00.041) 0:00:58.895 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "images", "-n" ], "delta": "0:00:00.039610", "end": "2024-07-27 12:39:07.461691", "rc": 0, "start": "2024-07-27 12:39:07.422081" } TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146 Saturday 27 July 2024 12:39:07 -0400 (0:00:00.492) 0:00:59.387 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "volume", "ls", "-n" ], "delta": "0:00:00.039650", "end": "2024-07-27 12:39:07.952693", "rc": 0, "start": "2024-07-27 12:39:07.913043" } TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155 Saturday 27 July 2024 12:39:08 -0400 (0:00:00.490) 0:00:59.878 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "--noheading" ], "delta": "0:00:00.040236", "end": "2024-07-27 12:39:08.446746", "rc": 0, "start": "2024-07-27 12:39:08.406510" } TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164 Saturday 27 July 2024 12:39:08 -0400 (0:00:00.494) 0:01:00.372 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "network", "ls", "-n", "-q" ], "delta": "0:00:00.039568", "end": "2024-07-27 12:39:08.939154", "rc": 0, "start": "2024-07-27 12:39:08.899586" } STDOUT: podman podman-default-kube-network TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173 Saturday 27 July 2024 12:39:09 -0400 (0:00:00.493) 0:01:00.866 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : For testing and debugging - services] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 Saturday 27 July 2024 12:39:09 -0400 (0:00:00.490) 0:01:01.356 ********* ok: [managed_node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": 
"NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "audit-rules.service": { "name": "audit-rules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "certmonger.service": { "name": "certmonger.service", "source": "systemd", "state": "running", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.fedoraproject.FirewallD1.service": { "name": "dbus-org.fedoraproject.FirewallD1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": 
"systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd.service": { "name": "dhcpcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd@.service": { "name": "dhcpcd@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "running", "status": "enabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ipset.service": { "name": "ipset.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "netavark-dhcp-proxy.service": { "name": "netavark-dhcp-proxy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "netavark-firewalld-reload.service": { "name": "netavark-firewalld-reload.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "passim.service": { "name": "passim.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "podman-auto-update.service": { "name": "podman-auto-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-clean-transient.service": { "name": "podman-clean-transient.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-kube@.service": { "name": "podman-kube@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "podman-restart.service": { "name": "podman-restart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman.service": { "name": "podman.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": 
"selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-battery-check.service": { "name": "systemd-battery-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { 
"name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-bsod.service": { "name": "systemd-bsod.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-confext.service": { "name": "systemd-confext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume.service": { "name": "systemd-hibernate-resume.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrextend@.service": { "name": "systemd-pcrextend@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrlock-file-system.service": { "name": "systemd-pcrlock-file-system.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-code.service": { "name": "systemd-pcrlock-firmware-code.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-config.service": { "name": "systemd-pcrlock-firmware-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-machine-id.service": { "name": "systemd-pcrlock-machine-id.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-make-policy.service": { "name": "systemd-pcrlock-make-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-authority.service": { "name": "systemd-pcrlock-secureboot-authority.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-policy.service": { "name": "systemd-pcrlock-secureboot-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-soft-reboot.service": { "name": "systemd-soft-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-storagetm.service": { "name": "systemd-storagetm.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-sysext@.service": { "name": "systemd-sysext@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev-early.service": { "name": "systemd-tmpfiles-setup-dev-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup-early.service": { "name": "systemd-tpm2-setup-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup.service": { "name": "systemd-tpm2-setup.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:39:12 -0400 (0:00:02.987) 0:01:04.343 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.039) 0:01:04.383 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "envoy-proxy-configmap.yml", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "---\napiVersion: v1\nkind: ConfigMap\nmetadata:\n name: envoy-proxy-config\ndata:\n envoy.yaml: |\n admin:\n address:\n socket_address:\n address: 0.0.0.0\n port_value: 9901\n\n static_resources:\n listeners:\n - name: listener_0\n address:\n socket_address:\n address: 0.0.0.0\n 
port_value: 8080\n filter_chains:\n - filters:\n - name: envoy.filters.network.http_connection_manager\n typed_config:\n \"@type\": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager\n stat_prefix: ingress_http\n codec_type: AUTO\n route_config:\n name: local_route\n virtual_hosts:\n - name: local_service\n domains: [\"*\"]\n routes:\n - match:\n prefix: \"/\"\n route:\n cluster: backend\n http_filters:\n - name: envoy.filters.http.router\n typed_config:\n \"@type\": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router\n transport_socket:\n name: envoy.transport_sockets.tls\n typed_config:\n \"@type\": type.googleapis.com/envoy.extensions.transport_sockets.tls.v3.DownstreamTlsContext\n common_tls_context:\n tls_certificates:\n - certificate_chain:\n filename: /etc/envoy-certificates/certificate.pem\n private_key:\n filename: /etc/envoy-certificates/certificate.key\n clusters:\n - name: backend\n connect_timeout: 5s\n type: STATIC\n dns_refresh_rate: 1800s\n lb_policy: ROUND_ROBIN\n load_assignment:\n cluster_name: backend\n endpoints:\n - lb_endpoints:\n - endpoint:\n address:\n socket_address:\n address: 127.0.0.1\n port_value: 80", "__podman_quadlet_template_src": "" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.055) 0:01:04.438 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.089) 0:01:04.528 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_quadlet_file_src", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.045) 0:01:04.573 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "envoy-proxy-configmap", "__podman_quadlet_type": "yml", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.063) 0:01:04.637 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.082) 0:01:04.719 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } 
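NOTE: the escaped __podman_quadlet_str in the "Set per-container variables part 0" result above is a Kubernetes-style ConfigMap wrapping an Envoy bootstrap config. Unescaped for readability (indentation restored by hand, since the log collapses whitespace, and the HttpConnectionManager routing stanza abbreviated to a comment; the full text is in the escaped string above), it reads roughly:

  ---
  apiVersion: v1
  kind: ConfigMap
  metadata:
    name: envoy-proxy-config
  data:
    envoy.yaml: |
      admin:
        address:
          socket_address: { address: 0.0.0.0, port_value: 9901 }
      static_resources:
        listeners:
        - name: listener_0
          address:
            socket_address: { address: 0.0.0.0, port_value: 8080 }
          filter_chains:
          - filters:
            - name: envoy.filters.network.http_connection_manager
              # typed_config elided here: an HttpConnectionManager that routes
              # every request (domains ["*"], prefix "/") to cluster "backend"
            transport_socket:
              name: envoy.transport_sockets.tls
              typed_config:
                "@type": type.googleapis.com/envoy.extensions.transport_sockets.tls.v3.DownstreamTlsContext
                common_tls_context:
                  tls_certificates:
                  - certificate_chain:
                      filename: /etc/envoy-certificates/certificate.pem
                    private_key:
                      filename: /etc/envoy-certificates/certificate.key
        clusters:
        - name: backend
          connect_timeout: 5s
          type: STATIC
          dns_refresh_rate: 1800s
          lb_policy: ROUND_ROBIN
          load_assignment:
            cluster_name: backend
            endpoints:
            - lb_endpoints:
              - endpoint:
                  address:
                    socket_address: { address: 127.0.0.1, port_value: 80 }

Since the "part 1" result above sets __podman_state to "absent", this pass only uses the spec to derive the file to delete (envoy-proxy-configmap.yml under /etc/containers/systemd, as "Set per-container variables part 5" further down confirms).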
TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.048) 0:01:04.768 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:39:12 -0400 (0:00:00.047) 0:01:04.816 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:39:13 -0400 (0:00:00.058) 0:01:04.874 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:39:13 -0400 (0:00:00.457) 0:01:05.332 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:39:13 -0400 (0:00:00.056) 0:01:05.388 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:39:13 -0400 (0:00:00.452) 0:01:05.841 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.040) 0:01:05.881 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.038) 0:01:05.920 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.039) 0:01:05.959 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.079) 0:01:06.038 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.041) 0:01:06.080 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.041) 0:01:06.122 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.040) 0:01:06.162 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.039) 0:01:06.202 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.068) 0:01:06.271 ********* ok: [managed_node1] => { 
"ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.044) 0:01:06.315 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.039) 0:01:06.355 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/envoy-proxy-configmap.yml", "__podman_volumes": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.092) 0:01:06.448 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Cleanup quadlets] ********************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.049) 0:01:06.497 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.100) 0:01:06.597 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stop and disable service] ************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.040) 0:01:06.637 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_service_name | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : See if quadlet file exists] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33 Saturday 27 July 2024 12:39:14 -0400 (0:00:00.043) 0:01:06.681 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Parse quadlet file] ******************* task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38 Saturday 27 July 2024 12:39:15 -0400 (0:00:00.442) 0:01:07.123 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_stat.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Remove quadlet file] ****************** task 
TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:39:15 -0400 (0:00:00.082) 0:01:07.205 *********
ok: [managed_node1] => { "changed": false, "path": "/etc/containers/systemd/envoy-proxy-configmap.yml", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:39:15 -0400 (0:00:00.453) 0:01:07.658 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_file_removed is changed", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:39:15 -0400 (0:00:00.043) 0:01:07.701 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:39:15 -0400 (0:00:00.041) 0:01:07.743 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:39:15 -0400 (0:00:00.057) 0:01:07.801 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:39:16 -0400 (0:00:00.044) 0:01:07.845 *********
changed: [managed_node1] => { "changed": true, "cmd": [ "podman", "image", "prune", "--all", "-f" ], "delta": "0:00:00.039547", "end": "2024-07-27 12:39:16.412393", "rc": 0, "start": "2024-07-27 12:39:16.372846" }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:39:16 -0400 (0:00:00.493) 0:01:08.339 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:39:16 -0400 (0:00:00.072) 0:01:08.412 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:39:16
-0400 (0:00:00.040) 0:01:08.453 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:39:16 -0400 (0:00:00.039) 0:01:08.492 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - images] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137 Saturday 27 July 2024 12:39:16 -0400 (0:00:00.038) 0:01:08.531 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "images", "-n" ], "delta": "0:00:00.040356", "end": "2024-07-27 12:39:17.101610", "rc": 0, "start": "2024-07-27 12:39:17.061254" } TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146 Saturday 27 July 2024 12:39:17 -0400 (0:00:00.496) 0:01:09.028 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "volume", "ls", "-n" ], "delta": "0:00:00.041605", "end": "2024-07-27 12:39:17.597688", "rc": 0, "start": "2024-07-27 12:39:17.556083" } TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155 Saturday 27 July 2024 12:39:17 -0400 (0:00:00.495) 0:01:09.523 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "--noheading" ], "delta": "0:00:00.039402", "end": "2024-07-27 12:39:18.091876", "rc": 0, "start": "2024-07-27 12:39:18.052474" } TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164 Saturday 27 July 2024 12:39:18 -0400 (0:00:00.537) 0:01:10.061 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "network", "ls", "-n", "-q" ], "delta": "0:00:00.039936", "end": "2024-07-27 12:39:18.628650", "rc": 0, "start": "2024-07-27 12:39:18.588714" } STDOUT: podman podman-default-kube-network TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173 Saturday 27 July 2024 12:39:18 -0400 (0:00:00.493) 0:01:10.554 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : For testing and debugging - services] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 Saturday 27 July 2024 12:39:19 -0400 (0:00:00.492) 0:01:11.047 ********* ok: [managed_node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "audit-rules.service": { "name": "audit-rules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "certmonger.service": { "name": "certmonger.service", "source": "systemd", "state": "running", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.fedoraproject.FirewallD1.service": { "name": "dbus-org.fedoraproject.FirewallD1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", 
"source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd.service": { "name": "dhcpcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd@.service": { "name": "dhcpcd@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "running", "status": "enabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": 
"fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ipset.service": { "name": "ipset.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": 
"mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "netavark-dhcp-proxy.service": { "name": "netavark-dhcp-proxy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "netavark-firewalld-reload.service": { "name": "netavark-firewalld-reload.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "passim.service": { "name": "passim.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { 
"name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "podman-auto-update.service": { "name": "podman-auto-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-clean-transient.service": { "name": "podman-clean-transient.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-kube@.service": { "name": "podman-kube@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "podman-restart.service": { "name": "podman-restart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman.service": { "name": "podman.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", 
"source": "systemd", "state": "unknown", "status": "disabled" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-battery-check.service": { "name": "systemd-battery-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": 
"systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-bsod.service": { "name": "systemd-bsod.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-confext.service": { "name": "systemd-confext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume.service": { "name": "systemd-hibernate-resume.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrextend@.service": { "name": "systemd-pcrextend@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrlock-file-system.service": { "name": "systemd-pcrlock-file-system.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-code.service": { "name": "systemd-pcrlock-firmware-code.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-config.service": { "name": "systemd-pcrlock-firmware-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-machine-id.service": { "name": "systemd-pcrlock-machine-id.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-make-policy.service": { "name": "systemd-pcrlock-make-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-authority.service": { "name": "systemd-pcrlock-secureboot-authority.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-policy.service": { "name": "systemd-pcrlock-secureboot-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-soft-reboot.service": { "name": "systemd-soft-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-storagetm.service": { "name": "systemd-storagetm.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-sysext@.service": { "name": "systemd-sysext@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev-early.service": { "name": "systemd-tmpfiles-setup-dev-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup-early.service": { "name": "systemd-tpm2-setup-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup.service": { "name": "systemd-tpm2-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:39:22 -0400 (0:00:03.004) 0:01:14.051 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.039) 0:01:14.091 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Install]\nWantedBy=default.target\n\n[Container]\nImage=quay.io/linux-system-roles/mysql:5.6\nContainerName=quadlet-demo-mysql\nVolume=quadlet-demo-mysql.volume:/var/lib/mysql\nVolume=/tmp/quadlet_demo:/var/lib/quadlet_demo:Z\nNetwork=quadlet-demo.network\nSecret=mysql-root-password-container,type=env,target=MYSQL_ROOT_PASSWORD\nHealthCmd=/bin/true\nHealthOnFailure=kill\n", "__podman_quadlet_template_src": "quadlet-demo-mysql.container.j2" 
}, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.113) 0:01:14.205 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.052) 0:01:14.258 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_quadlet_str", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.042) 0:01:14.300 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "quadlet-demo-mysql", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.063) 0:01:14.364 ********* included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.083) 0:01:14.447 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.048) 0:01:14.496 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.045) 0:01:14.541 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:39:22 -0400 (0:00:00.056) 0:01:14.598 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] 
TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:39:23 -0400 (0:00:00.500) 0:01:15.098 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:39:23 -0400 (0:00:00.054) 0:01:15.153 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:39:23 -0400 (0:00:00.453) 0:01:15.606 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:39:23 -0400 (0:00:00.041) 0:01:15.648 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:39:23 -0400 (0:00:00.041) 0:01:15.690 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:39:23 -0400 (0:00:00.040) 0:01:15.731 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }
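The getsubids checks above are skipped because __podman_user is root, and the /etc/subuid and /etc/subgid fallbacks are skipped because /usr/bin/getsubids exists (stat'ed above). For a rootless user the role would query the subid ranges through that binary instead of parsing the files by hand. A minimal standalone sketch of that pattern, with hypothetical task names and a hypothetical podman_user variable, not the role's actual tasks:

    - name: Check subuid ranges for a rootless user (illustrative)
      ansible.builtin.command: getsubids {{ podman_user }}
      register: __subuid_out
      changed_when: false

    - name: Check subgid ranges for a rootless user (illustrative)
      ansible.builtin.command: getsubids -g {{ podman_user }}
      register: __subgid_out
      changed_when: false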
"not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:39:23 -0400 (0:00:00.041) 0:01:15.814 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:39:24 -0400 (0:00:00.040) 0:01:15.854 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:39:24 -0400 (0:00:00.040) 0:01:15.895 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:39:24 -0400 (0:00:00.040) 0:01:15.935 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [ "quay.io/linux-system-roles/mysql:5.6" ], "__podman_kube_yamls_raw": "", "__podman_service_name": "quadlet-demo-mysql.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:39:24 -0400 (0:00:00.070) 0:01:16.006 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:39:24 -0400 (0:00:00.044) 0:01:16.051 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:39:24 -0400 (0:00:00.038) 0:01:16.090 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [ "quay.io/linux-system-roles/mysql:5.6" ], "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo-mysql.container", "__podman_volumes": [ "/tmp/quadlet_demo" ] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: 
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:39:24 -0400 (0:00:00.096) 0:01:16.186 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:39:24 -0400 (0:00:00.095) 0:01:16.281 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:39:24 -0400 (0:00:00.100) 0:01:16.381 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:39:24 -0400 (0:00:00.040) 0:01:16.422 *********
ok: [managed_node1] => { "changed": false, "failed_when_result": false }
MSG:
Could not find the requested service quadlet-demo-mysql.service: host

TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:39:25 -0400 (0:00:00.637) 0:01:17.059 *********
ok: [managed_node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:39:25 -0400 (0:00:00.444) 0:01:17.504 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_stat.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:39:25 -0400 (0:00:00.041) 0:01:17.546 *********
ok: [managed_node1] => { "changed": false, "path": "/etc/containers/systemd/quadlet-demo-mysql.container", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.456) 0:01:18.002 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_file_removed is changed", "skip_reason": "Conditional result was False" }
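In effect, the cleanup pass above stops and disables the quadlet-generated service (tolerating that it may not exist), deletes the quadlet file, and reloads systemd only when a file was actually removed; in this run every step found nothing left to do. A minimal standalone sketch of the same sequence, with illustrative task structure rather than the role's actual cleanup_quadlet_spec.yml:

    - name: Stop and disable the quadlet-generated service (may already be gone)
      ansible.builtin.systemd:
        name: quadlet-demo-mysql.service
        state: stopped
        enabled: false
      failed_when: false

    - name: Remove the quadlet unit file
      ansible.builtin.file:
        path: /etc/containers/systemd/quadlet-demo-mysql.container
        state: absent
      register: __file_removed

    - name: Refresh systemd so the generated unit disappears
      ansible.builtin.systemd:
        daemon_reload: true
      when: __file_removed is changed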
TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.044) 0:01:18.047 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.043) 0:01:18.090 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.058) 0:01:18.148 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.043) 0:01:18.192 *********
changed: [managed_node1] => { "changed": true, "cmd": [ "podman", "image", "prune", "--all", "-f" ], "delta": "0:00:00.041076", "end": "2024-07-27 12:39:26.756771", "rc": 0, "start": "2024-07-27 12:39:26.715695" }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.490) 0:01:18.682 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.114) 0:01:18.797 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:39:26 -0400 (0:00:00.041) 0:01:18.838 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:39:27 -0400 (0:00:00.041) 0:01:18.880 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137
Saturday 27 July 2024 12:39:27 -0400 (0:00:00.040) 0:01:18.920 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "images", "-n" ], "delta": "0:00:00.038858", "end": "2024-07-27 12:39:27.487998", "rc": 0, "start": "2024-07-27 12:39:27.449140" }
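The image prune above and the listings that follow are plain podman CLI calls driven through Ansible's command module; a minimal sketch of that pattern (task names and register variables are illustrative, not the role's actual tasks):

    - name: Prune images no longer in use
      ansible.builtin.command: podman image prune --all -f

    - name: List remaining images without headers
      ansible.builtin.command: podman images -n
      register: __podman_images_out
      changed_when: false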
TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146
Saturday 27 July 2024 12:39:27 -0400 (0:00:00.492) 0:01:19.413 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "volume", "ls", "-n" ], "delta": "0:00:00.038268", "end": "2024-07-27 12:39:27.979782", "rc": 0, "start": "2024-07-27 12:39:27.941514" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155
Saturday 27 July 2024 12:39:28 -0400 (0:00:00.491) 0:01:19.905 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "--noheading" ], "delta": "0:00:00.040446", "end": "2024-07-27 12:39:28.475213", "rc": 0, "start": "2024-07-27 12:39:28.434767" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164
Saturday 27 July 2024 12:39:28 -0400 (0:00:00.497) 0:01:20.402 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "network", "ls", "-n", "-q" ], "delta": "0:00:00.040950", "end": "2024-07-27 12:39:28.978533", "rc": 0, "start": "2024-07-27 12:39:28.937583" }
STDOUT:
podman
podman-default-kube-network

TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173
Saturday 27 July 2024 12:39:29 -0400 (0:00:00.502) 0:01:20.904 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183
Saturday 27 July 2024 12:39:29 -0400 (0:00:00.494) 0:01:21.399 *********
"certmonger.service", "source": "systemd", "state": "running", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.fedoraproject.FirewallD1.service": { "name": "dbus-org.fedoraproject.FirewallD1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd.service": { "name": "dhcpcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd@.service": { "name": "dhcpcd@.service", "source": "systemd", "state": "unknown", 
"status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "running", "status": "enabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ipset.service": { "name": "ipset.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "netavark-dhcp-proxy.service": { "name": "netavark-dhcp-proxy.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "netavark-firewalld-reload.service": { "name": "netavark-firewalld-reload.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "passim.service": { "name": "passim.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "podman-auto-update.service": { "name": "podman-auto-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-clean-transient.service": { "name": "podman-clean-transient.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-kube@.service": { "name": 
"podman-kube@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "podman-restart.service": { "name": "podman-restart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman.service": { "name": "podman.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": 
"sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-battery-check.service": { "name": "systemd-battery-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-bsod.service": { "name": "systemd-bsod.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-confext.service": { "name": "systemd-confext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume.service": { "name": "systemd-hibernate-resume.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrextend@.service": { "name": "systemd-pcrextend@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrlock-file-system.service": { "name": "systemd-pcrlock-file-system.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-code.service": { "name": "systemd-pcrlock-firmware-code.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-config.service": { "name": "systemd-pcrlock-firmware-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-machine-id.service": { "name": "systemd-pcrlock-machine-id.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-make-policy.service": { "name": "systemd-pcrlock-make-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-authority.service": { "name": "systemd-pcrlock-secureboot-authority.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-policy.service": { "name": "systemd-pcrlock-secureboot-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-soft-reboot.service": { "name": "systemd-soft-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-storagetm.service": { "name": "systemd-storagetm.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-sysext@.service": { "name": "systemd-sysext@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev-early.service": { "name": "systemd-tmpfiles-setup-dev-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup-early.service": { "name": "systemd-tpm2-setup-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup.service": { "name": "systemd-tpm2-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:39:32 -0400 (0:00:02.959) 0:01:24.359 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:39:32 -0400 (0:00:00.041) 0:01:24.400 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "quadlet-demo-mysql.volume", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Volume]", "__podman_quadlet_template_src": "" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:39:32 -0400 (0:00:00.056) 0:01:24.457 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:39:32 -0400 (0:00:00.052) 0:01:24.509 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_quadlet_file_src", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:39:32 -0400 (0:00:00.043) 0:01:24.553 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "quadlet-demo-mysql", "__podman_quadlet_type": "volume", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task 
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 27 July 2024 12:39:32 -0400 (0:00:00.110) 0:01:24.664 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:39:32 -0400 (0:00:00.083) 0:01:24.747 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:39:32 -0400 (0:00:00.048) 0:01:24.796 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:39:33 -0400 (0:00:00.047) 0:01:24.843 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:39:33 -0400 (0:00:00.056) 0:01:24.899 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:39:33 -0400 (0:00:00.454) 0:01:25.354 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:39:33 -0400 (0:00:00.056) 0:01:25.411 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }
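[NOTE]: the stat result above confirms /usr/bin/getsubids is installed, so for rootless users the role validates subordinate ID ranges with that tool rather than parsing /etc/subuid and /etc/subgid itself. Because __podman_user is root in this run, every subuid/subgid task that follows is skipped. For reference, the entries such a check consults have this form (illustrative values, not taken from this host):

    # /etc/subuid and /etc/subgid entries: <user>:<first_id>:<count>
    user1:100000:65536
    # getsubids reports the same range, e.g.:
    #   0: user1 100000 65536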
TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.452) 0:01:25.863 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.040) 0:01:25.903 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.040) 0:01:25.944 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.041) 0:01:25.985 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.040) 0:01:26.026 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.039) 0:01:26.066 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.040) 0:01:26.106 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.083) 0:01:26.189 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.041) 0:01:26.231 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "quadlet-demo-mysql-volume.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.069) 0:01:26.301 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.045) 0:01:26.347 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.040) 0:01:26.387 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo-mysql.volume", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.093) 0:01:26.481 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.049) 0:01:26.531 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.101) 0:01:26.632 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:39:34 -0400 (0:00:00.041) 0:01:26.674 *********
ok: [managed_node1] => { "changed": false, "failed_when_result": false }
MSG: Could not find the requested service quadlet-demo-mysql-volume.service: host
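[NOTE]: the "Could not find the requested service" message above is a tolerated condition, not a failure: the role asks systemd to stop and disable quadlet-demo-mysql-volume.service, the unit was never created on this host, and "failed_when_result": false shows the error was deliberately suppressed so that cleanup stays idempotent. A standalone equivalent might look roughly like the sketch below (the role's real task is more selective about which errors it ignores):

    - name: Stop and disable service
      ansible.builtin.systemd:
        name: quadlet-demo-mysql-volume.service
        state: stopped
        enabled: false
      # Tolerate a missing unit on hosts where the quadlet was never
      # deployed, so repeated cleanup runs still succeed.
      failed_when: false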
TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:39:35 -0400 (0:00:00.636) 0:01:27.310 *********
ok: [managed_node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:39:35 -0400 (0:00:00.443) 0:01:27.754 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_stat.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:39:35 -0400 (0:00:00.040) 0:01:27.794 *********
ok: [managed_node1] => { "changed": false, "path": "/etc/containers/systemd/quadlet-demo-mysql.volume", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:39:36 -0400 (0:00:00.454) 0:01:28.249 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_file_removed is changed", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:39:36 -0400 (0:00:00.043) 0:01:28.293 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:39:36 -0400 (0:00:00.082) 0:01:28.375 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:39:36 -0400 (0:00:00.056) 0:01:28.432 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:39:36 -0400 (0:00:00.042) 0:01:28.475 *********
changed: [managed_node1] => { "changed": true, "cmd": [ "podman", "image", "prune", "--all", "-f" ], "delta": "0:00:00.040713", "end": "2024-07-27 12:39:37.043250", "rc": 0, "start": "2024-07-27 12:39:37.002537" }
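[NOTE]: the prune recorded in "cmd" above removes every image that no container references: --all extends the prune beyond dangling (untagged) images, and -f runs it without a confirmation prompt. An equivalent ad hoc task would be:

    - name: Prune images no longer in use
      ansible.builtin.command: podman image prune --all -f
      # --all: remove all unused images, not only dangling ones
      # -f:    non-interactive; skip the confirmation prompt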
TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:39:37 -0400 (0:00:00.493) 0:01:28.968 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:39:37 -0400 (0:00:00.077) 0:01:29.045 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:39:37 -0400 (0:00:00.041) 0:01:29.087 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:39:37 -0400 (0:00:00.040) 0:01:29.128 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137
Saturday 27 July 2024 12:39:37 -0400 (0:00:00.039) 0:01:29.168 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "images", "-n" ], "delta": "0:00:00.040388", "end": "2024-07-27 12:39:37.733592", "rc": 0, "start": "2024-07-27 12:39:37.693204" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146
Saturday 27 July 2024 12:39:37 -0400 (0:00:00.492) 0:01:29.661 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "volume", "ls", "-n" ], "delta": "0:00:00.040289", "end": "2024-07-27 12:39:38.228822", "rc": 0, "start": "2024-07-27 12:39:38.188533" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155
Saturday 27 July 2024 12:39:38 -0400 (0:00:00.494) 0:01:30.155 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "--noheading" ], "delta": "0:00:00.040383", "end": "2024-07-27 12:39:38.722866", "rc": 0, "start": "2024-07-27 12:39:38.682483" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164
Saturday 27 July 2024 12:39:38 -0400 (0:00:00.494) 0:01:30.650 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "network", "ls", "-n", "-q" ], "delta": "0:00:00.041607", "end": "2024-07-27 12:39:39.223533", "rc": 0, "start": "2024-07-27 12:39:39.181926" }
STDOUT: podman podman-default-kube-network
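[NOTE]: the four listings above print no output except the network listing, which is consistent with the cleanup having left no images, volumes, or containers behind. The two remaining networks are podman's built-in default network and podman-default-kube-network (created for "podman kube play" workloads), so neither belongs to the demo app. For instance, the last check is equivalent to:

    - name: For testing and debugging - networks
      ansible.builtin.command: podman network ls -n -q
      # -n: suppress the heading row
      # -q: print only the network names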
TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173
Saturday 27 July 2024 12:39:39 -0400 (0:00:00.500) 0:01:31.150 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183
Saturday 27 July 2024 12:39:39 -0400 (0:00:00.492) 0:01:31.643 *********
ok: [managed_node1] => { "ansible_facts": { "services": { [service facts identical to the listing printed for the previous cleanup pass above; duplicate output elided] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114
Saturday 27 July 2024 12:39:42 -0400 (0:00:03.065) 0:01:34.708 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 27 July 2024 12:39:42 -0400 (0:00:00.040) 0:01:34.749 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "quadlet-demo.network", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "[Network]\nSubnet=192.168.30.0/24\nGateway=192.168.30.1\nLabel=app=wordpress", "__podman_quadlet_template_src": "" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 27 July 2024 12:39:42 -0400 (0:00:00.056) 0:01:34.805 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.051) 0:01:34.857 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_quadlet_file_src", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.041) 0:01:34.899 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "quadlet-demo", "__podman_quadlet_type": "network", "__podman_rootless": false }, "changed": false }
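[NOTE]: this pass removes quadlet-demo.network. Unlike the volume unit earlier, its payload (quoted in __podman_quadlet_str in "part 0" above) carries real settings; written to disk, the file reads as below, reconstructed verbatim from that string. The network name shown in the comment is inferred from quadlet's systemd-<unit name> default; the derived unit, quadlet-demo-network.service, appears in "part 3" further on.

    # /etc/containers/systemd/quadlet-demo.network
    # Quadlet generates quadlet-demo-network.service from this file;
    # the service creates a network (by default named
    # systemd-quadlet-demo) with the subnet, gateway, and label below.
    [Network]
    Subnet=192.168.30.0/24
    Gateway=192.168.30.1
    Label=app=wordpress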
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.060) 0:01:34.960 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.082) 0:01:35.043 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.047) 0:01:35.090 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.049) 0:01:35.139 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.058) 0:01:35.197 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.457) 0:01:35.654 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:39:43 -0400 (0:00:00.053) 0:01:35.708 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097784.1304564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "90e21e4e1feb2c9919e712a87df6caa0b481c909", "ctime": 1722097708.323023, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 203700, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1716422400.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2720902497", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.450) 0:01:36.158 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.082) 0:01:36.241 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.043) 0:01:36.284 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file]
TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.040) 0:01:36.325 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.041) 0:01:36.366 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.041) 0:01:36.408 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.040) 0:01:36.449 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.039) 0:01:36.489 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.040) 0:01:36.530 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "quadlet-demo-network.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.071) 0:01:36.601 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.047) 0:01:36.648 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }
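The service name derived in part 3 reflects the podman systemd generator's naming rule: a NAME.network quadlet produces NAME-network.service (likewise NAME.volume produces NAME-volume.service, while NAME.container produces plain NAME.service), which is why quadlet-demo.network maps to quadlet-demo-network.service here. A hedged Jinja sketch of that mapping (illustrative only, not the role's actual expression):

__podman_service_name: "{{ __podman_quadlet_name ~ ('-' ~ __podman_quadlet_type if __podman_quadlet_type in ['network', 'volume'] else '') ~ '.service' }}"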
TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.039) 0:01:36.688 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/quadlet-demo.network", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.094) 0:01:36.782 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:39:44 -0400 (0:00:00.050) 0:01:36.833 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:39:45 -0400 (0:00:00.097) 0:01:36.931 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:39:45 -0400 (0:00:00.082) 0:01:37.014 *********
ok: [managed_node1] => { "changed": false, "failed_when_result": false }

MSG:

Could not find the requested service quadlet-demo-network.service: host

TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:39:45 -0400 (0:00:00.636) 0:01:37.650 *********
ok: [managed_node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.440) 0:01:38.091 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_stat.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.040) 0:01:38.132 *********
ok: [managed_node1] => { "changed": false, "path": "/etc/containers/systemd/quadlet-demo.network", "state": "absent" }
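The cleanup sequence here is the usual quadlet teardown: stop and disable the generated service while tolerating "service not found" (note the failed_when_result: false above), remove the unit file, and then reload systemd only if a file was actually removed (the Refresh systemd task that follows skips for exactly that reason). A minimal standalone sketch of the same pattern, reusing the unit name and path from this run (hypothetical tasks, not the role's source in cleanup_quadlet_spec.yml):

- name: Stop and disable generated service
  ansible.builtin.systemd:
    name: quadlet-demo-network.service
    state: stopped
    enabled: false
  register: __svc_out
  # Tolerate the unit not existing, as in the result above
  failed_when:
    - __svc_out is failed
    - "'Could not find the requested service' not in __svc_out.msg | d('')"

- name: Remove quadlet file
  ansible.builtin.file:
    path: /etc/containers/systemd/quadlet-demo.network
    state: absent
  register: __file_out

- name: Refresh systemd
  ansible.builtin.systemd:
    daemon_reload: true
  when: __file_out is changed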
TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.456) 0:01:38.589 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_file_removed is changed", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.043) 0:01:38.632 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.042) 0:01:38.675 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.054) 0:01:38.729 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:39:46 -0400 (0:00:00.042) 0:01:38.772 *********
changed: [managed_node1] => { "changed": true, "cmd": [ "podman", "image", "prune", "--all", "-f" ], "delta": "0:00:00.040144", "end": "2024-07-27 12:39:47.340510", "rc": 0, "start": "2024-07-27 12:39:47.300366" }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:39:47 -0400 (0:00:00.494) 0:01:39.266 *********
included: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:39:47 -0400 (0:00:00.077) 0:01:39.344 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:39:47 -0400 (0:00:00.040) 0:01:39.384 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }
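These linger-management tasks (and the one that follows) skip because __podman_rootless is false; for a rootless user the role would enable lingering so that the user's systemd instance, and with it the quadlet-generated services, keeps running after logout. The underlying mechanism is loginctl; a hedged sketch of the enable step (hypothetical task, the real logic lives in manage_linger.yml):

- name: Enable linger for a rootless podman user (sketch)
  ansible.builtin.command: loginctl enable-linger {{ __podman_user }}
  args:
    # systemd records lingering users as files under this directory
    creates: /var/lib/systemd/linger/{{ __podman_user }}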
"__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - images] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137 Saturday 27 July 2024 12:39:47 -0400 (0:00:00.085) 0:01:39.511 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "images", "-n" ], "delta": "0:00:00.040533", "end": "2024-07-27 12:39:48.082798", "rc": 0, "start": "2024-07-27 12:39:48.042265" } TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146 Saturday 27 July 2024 12:39:48 -0400 (0:00:00.497) 0:01:40.009 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "volume", "ls", "-n" ], "delta": "0:00:00.040699", "end": "2024-07-27 12:39:48.580936", "rc": 0, "start": "2024-07-27 12:39:48.540237" } TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155 Saturday 27 July 2024 12:39:48 -0400 (0:00:00.499) 0:01:40.508 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "ps", "--noheading" ], "delta": "0:00:00.040465", "end": "2024-07-27 12:39:49.079320", "rc": 0, "start": "2024-07-27 12:39:49.038855" } TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164 Saturday 27 July 2024 12:39:49 -0400 (0:00:00.498) 0:01:41.006 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "network", "ls", "-n", "-q" ], "delta": "0:00:00.039373", "end": "2024-07-27 12:39:49.577083", "rc": 0, "start": "2024-07-27 12:39:49.537710" } STDOUT: podman podman-default-kube-network TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173 Saturday 27 July 2024 12:39:49 -0400 (0:00:00.496) 0:01:41.503 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : For testing and debugging - services] *** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 Saturday 27 July 2024 12:39:50 -0400 (0:00:00.490) 0:01:41.993 ********* ok: [managed_node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "audit-rules.service": { "name": "audit-rules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "certmonger.service": { "name": "certmonger.service", "source": "systemd", "state": "running", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.fedoraproject.FirewallD1.service": { "name": "dbus-org.fedoraproject.FirewallD1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd.service": { "name": "dhcpcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd@.service": { "name": "dhcpcd@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "running", "status": "enabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", 
"state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "ip6tables.service": { "name": "ip6tables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ipset.service": { "name": "ipset.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iptables.service": { "name": "iptables.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", 
"source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "netavark-dhcp-proxy.service": { "name": "netavark-dhcp-proxy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "netavark-firewalld-reload.service": { "name": "netavark-firewalld-reload.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "passim.service": { "name": "passim.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": 
{ "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "podman-auto-update.service": { "name": "podman-auto-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-clean-transient.service": { "name": "podman-clean-transient.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman-kube@.service": { "name": "podman-kube@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "podman-restart.service": { "name": "podman-restart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "podman.service": { "name": "podman.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": 
"unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-battery-check.service": { "name": "systemd-battery-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-bsod.service": { "name": "systemd-bsod.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-confext.service": { "name": "systemd-confext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": 
"systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume.service": { "name": "systemd-hibernate-resume.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-pcrextend@.service": { "name": "systemd-pcrextend@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrlock-file-system.service": { "name": "systemd-pcrlock-file-system.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-code.service": { "name": "systemd-pcrlock-firmware-code.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-config.service": { "name": "systemd-pcrlock-firmware-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-machine-id.service": { "name": "systemd-pcrlock-machine-id.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-make-policy.service": { "name": "systemd-pcrlock-make-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-authority.service": { "name": "systemd-pcrlock-secureboot-authority.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-policy.service": { "name": "systemd-pcrlock-secureboot-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-soft-reboot.service": { "name": "systemd-soft-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-storagetm.service": { "name": "systemd-storagetm.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-sysext@.service": { "name": "systemd-sysext@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev-early.service": { "name": "systemd-tmpfiles-setup-dev-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup-early.service": { "name": "systemd-tpm2-setup-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup.service": { "name": "systemd-tpm2-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:39:53 -0400 (0:00:02.993) 0:01:44.987 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Cancel linger] ************************ task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143 Saturday 27 July 2024 12:39:53 -0400 (0:00:00.040) 0:01:45.027 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Handle credential files - absent] ***** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149 Saturday 27 July 2024 12:39:53 -0400 (0:00:00.036) 0:01:45.064 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:39:53 -0400 (0:00:00.037) 0:01:45.102 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [Ensure no resources] ***************************************************** task path: /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:170 Saturday 27 July 
2024 12:39:53 -0400 (0:00:00.062) 0:01:45.164 ********* ok: [managed_node1] => { "changed": false } MSG: All assertions passed PLAY RECAP ********************************************************************* managed_node1 : ok=245 changed=14 unreachable=0 failed=1 skipped=236 rescued=1 ignored=0 Saturday 27 July 2024 12:39:53 -0400 (0:00:00.040) 0:01:45.204 ********* =============================================================================== fedora.linux_system_roles.certificate : Ensure provider packages are installed --- 4.72s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:23 fedora.linux_system_roles.certificate : Ensure certificate role dependencies are installed --- 3.40s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5 fedora.linux_system_roles.podman : For testing and debugging - services --- 3.21s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 fedora.linux_system_roles.podman : For testing and debugging - services --- 3.07s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 fedora.linux_system_roles.podman : For testing and debugging - services --- 3.00s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 fedora.linux_system_roles.podman : For testing and debugging - services --- 2.99s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 fedora.linux_system_roles.podman : For testing and debugging - services --- 2.99s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 fedora.linux_system_roles.podman : For testing and debugging - services --- 2.96s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 Gathering Facts --------------------------------------------------------- 2.59s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_demo.yml:3 fedora.linux_system_roles.firewall : Install firewalld ------------------ 1.55s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31 fedora.linux_system_roles.firewall : Install firewalld ------------------ 1.53s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31 fedora.linux_system_roles.certificate : Ensure provider service is running --- 1.38s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:90 fedora.linux_system_roles.certificate : Slurp the contents of the files --- 1.34s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:152 fedora.linux_system_roles.podman : Gather the package facts ------------- 1.32s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 fedora.linux_system_roles.certificate : Remove files -------------------- 1.29s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:181 fedora.linux_system_roles.firewall : Configure firewall ----------------- 1.27s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71 fedora.linux_system_roles.firewall : Configure firewall ----------------- 
1.11s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71 fedora.linux_system_roles.podman : Gather the package facts ------------- 1.11s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 fedora.linux_system_roles.certificate : Ensure certificate requests ----- 1.00s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:101 fedora.linux_system_roles.firewall : Unmask firewalld service ----------- 0.68s /tmp/tmp.KuQPUKsjNP/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:22
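The per-task timestamps throughout this log and the sorted duration table above come from the profile_tasks callback the run loaded at startup (redirected to ansible.posix.profile_tasks). To reproduce the same output on another run, the callback can be enabled in ansible.cfg; a minimal sketch:

[defaults]
callbacks_enabled = ansible.posix.profile_tasks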