[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
ansible-playbook [core 2.17.2]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/tmp.UJtmxR54QA
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.4 (main, Jul 17 2024, 00:00:00) [GCC 11.4.1 20231218 (Red Hat 11.4.1-3)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
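The two startup notices in the header above each point at a configuration fix. A minimal ansible.cfg sketch that addresses both (an assumption-laden example, not part of this run: it presumes the collections really should live under the temp path shown, and that you prefer silencing deprecation warnings over waiting for the 2.19 removal):

```ini
; hypothetical ansible.cfg for this run (no config file was found; defaults were used)
[defaults]
; ini equivalent of the non-deprecated singular ANSIBLE_COLLECTIONS_PATH env var
collections_path = /tmp/tmp.UJtmxR54QA
; disables [DEPRECATION WARNING] lines, as the warning text itself suggests
deprecation_warnings = False
```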
PLAYBOOK: tests_quadlet_basic.yml ********************************************** 1 plays in /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml PLAY [Ensure that the role can manage quadlet specs] *************************** TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:3 Saturday 27 July 2024 12:36:29 -0400 (0:00:00.018) 0:00:00.018 ********* [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. ok: [managed_node1] TASK [Run role - do not pull images] ******************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:54 Saturday 27 July 2024 12:36:31 -0400 (0:00:01.078) 0:00:01.096 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.048) 0:00:01.145 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.028) 0:00:01.174 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | 
difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.021) 0:00:01.195 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.442) 0:00:01.638 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.025) 0:00:01.663 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.039) 0:00:01.703 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:32 -0400 (0:00:01.133) 0:00:02.837 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:36:32 -0400 (0:00:00.036) 0:00:02.874 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:32 -0400 (0:00:00.037) 0:00:02.912 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.027697", "end": "2024-07-27 12:36:33.240976", "rc": 0, "start": "2024-07-27 12:36:33.213279" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 
2024 12:36:33 -0400 (0:00:00.486) 0:00:03.398 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.034) 0:00:03.433 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.032) 0:00:03.465 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.038) 0:00:03.504 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.043) 0:00:03.548 ********* included: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.066) 0:00:03.614 ********* ok: [managed_node1] => { "ansible_facts": { "getent_passwd": { "root": [ "x", "0", "0", "root", "/root", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.459) 0:00:04.074 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.033) 0:00:04.107 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.062) 0:00:04.170 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.374) 
0:00:04.544 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.041) 0:00:04.586 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.377) 0:00:04.963 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.033) 0:00:04.996 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.030) 0:00:05.027 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.030) 0:00:05.057 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.029) 0:00:05.086 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.027) 0:00:05.114 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.028) 0:00:05.142 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.027) 0:00:05.170 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.028) 0:00:05.198 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.056) 0:00:05.254 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.063) 0:00:05.318 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.347 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.377 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.063) 0:00:05.441 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.470 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | 
length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.053) 0:00:05.524 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.065) 0:00:05.590 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.030) 0:00:05.620 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.650 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.068) 0:00:05.718 ********* 
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.748 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.030) 0:00:05.778 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.808 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.030) 0:00:05.838 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.868 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.898 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.029) 0:00:05.927 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.026) 0:00:05.954 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.025) 0:00:05.980 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman 
: Handle Kubernetes specifications] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.027) 0:00:06.007 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.026) 0:00:06.034 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log)) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.105) 0:00:06.140 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": { "Container": { "ContainerName": "nopull", "Image": "quay.io/libpod/testimage:20210610" }, "Install": { "WantedBy": "default.target" } }, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.042) 0:00:06.182 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": false, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] 
***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.038) 0:00:06.221 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_spec | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.030) 0:00:06.251 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "nopull", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.046) 0:00:06.298 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.061) 0:00:06.360 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.033) 0:00:06.394 ********* skipping: [managed_node1] => 
{ "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.033) 0:00:06.427 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.041) 0:00:06.469 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.372) 0:00:06.841 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.041) 0:00:06.883 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": 
false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:37 -0400 (0:00:00.373) 0:00:07.257 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:37 -0400 (0:00:00.030) 0:00:07.288 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:37 -0400 (0:00:00.029) 0:00:07.318 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:37 -0400 (0:00:00.028) 0:00:07.346 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.028) 0:00:07.375 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.028) 0:00:07.404 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.027) 0:00:07.432 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.055) 0:00:07.487 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path:
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.029) 0:00:07.517 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": false, "__podman_images_found": [ "quay.io/libpod/testimage:20210610" ], "__podman_kube_yamls_raw": "", "__podman_service_name": "nopull.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.052) 0:00:07.570 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.032) 0:00:07.603 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_kube_yamls_raw | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.031) 0:00:07.634 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [ "quay.io/libpod/testimage:20210610" ], "__podman_quadlet_file": "/etc/containers/systemd/nopull.container", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path:
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.076) 0:00:07.711 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.038) 0:00:07.749 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state == \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.028) 0:00:07.778 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.072) 0:00:07.850 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.053) 0:00:07.903 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional
result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.029) 0:00:07.933 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.028) 0:00:07.961 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Create host directories] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.029) 0:00:07.990 *********
skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.podman : Ensure container images are present] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.026) 0:00:08.017 *********
skipping: [managed_node1] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path:
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.035) 0:00:08.052 *********
ok: [managed_node1] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/systemd", "secontext": "system_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.478) 0:00:08.531 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_file_src | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.032) 0:00:08.563 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_str | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.030) 0:00:08.593 *********
changed: [managed_node1] => { "changed": true, "checksum": "670d64fc68a9768edb20cad26df2acc703542d85", "dest": "/etc/containers/systemd/nopull.container", "gid": 0, "group": "root", "md5sum": "cedb6667f6cd1b033fe06e2810fe6b19", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 151, "src":
"/root/.ansible/tmp/ansible-tmp-1722098198.5436368-12426-233458295195767/.source.container", "state": "file", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.885) 0:00:09.478 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Start service] ************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.035) 0:00:09.514 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Restart service] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.034) 0:00:09.549 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Cancel linger] ************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.036) 0:00:09.586 *********
skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.podman : Handle credential files - absent] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149
Saturday
27 July 2024 12:36:39 -0400 (0:00:00.026) 0:00:09.613 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.027) 0:00:09.640 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [Verify image not pulled] *************************************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:70
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.044) 0:00:09.684 *********
ok: [managed_node1] => { "changed": false }

MSG:

All assertions passed

TASK [Run role - try to pull bogus image] **************************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:74
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.032) 0:00:09.717 *********
included: fedora.linux_system_roles.podman for managed_node1

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.090) 0:00:09.807 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.056) 0:00:09.863 *********
skipping: [managed_node1]
=> { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.037) 0:00:09.900 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.029) 0:00:09.930 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.030) 0:00:09.960 *********
ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" }
skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason":
"Conditional result was False" }
skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Gather the package facts] *************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.063) 0:00:10.024 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Enable copr if requested] *************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10
Saturday 27 July 2024 12:36:40 -0400 (0:00:00.984) 0:00:11.008 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Saturday 27 July 2024 12:36:40 -0400 (0:00:00.031) 0:00:11.040 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get podman version] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22
Saturday 27 July 2024 12:36:40 -0400 (0:00:00.035) 0:00:11.075 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.028478", "end": "2024-07-27 12:36:41.309822", "rc": 0, "start": "2024-07-27 12:36:41.281344" }

STDOUT:

podman version 5.1.2

TASK
[fedora.linux_system_roles.podman : Set podman version] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.390) 0:00:11.465 *********
ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.034) 0:00:11.500 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.029) 0:00:11.529 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.032) 0:00:11.561 *********
META: end_host conditional evaluated to False, continuing execution for managed_node1
skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" }

MSG:

end_host conditional evaluated to false, continuing execution for managed_node1

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path:
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.037) 0:00:11.599 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.066) 0:00:11.666 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.034) 0:00:11.700 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.033) 0:00:11.733 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:41 -0400 (0:00:00.042) 0:00:11.776 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK
[fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.372) 0:00:12.148 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.042) 0:00:12.191 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.376) 0:00:12.568 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check
group with getsubids] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.033) 0:00:12.601 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.062) 0:00:12.664 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.031) 0:00:12.695 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.029) 0:00:12.725 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.028) 0:00:12.753 *********
skipping: [managed_node1] => {
"changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.029) 0:00:12.782 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.028) 0:00:12.811 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.029) 0:00:12.840 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.039) 0:00:12.879 *********
included:
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.064) 0:00:12.944 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.030) 0:00:12.975 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.031) 0:00:13.006 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 27 July 2024 12:36:42 -0400 (0:00:00.064) 0:00:13.071 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path:
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.029) 0:00:13.100 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.131 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.065) 0:00:13.197 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.228 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.029) 0:00:13.258 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for
managed_node1

TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.097) 0:00:13.355 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.031) 0:00:13.387 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.418 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.448 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.479 *********
skipping:
[managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.029) 0:00:13.509 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.540 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.029) 0:00:13.570 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.027) 0:00:13.597 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:36:43 -0400 
(0:00:00.027) 0:00:13.624 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.026) 0:00:13.651 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.027) 0:00:13.678 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))

TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.080) 0:00:13.759 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": { "Container": { "ContainerName": "bogus", "Image": "this_is_a_bogus_image" }, "Install": { "WantedBy": "default.target" } }, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 27 July 2024 12:36:43 -0400 (0:00:00.041) 0:00:13.800 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": 
true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.039) 0:00:13.840 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_spec | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.030) 0:00:13.871 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "bogus", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.047) 0:00:13.918 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.091) 0:00:14.010 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** 
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.035) 0:00:14.045 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:43 -0400 (0:00:00.033) 0:00:14.079 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.042) 0:00:14.122 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.373) 0:00:14.495 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.042) 0:00:14.537 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 
1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.368) 0:00:14.906 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.030) 0:00:14.936 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.029) 0:00:14.966 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] 
********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.028) 0:00:14.994 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.029) 0:00:15.024 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.028) 0:00:15.053 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:36:44 -0400 (0:00:00.029) 0:00:15.082 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.029) 0:00:15.112 ********* skipping: [managed_node1] => { "changed": 
false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.028) 0:00:15.140 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": false, "__podman_images_found": [ "this_is_a_bogus_image" ], "__podman_kube_yamls_raw": "", "__podman_service_name": "bogus.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.053) 0:00:15.194 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.032) 0:00:15.227 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_kube_yamls_raw | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.032) 0:00:15.259 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [ "this_is_a_bogus_image" ], "__podman_quadlet_file": 
"/etc/containers/systemd/bogus.container", "__podman_volumes": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.077) 0:00:15.336 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Cleanup quadlets] ********************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.069) 0:00:15.406 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state == \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.030) 0:00:15.436 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.070) 0:00:15.507 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Enable linger if needed] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12 Saturday 27 July 2024 
12:36:45 -0400 (0:00:00.053) 0:00:15.560 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.029) 0:00:15.589 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.030) 0:00:15.620 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create host directories] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.029) 0:00:15.649 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Ensure container images are present] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18 Saturday 27 July 2024 12:36:45 -0400 (0:00:00.028) 0:00:15.677 ********* ok: [managed_node1] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 
"changed": false }

TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
Saturday 27 July 2024 12:36:46 -0400 (0:00:00.981) 0:00:16.658 *********
ok: [managed_node1] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/systemd", "secontext": "system_u:object_r:etc_t:s0", "size": 30, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:46 -0400 (0:00:00.393) 0:00:17.052 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_file_src | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58
Saturday 27 July 2024 12:36:46 -0400 (0:00:00.034) 0:00:17.086 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_str | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] *******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Saturday 27 July 2024 12:36:47 -0400 (0:00:00.031) 0:00:17.118 *********
changed: [managed_node1] => { "changed": true, "checksum": "1d087e679d135214e8ac9ccaf33b2222916efb7f", "dest": "/etc/containers/systemd/bogus.container", "gid": 0, "group": "root", "md5sum": "97480a9a73734d9f8007d2c06e7fed1f", "mode": "0644", "owner": "root", "secontext": 
"system_u:object_r:etc_t:s0", "size": 138, "src": "/root/.ansible/tmp/ansible-tmp-1722098207.0700624-12590-149017880719930/.source.container", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.podman : Reload systemctl] ********************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.704) 0:00:17.823 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Start service] ************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.033) 0:00:17.857 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Restart service] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.035) 0:00:17.892 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Cancel linger] ************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.036) 0:00:17.929 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Handle credential files - absent] ***** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.027) 0:00:17.956 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.027) 0:00:17.983 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [Verify image not pulled and no error] ************************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:90 Saturday 27 July 2024 12:36:47 -0400 (0:00:00.074) 0:00:18.058 ********* ok: [managed_node1] => { "changed": false } MSG: All assertions passed TASK [Cleanup] ***************************************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:97 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.036) 0:00:18.094 ********* included: fedora.linux_system_roles.podman for managed_node1 => (item=nopull) included: fedora.linux_system_roles.podman for managed_node1 => (item=bogus) TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.157) 0:00:18.251 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task 
path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.058) 0:00:18.310 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.037) 0:00:18.347 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.031) 0:00:18.379 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.030) 0:00:18.409 ********* [WARNING]: TASK: fedora.linux_system_roles.podman : Set platform/version specific variables: The loop variable 'item' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior. 
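[editor's note] The WARNING above points at a real fix: give the inner loop its own variable name via `loop_control`. A minimal sketch, assuming the role's vars-file include looks roughly like the task below — the task name, the vars file list, and the `__vars_file is file` condition appear in this log; the variable name `varfile` and the path expression are illustrative, not the role's actual code:

```yaml
# Hypothetical shape of the vars-file include, with loop_control.loop_var
# set so the inner loop no longer collides with the enclosing 'item'.
- name: Set platform/version specific variables
  ansible.builtin.include_vars:
    file: "{{ __vars_file }}"
  vars:
    __vars_file: "{{ role_path }}/vars/{{ varfile }}"
  loop:
    - RedHat.yml
    - CentOS.yml
    - CentOS_9.yml
  loop_control:
    loop_var: varfile   # avoids reusing 'item' from the outer loop
  when: __vars_file is file
```

With `loop_var` set, Ansible no longer emits the "loop variable 'item' is already in use" warning when this task runs inside another loop.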
ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:36:48 -0400 (0:00:00.066) 0:00:18.476 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:49 -0400 (0:00:00.922) 0:00:19.398 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 
12:36:49 -0400 (0:00:00.032) 0:00:19.430 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:49 -0400 (0:00:00.036) 0:00:19.466 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.027671", "end": "2024-07-27 12:36:49.701486", "rc": 0, "start": "2024-07-27 12:36:49.673815" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:36:49 -0400 (0:00:00.425) 0:00:19.891 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:49 -0400 (0:00:00.034) 0:00:19.925 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:49 -0400 (0:00:00.029) 0:00:19.955 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman 
package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49
Saturday 27 July 2024 12:36:49 -0400 (0:00:00.033) 0:00:19.989 *********
META: end_host conditional evaluated to False, continuing execution for managed_node1
skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" }
MSG: end_host conditional evaluated to false, continuing execution for managed_node1

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 27 July 2024 12:36:49 -0400 (0:00:00.038) 0:00:20.027 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.068) 0:00:20.095 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.035) 0:00:20.131 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.034) 0:00:20.166 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.042) 0:00:20.208 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.374) 0:00:20.582 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.040) 0:00:20.623 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.373) 0:00:20.996 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.032) 0:00:21.029 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.031) 0:00:21.061 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:50 -0400 (0:00:00.030) 0:00:21.092 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.122 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.152 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.028) 0:00:21.181 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.061) 0:00:21.243 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.273 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.041) 0:00:21.315 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.063) 0:00:21.378 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.031) 0:00:21.410 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.440 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.065) 0:00:21.506 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.031) 0:00:21.537 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.567 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.066) 0:00:21.634 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.031) 0:00:21.665 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.031) 0:00:21.696 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.068) 0:00:21.765 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.032) 0:00:21.797 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.827 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.064) 0:00:21.891 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.032) 0:00:21.923 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" }

TASK [Manage selinux for specified ports] **************************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.031) 0:00:21.955 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.030) 0:00:21.985 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.031) 0:00:22.017 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.027) 0:00:22.044 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122
Saturday 27 July 2024 12:36:51 -0400 (0:00:00.028) 0:00:22.073 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.027) 0:00:22.100 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.027) 0:00:22.128 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))

TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.081) 0:00:22.210 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.042) 0:00:22.252 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.040) 0:00:22.293 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.033) 0:00:22.326 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "nopull", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false }

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.048) 0:00:22.375 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.062) 0:00:22.438 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.033) 0:00:22.471 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.035) 0:00:22.506 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.078) 0:00:22.585 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.372) 0:00:22.957 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:52 -0400 (0:00:00.043) 0:00:23.001 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.373) 0:00:23.374 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.031) 0:00:23.405 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.435 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.465 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.494 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.524 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.553 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.583 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.613 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "nopull.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.054) 0:00:23.667 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.033) 0:00:23.701 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.730 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/nopull.container", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.079) 0:00:23.809 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.038) 0:00:23.848 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.079) 0:00:23.928 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:36:53 -0400 (0:00:00.029) 0:00:23.957 *********
ok: [managed_node1] => { "changed": false, "failed_when_result": false }
MSG: Could not find the requested service nopull.service: host

TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:36:54 -0400 (0:00:00.826) 0:00:24.784 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722098199.2850761, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "670d64fc68a9768edb20cad26df2acc703542d85", "ctime": 1722098199.289076, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 186646731, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1722098198.8770792, "nlink": 1, "path": "/etc/containers/systemd/nopull.container", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 151, "uid": 0, "version": "4006430064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:36:55 -0400 (0:00:00.378) 0:00:25.162 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 27 July 2024 12:36:55 -0400 (0:00:00.063) 0:00:25.226 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 27 July 2024 12:36:55 -0400 (0:00:00.454) 0:00:25.681 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 27 July 2024 12:36:55 -0400 (0:00:00.052) 0:00:25.733 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 27 July 2024 12:36:55 -0400 (0:00:00.031) 0:00:25.765 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_raw": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:36:55 -0400 (0:00:00.032) 0:00:25.797 *********
changed: [managed_node1] => { "changed": true, "path": "/etc/containers/systemd/nopull.container", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:56 -0400 (0:00:00.371) 0:00:26.168 *********
ok: [managed_node1] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:36:56 -0400 (0:00:00.748) 0:00:26.917 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.432) 0:00:27.350 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.051) 0:00:27.401 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.033) 0:00:27.434 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_prune_images | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.031) 0:00:27.466 *********
included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.058) 0:00:27.525 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.030) 0:00:27.555 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.029) 0:00:27.585 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.029) 0:00:27.615 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.032) 0:00:27.647 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.067) 0:00:27.715 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.035) 0:00:27.750 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173
Saturday 27 July 2024 12:36:57 -0400 (0:00:00.032) 0:00:27.783 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183
Saturday 27 
July 2024 12:36:57 -0400 (0:00:00.033) 0:00:27.817 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.033) 0:00:27.850 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Cancel linger] ************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.029) 0:00:27.880 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Handle credential files - absent] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.027) 0:00:27.907 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.027) 0:00:27.935 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.044) 0:00:27.979 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.057) 0:00:28.037 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:36:57 -0400 (0:00:00.037) 0:00:28.075 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:36:58 -0400 (0:00:00.030) 0:00:28.105 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:36:58 -0400 (0:00:00.031) 0:00:28.137 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, 
"ansible_included_var_files": [ "/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:36:58 -0400 (0:00:00.064) 0:00:28.202 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.958) 0:00:29.160 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.067) 0:00:29.228 ********* skipping: [managed_node1] => { "changed": false, "false_condition": 
"(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.036) 0:00:29.264 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.028477", "end": "2024-07-27 12:36:59.501021", "rc": 0, "start": "2024-07-27 12:36:59.472544" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.396) 0:00:29.660 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.041) 0:00:29.702 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.032) 0:00:29.735 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.036) 0:00:29.771 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.038) 0:00:29.810 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.068) 0:00:29.878 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.035) 0:00:29.914 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 
Saturday 27 July 2024 12:36:59 -0400 (0:00:00.034) 0:00:29.948 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:59 -0400 (0:00:00.043) 0:00:29.991 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.370) 0:00:30.362 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.041) 0:00:30.404 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.374) 0:00:30.778 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.032) 0:00:30.811 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.030) 0:00:30.841 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.031) 0:00:30.873 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.065) 0:00:30.938 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.030) 0:00:30.969 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.030) 0:00:31.000 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.029) 0:00:31.030 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:37:00 -0400 (0:00:00.030) 0:00:31.060 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": 
"/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.040) 0:00:31.101 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.064) 0:00:31.166 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.031) 0:00:31.197 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.031) 0:00:31.229 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK 
[fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.065) 0:00:31.295 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.032) 0:00:31.327 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.032) 0:00:31.360 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.068) 0:00:31.429 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.031) 0:00:31.460 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.030) 0:00:31.491 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.107) 0:00:31.598 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.033) 0:00:31.631 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.033) 0:00:31.665 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.033) 0:00:31.699 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.034) 0:00:31.733 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.033) 0:00:31.767 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.033) 0:00:31.801 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.032) 0:00:31.833 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : 
Handle credential files - present] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.028) 0:00:31.861 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.028) 0:00:31.890 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.028) 0:00:31.919 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.029) 0:00:31.948 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log)) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.082) 0:00:32.031 ********* ok: [managed_node1] => { "ansible_facts": { 
"__podman_quadlet_file_src": "", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:37:01 -0400 (0:00:00.042) 0:00:32.074 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.040) 0:00:32.114 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.035) 0:00:32.150 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "bogus", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.048) 0:00:32.199 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] 
***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.064) 0:00:32.263 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.071) 0:00:32.335 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.036) 0:00:32.371 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.043) 0:00:32.415 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.377) 0:00:32.792 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK 
[fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:37:02 -0400 (0:00:00.042) 0:00:32.834 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.377) 0:00:33.212 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.030) 0:00:33.243 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.030) 0:00:33.273 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.029) 0:00:33.303 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.030) 0:00:33.333 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.030) 0:00:33.364 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:37:03 -0400 
(0:00:00.029) 0:00:33.393 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.030) 0:00:33.423 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.029) 0:00:33.453 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "bogus.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.054) 0:00:33.507 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.033) 0:00:33.541 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.029) 0:00:33.571 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/bogus.container", "__podman_volumes": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.078) 0:00:33.649 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Cleanup quadlets] ********************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.039) 0:00:33.689 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4 Saturday 27 July 2024 12:37:03 -0400 (0:00:00.114) 0:00:33.804 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stop and disable service] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12 Saturday 27 July 2024 
12:37:03 -0400 (0:00:00.031) 0:00:33.836 ********* changed: [managed_node1] => { "changed": true, "enabled": false, "failed_when_result": false, "name": "bogus.service", "state": "stopped", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "sysinit.target system.slice basic.target -.mount systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "yes", "DelegateControllers": "cpu cpuacct cpuset io blkio memory devices pids bpf-firewall bpf-devices bpf-foreign bpf-socket-bind bpf-restrict-network-interfaces", "Description": "bogus.service", "DevicePolicy": "auto", "DynamicUser": "no", "Environment": "PODMAN_SYSTEMD_UNIT=bogus.service", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name=bogus --cidfile=/run/bogus.cid --replace --rm --cgroups=split --sdnotify=conmon -d this_is_a_bogus_image ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name=bogus --cidfile=/run/bogus.cid --replace --rm --cgroups=split --sdnotify=conmon -d this_is_a_bogus_image ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopPost": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; ignore_errors=yes ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopPostEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; flags=ignore-failure ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/bogus.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", 
"IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "bogus.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": 
"no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "bogus.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "all", "OOMPolicy": "continue", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "-.mount sysinit.target system.slice", "RequiresMountsFor": "/run/containers", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "SourcePath": "/etc/containers/systemd/bogus.container", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogIdentifier": "bogus", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22342", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "notify", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.podman : See if quadlet file exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33 Saturday 27 July 2024 12:37:04 -0400 (0:00:00.757) 0:00:34.594 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722098216.5879426, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1d087e679d135214e8ac9ccaf33b2222916efb7f", "ctime": 1722098207.6670115, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264241351, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1722098207.3960135, "nlink": 1, "path": "/etc/containers/systemd/bogus.container", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 138, "uid": 0, "version": "4226223103", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.podman : Parse quadlet file] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38 Saturday 27 July 2024 12:37:04 -0400 (0:00:00.381) 0:00:34.975 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Slurp quadlet file] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6 Saturday 27 July 2024 12:37:04 -0400 (0:00:00.059) 0:00:35.035 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Parse quadlet file] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12 Saturday 27 July 2024 12:37:05 -0400 (0:00:00.356) 0:00:35.392 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44 Saturday 27 July 2024 12:37:05 -0400 (0:00:00.051) 0:00:35.443 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Reset raw variable] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52 Saturday 27 
July 2024 12:37:05 -0400 (0:00:00.032) 0:00:35.476 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_raw": null }, "changed": false } TASK [fedora.linux_system_roles.podman : Remove quadlet file] ****************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42 Saturday 27 July 2024 12:37:05 -0400 (0:00:00.031) 0:00:35.507 ********* changed: [managed_node1] => { "changed": true, "path": "/etc/containers/systemd/bogus.container", "state": "absent" } TASK [fedora.linux_system_roles.podman : Refresh systemd] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48 Saturday 27 July 2024 12:37:05 -0400 (0:00:00.373) 0:00:35.881 ********* ok: [managed_node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.podman : Remove managed resource] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58 Saturday 27 July 2024 12:37:06 -0400 (0:00:00.760) 0:00:36.642 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Remove volumes] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.469) 0:00:37.111 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112 Saturday 27 July 2024 
12:37:07 -0400 (0:00:00.047) 0:00:37.159 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false } TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.032) 0:00:37.191 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_prune_images | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.032) 0:00:37.224 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Enable linger if needed] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.057) 0:00:37.282 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.030) 0:00:37.312 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.067) 0:00:37.380 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - images] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.032) 0:00:37.412 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.033) 0:00:37.445 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.034) 0:00:37.480 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.033) 0:00:37.513 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.034) 0:00:37.548 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : For testing and debugging - services] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.033) 0:00:37.581 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.033) 0:00:37.615 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Cancel linger] ************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.029) 0:00:37.645 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Handle credential files - absent] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.028) 0:00:37.673 ********* skipping: [managed_node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.028) 0:00:37.701 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [Create user for testing] ************************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:109 Saturday 27 July 2024 12:37:07 -0400 (0:00:00.044) 0:00:37.745 ********* changed: [managed_node1] => { "changed": true, "comment": "", "create_home": true, "group": 1111, "home": "/home/user_quadlet_basic", "name": "user_quadlet_basic", "shell": "/bin/bash", "state": "present", "system": false, "uid": 1111 } TASK [Get local machine ID] **************************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:122 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.701) 0:00:38.447 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" } TASK [Skip test if cannot reboot] ********************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:128 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.041) 0:00:38.489 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" 
} MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [Enable cgroup controllers] *********************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:134 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.042) 0:00:38.531 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" } TASK [Configure cgroups in kernel] ********************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:166 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.039) 0:00:38.571 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:172 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.038) 0:00:38.609 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" } TASK [Run the role - user] ***************************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:175 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.078) 0:00:38.687 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 
2024 12:37:08 -0400 (0:00:00.082) 0:00:38.769 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.055) 0:00:38.825 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.037) 0:00:38.863 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.030) 0:00:38.894 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.031) 0:00:38.925 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ 
"/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:37:08 -0400 (0:00:00.065) 0:00:38.991 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:37:09 -0400 (0:00:00.915) 0:00:39.907 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:37:09 -0400 (0:00:00.031) 0:00:39.938 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | 
difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:37:09 -0400 (0:00:00.035) 0:00:39.973 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.029699", "end": "2024-07-27 12:37:10.209218", "rc": 0, "start": "2024-07-27 12:37:10.179519" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.391) 0:00:40.365 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.034) 0:00:40.399 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.030) 0:00:40.430 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.039) 0:00:40.469 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.081) 0:00:40.551 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.067) 0:00:40.619 ********* ok: [managed_node1] => { "ansible_facts": { "getent_passwd": { "user_quadlet_basic": [ "x", "1111", "1111", "", "/home/user_quadlet_basic", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.377) 0:00:40.996 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:37:10 
-0400 (0:00:00.036) 0:00:41.033 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "1111" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:37:10 -0400 (0:00:00.042) 0:00:41.075 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "user_quadlet_basic": [ "x", "1111", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:37:11 -0400 (0:00:00.374) 0:00:41.450 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "user_quadlet_basic" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:37:11 -0400 (0:00:00.041) 0:00:41.491 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:37:11 -0400 (0:00:00.373) 0:00:41.864 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "getsubids", "user_quadlet_basic" ], "delta": "0:00:00.003919", "end": "2024-07-27 12:37:12.083366", "rc": 0, "start": "2024-07-27 12:37:12.079447" } STDOUT: 0: user_quadlet_basic 231072 65536 TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.375) 0:00:42.240 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "getsubids", "-g", "user_quadlet_basic" ], "delta": "0:00:00.004115", "end": "2024-07-27 12:37:12.461843", "rc": 0, "start": "2024-07-27 12:37:12.457728" } STDOUT: 0: user_quadlet_basic 231072 65536 TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.383) 0:00:42.623 ********* ok: [managed_node1] => { "ansible_facts": { "podman_subgid_info": { "user_quadlet_basic": { "range": 65536, "start": 231072 } }, "podman_subuid_info": { "user_quadlet_basic": { "range": 65536, "start": 231072 } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.055) 0:00:42.679 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not 
__podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.032) 0:00:42.712 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.031) 0:00:42.743 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.031) 0:00:42.775 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.030) 0:00:42.805 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.028) 0:00:42.834 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/root/.config/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/root/.config/containers/policy.json", "__podman_registries_conf_file": "/root/.config/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/root/.config/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.040) 0:00:42.875 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.098) 0:00:42.974 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.032) 0:00:43.007 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:37:12 -0400 (0:00:00.031) 0:00:43.038 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.062) 0:00:43.101 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.133 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.164 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.064) 0:00:43.229 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.260 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.292 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.066) 0:00:43.359 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.032) 0:00:43.391 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.422 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.032) 0:00:43.454 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.486 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.031) 0:00:43.517 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.030) 0:00:43.548 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:37:13 -0400 
(0:00:00.069) 0:00:43.617 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.028) 0:00:43.646 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.028) 0:00:43.675 ********* fatal: [managed_node1]: FAILED! => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result" } TASK [Debug3] ****************************************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:253 Saturday 27 July 2024 12:37:13 -0400 (0:00:00.034) 0:00:43.709 ********* fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "set -x\nset -o pipefail\nexec 1>&2\n#podman volume rm --all\n#podman network prune -f\npodman volume ls\npodman network ls\npodman secret ls\npodman container ls\npodman pod ls\npodman images\nsystemctl list-units | grep quadlet\n", "delta": "0:00:00.206847", "end": "2024-07-27 12:37:14.121913", "rc": 1, "start": "2024-07-27 12:37:13.915066" } STDERR: + set -o pipefail + exec + podman volume ls + podman network ls NETWORK ID NAME DRIVER 2f259bab93aa podman bridge 7f20b6d41069 podman-default-kube-network bridge + podman secret ls ID NAME DRIVER CREATED UPDATED + podman container ls CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES + podman pod ls POD ID NAME STATUS CREATED INFRA ID # OF CONTAINERS + podman images REPOSITORY TAG IMAGE ID CREATED SIZE localhost/podman-pause 5.1.2-1720678294 cdd133fcb081 6 minutes ago 814 kB quay.io/libpod/testimage 20210610 9f9ec7f2fdef 3 years ago 7.99 MB + systemctl list-units + grep quadlet MSG: non-zero return code TASK [Cleanup user] ************************************************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:282 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.574) 0:00:44.283 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.083) 0:00:44.367 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.059) 0:00:44.426 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.037) 0:00:44.463 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.031) 0:00:44.495 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.031) 0:00:44.527 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": 
"CentOS_9.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node1] => (item=CentOS_9.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS_9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:37:14 -0400 (0:00:00.065) 0:00:44.592 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:37:15 -0400 (0:00:00.924) 0:00:45.517 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:37:15 -0400 (0:00:00.033) 0:00:45.551 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:37:15 -0400 (0:00:00.035) 0:00:45.586 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.027970", "end": "2024-07-27 12:37:15.821947", "rc": 0, "start": "2024-07-27 12:37:15.793977" } 
STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:37:15 -0400 (0:00:00.440) 0:00:46.027 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:37:15 -0400 (0:00:00.037) 0:00:46.064 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.030) 0:00:46.094 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.042) 0:00:46.137 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: 
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.045) 0:00:46.183 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.067) 0:00:46.250 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.036) 0:00:46.287 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.034) 0:00:46.321 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "1111" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.044) 0:00:46.365 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "user_quadlet_basic": [ "x", "1111", "" ] } }, "changed": 
false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.378) 0:00:46.744 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "user_quadlet_basic" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:37:16 -0400 (0:00:00.044) 0:00:46.788 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097764.7689178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "fc3588658699d4eb56c9a9cdf6678e83337ea7ee", "ctime": 1722097725.4462144, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 6544612, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719992328.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15680, "uid": 0, "version": "648322932", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:37:17 -0400 (0:00:00.375) 0:00:47.164 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "getsubids", "user_quadlet_basic" ], "delta": "0:00:00.004165", "end": "2024-07-27 12:37:17.385837", "rc": 0, "start": 
"2024-07-27 12:37:17.381672" } STDOUT: 0: user_quadlet_basic 231072 65536 TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:37:17 -0400 (0:00:00.379) 0:00:47.543 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "getsubids", "-g", "user_quadlet_basic" ], "delta": "0:00:00.003957", "end": "2024-07-27 12:37:17.764053", "rc": 0, "start": "2024-07-27 12:37:17.760096" } STDOUT: 0: user_quadlet_basic 231072 65536 TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:37:17 -0400 (0:00:00.379) 0:00:47.923 ********* ok: [managed_node1] => { "ansible_facts": { "podman_subgid_info": { "user_quadlet_basic": { "range": 65536, "start": 231072 } }, "podman_subuid_info": { "user_quadlet_basic": { "range": 65536, "start": 231072 } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:37:17 -0400 (0:00:00.052) 0:00:47.975 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:37:17 -0400 (0:00:00.031) 0:00:48.006 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:37:17 -0400 (0:00:00.029) 0:00:48.036 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.069) 0:00:48.105 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.031) 0:00:48.136 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.030) 0:00:48.167 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/root/.config/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/root/.config/containers/policy.json", "__podman_registries_conf_file": "/root/.config/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/root/.config/containers/storage.conf" }, "changed": false } TASK 
[fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.041) 0:00:48.209 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.064) 0:00:48.274 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.031) 0:00:48.305 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.032) 0:00:48.337 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.066) 0:00:48.404 ********* skipping: [managed_node1] => { "changed": 
false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.032) 0:00:48.436 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.032) 0:00:48.468 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.067) 0:00:48.535 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.032) 0:00:48.567 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 
Saturday 27 July 2024 12:37:18 -0400 (0:00:00.030) 0:00:48.598 ********* included: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.070) 0:00:48.669 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.031) 0:00:48.700 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.071) 0:00:48.772 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.033) 0:00:48.805 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* 
task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.033) 0:00:48.839 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.032) 0:00:48.872 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.033) 0:00:48.906 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.031) 0:00:48.938 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.028) 0:00:48.967 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman 
: Handle secrets] *********************** task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.028) 0:00:48.995 ********* fatal: [managed_node1]: FAILED! => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result" } TASK [Dump journal] ************************************************************ task path: /tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:319 Saturday 27 July 2024 12:37:18 -0400 (0:00:00.037) 0:00:49.032 ********* fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": [ "journalctl", "-ex" ], "delta": "0:00:00.042890", "end": "2024-07-27 12:37:19.281897", "failed_when_result": true, "rc": 0, "start": "2024-07-27 12:37:19.239007" } STDOUT: Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /home/podman_basic_user/.local/share/containers/storage --runroot /run/user/3001/containers --log-level debug --cgroup-manager systemd --tmpdir /run/user/3001/libpod/tmp --network-config-dir --network-backend netavark --volumepath /home/podman_basic_user/.local/share/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --events-backend file --syslog container cleanup bd5b5f2b8d741189048b1599b4adc1dccdae53554bfd8524a0bf13898af88553)" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Setting custom database backend: \"sqlite\"" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: 
time="2024-07-27T12:32:09-04:00" level=info msg="Using sqlite as database backend" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 left promiscuous mode Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="systemd-logind: Unknown object '/'." Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using graph driver overlay" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using graph root /home/podman_basic_user/.local/share/containers/storage" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using run root /run/user/3001/containers" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using static dir /home/podman_basic_user/.local/share/containers/storage/libpod" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using tmp dir /run/user/3001/libpod/tmp" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using volume path /home/podman_basic_user/.local/share/containers/storage/volumes" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using transient store: false" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com 
/usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Cached value indicated that metacopy is not being used" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Cached value indicated that native-diff is usable" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Initializing event backend file" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid 
executable found for OCI runtime youki: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\"" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=info msg="Setting parallel job count to 7" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Called cleanup.PersistentPostRunE(/usr/bin/podman --root /home/podman_basic_user/.local/share/containers/storage --runroot /run/user/3001/containers --log-level debug --cgroup-manager systemd --tmpdir 
/run/user/3001/libpod/tmp --network-config-dir --network-backend netavark --volumepath /home/podman_basic_user/.local/share/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --events-backend file --syslog container cleanup bd5b5f2b8d741189048b1599b4adc1dccdae53554bfd8524a0bf13898af88553)" Jul 27 12:32:09 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25641]: time="2024-07-27T12:32:09-04:00" level=debug msg="Shutting down engines" Jul 27 12:32:18 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: time="2024-07-27T12:32:18-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd1-httpd1 in 10 seconds, resorting to SIGKILL" Jul 27 12:32:18 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[25274]: conmon bf6d03679b77aabcc7d9 : container 25276 exited with status 137 Jul 27 12:32:18 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[25274]: conmon bf6d03679b77aabcc7d9 : Failed to open cgroups file: /sys/fs/cgroup/user.slice/user-3001.slice/user@3001.service/user.slice/user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice/libpod-bf6d03679b77aabcc7d9599cc9f8dddced3072899c1ce0de02814218a9088a32.scope/container/memory.events Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25685]: time="2024-07-27T12:32:19-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /home/podman_basic_user/.local/share/containers/storage --runroot /run/user/3001/containers --log-level debug --cgroup-manager systemd --tmpdir /run/user/3001/libpod/tmp --network-config-dir --network-backend netavark --volumepath /home/podman_basic_user/.local/share/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --events-backend file --syslog container cleanup bf6d03679b77aabcc7d9599cc9f8dddced3072899c1ce0de02814218a9088a32)" Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com 
/usr/bin/podman[25685]: time="2024-07-27T12:32:19-04:00" level=debug msg="Setting custom database backend: \"sqlite\"" Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25685]: time="2024-07-27T12:32:19-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25685]: time="2024-07-27T12:32:19-04:00" level=info msg="Using sqlite as database backend" Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25685]: time="2024-07-27T12:32:19-04:00" level=info msg="Received shutdown signal \"terminated\", terminating!" PID=25685 Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopping libpod-conmon-bf6d03679b77aabcc7d9599cc9f8dddced3072899c1ce0de02814218a9088a32.scope... ░░ Subject: A stop job for unit UNIT has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has begun execution. ░░ ░░ The job identifier is 88. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[25685]: time="2024-07-27T12:32:19-04:00" level=info msg="Invoking shutdown handler \"libpod\"" PID=25685 Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped libpod-conmon-bf6d03679b77aabcc7d9599cc9f8dddced3072899c1ce0de02814218a9088a32.scope. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 88 and the job result is done. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Removed slice cgroup user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 87 and the job result is done. 
Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice: Failed to open /run/user/3001/systemd/transient/user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice: No such file or directory Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: time="2024-07-27T12:32:19-04:00" level=error msg="Checking if infra needs to be stopped: removing pod 4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2 cgroup: Unit user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice not loaded." Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice: Failed to open /run/user/3001/systemd/transient/user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice: No such file or directory Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Pods stopped: Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: 4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2 Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Pods removed: Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Error: removing pod 4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2 cgroup: removing pod 4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2 cgroup: Unit user-libpod_pod_4441c6f3d70f7c015eb6f9c9222e486e49594f0047a38717abfd6bfd507ec0f2.slice not loaded. 
Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Secrets removed: Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Error: %!s() Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Volumes removed: Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Created slice cgroup user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 89. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started libcrun container. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 93. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started rootless-netns-97558de3.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 97. 
Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 entered promiscuous mode Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth0: link becomes ready Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered forwarding state Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started /usr/libexec/podman/aardvark-dns --config /run/user/3001/containers/networks/aardvark-dns -p 53 run. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 101. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started libcrun container. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 105. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started libcrun container. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 110. 
Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Pod: Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066 Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: Container: Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com podman[25631]: 7bf805548a794144af77c1d1181503babd6ce6686ca4ba623f4458466f8055f2 Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started A template for running K8s workloads via podman-kube-play. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 74. Jul 27 12:32:19 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[25626]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:32:20 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[25906]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None Jul 27 12:32:20 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26014]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:32:21 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26122]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:32:22 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26231]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:23 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26339]: ansible-file Invoked with 
path=/tmp/lsr_2r2po57x_podman/httpd2 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:32:23 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26446]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd2-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:32:24 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state. Jul 27 12:32:24 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state. 
Jul 27 12:32:24 ip-10-31-12-229.us-east-1.aws.redhat.com podman[26570]: 2024-07-27 12:32:24.825463549 -0400 EDT m=+0.460810648 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26691]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:25 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:32:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26798]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26905]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[26990]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/ansible-kubernetes.d/httpd2.yml owner=root group=0 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722097945.7948165-8817-249596236438368/.source.yml _original_basename=.jq7_mvzx follow=False checksum=b3561c8f986bfa70cd5476459ea6f2110d65a0a2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27097]: ansible-containers.podman.podman_play Invoked with state=started debug=True log_level=debug kube_file=/etc/containers/ansible-kubernetes.d/httpd2.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice cgroup machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice.
░░ Subject: A start job for unit machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice has finished successfully.
░░
░░ The job identifier is 1609.
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:26.95605532 -0400 EDT m=+0.088958376 container create afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 (image=localhost/podman-pause:5.1.2-1720678294, name=c904233a5e6a-infra, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.buildah.version=1.36.0)
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:26.963183181 -0400 EDT m=+0.096086222 pod create c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 (image=, name=httpd2)
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:26.966734432 -0400 EDT m=+0.099637650 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:26.997651581 -0400 EDT m=+0.130554625 container create 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.containers.autoupdate=registry, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, app=test)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0223] manager: (podman1): new Bridge device (/org/freedesktop/NetworkManager/Devices/3)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 entered promiscuous mode
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0363] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/4)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-udevd[27116]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth0: link becomes ready
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered forwarding state
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0460] device (veth0): carrier: link connected
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0462] device (podman1): carrier: link connected
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-udevd[27117]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0705] device (podman1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0716] device (podman1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0724] device (podman1): Activation: starting connection 'podman1' (5babca86-8d14-45b2-87bb-27d56fe6a6a3)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0728] device (podman1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0731] device (podman1): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0733] device (podman1): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.0737] device (podman1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Starting Network Manager Script Dispatcher Service...
░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit NetworkManager-dispatcher.service has begun execution.
░░
░░ The job identifier is 1614.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started Network Manager Script Dispatcher Service.
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit NetworkManager-dispatcher.service has finished successfully.
░░
░░ The job identifier is 1614.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.1167] device (podman1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.1171] device (podman1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097947.1178] device (podman1): Activation: successful, device activated.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started /usr/libexec/podman/aardvark-dns --config /run/containers/networks/aardvark-dns -p 53 run.
░░ Subject: A start job for unit run-rf635b93922984e84a00f3e304e18977c.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit run-rf635b93922984e84a00f3e304e18977c.scope has finished successfully.
░░
░░ The job identifier is 1678.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27209]: starting aardvark on a child with pid 27210
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: Successfully parsed config
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: Listen v4 ip {"podman-default-kube-network": [10.89.0.1]}
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: Listen v6 ip {}
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: Will Forward dns requests to udp://1.1.1.1:53
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: Starting listen on udp 10.89.0.1:53
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope.
░░ Subject: A start job for unit libpod-conmon-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-conmon-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope has finished successfully.
░░
░░ The job identifier is 1682.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27217]: conmon afcc88467a690a167268 : addr{sun_family=AF_UNIX, sun_path=/proc/self/fd/12/attach}
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27217]: conmon afcc88467a690a167268 : terminal_ctrl_fd: 12
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27217]: conmon afcc88467a690a167268 : winsz read side: 16, winsz write side: 17
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.fGaMyc.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit tmp-crun.fGaMyc.mount has successfully entered the 'dead' state.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container.
░░ Subject: A start job for unit libpod-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope has finished successfully.
░░
░░ The job identifier is 1687.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27217]: conmon afcc88467a690a167268 : container PID: 27219
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:27.3182544 -0400 EDT m=+0.451157635 container init afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 (image=localhost/podman-pause:5.1.2-1720678294, name=c904233a5e6a-infra, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.buildah.version=1.36.0)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:27.323054071 -0400 EDT m=+0.455957309 container start afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 (image=localhost/podman-pause:5.1.2-1720678294, name=c904233a5e6a-infra, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.buildah.version=1.36.0)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope.
░░ Subject: A start job for unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has finished successfully.
░░
░░ The job identifier is 1692.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27222]: conmon 3bb4f888f7fc283d2f9a : addr{sun_family=AF_UNIX, sun_path=/proc/self/fd/11/attach}
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27222]: conmon 3bb4f888f7fc283d2f9a : terminal_ctrl_fd: 11
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27222]: conmon 3bb4f888f7fc283d2f9a : winsz read side: 15, winsz write side: 16
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container.
░░ Subject: A start job for unit libpod-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has finished successfully.
░░
░░ The job identifier is 1697.
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27222]: conmon 3bb4f888f7fc283d2f9a : container PID: 27224
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:27.383518314 -0400 EDT m=+0.516421687 container init 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:27.38757565 -0400 EDT m=+0.520479049 container start 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27104]: 2024-07-27 12:32:27.394443653 -0400 EDT m=+0.527346715 pod start c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 (image=, name=httpd2)
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27097]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27097]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout: Pod: c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 Container: 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc
Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27097]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr:
time="2024-07-27T12:32:26-04:00" level=info msg="/usr/bin/podman filtering at log level debug"
time="2024-07-27T12:32:26-04:00" level=debug msg="Called kube.PersistentPreRunE(/usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml)"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
time="2024-07-27T12:32:26-04:00" level=info msg="Using sqlite as database backend"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using graph driver overlay"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using graph root /var/lib/containers/storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using run root /run/containers/storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using tmp dir /run/libpod"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using transient store: false"
time="2024-07-27T12:32:26-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:32:26-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:32:26-04:00" level=debug msg="Cached value indicated that metacopy is being used"
time="2024-07-27T12:32:26-04:00" level=debug msg="Cached value indicated that native-diff is not being used"
time="2024-07-27T12:32:26-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
time="2024-07-27T12:32:26-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true"
time="2024-07-27T12:32:26-04:00" level=debug msg="Initializing event backend journald"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime crun-vm
initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
time="2024-07-27T12:32:26-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
time="2024-07-27T12:32:26-04:00" level=info msg="Setting parallel job count to 7"
time="2024-07-27T12:32:26-04:00" level=debug msg="Successfully loaded network podman-default-kube-network: &{podman-default-kube-network 7f20b6d410693ba0d53098bbbd51b58cf4a1522ca416836f34fe03bcda7fa5b0 bridge podman1 2024-07-27 12:30:23.859968267 -0400 EDT [{{{10.89.0.0 ffffff00}} 10.89.0.1 }] [] false false true [] map[] map[] map[driver:host-local]}"
time="2024-07-27T12:32:26-04:00" level=debug msg="Successfully loaded 2 networks"
time="2024-07-27T12:32:26-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:26-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720678294\" ..."
time="2024-07-27T12:32:26-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3)"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Pod using bridge network mode"
time="2024-07-27T12:32:26-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice for parent machine.slice and name libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799"
time="2024-07-27T12:32:26-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice"
time="2024-07-27T12:32:26-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice"
time="2024-07-27T12:32:26-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:26-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720678294\" ..."
time="2024-07-27T12:32:26-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3)"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3"
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3"
time="2024-07-27T12:32:26-04:00" level=debug msg="using systemd mode: false"
time="2024-07-27T12:32:26-04:00" level=debug msg="setting container name c904233a5e6a-infra"
time="2024-07-27T12:32:26-04:00" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Allocated lock 1 for container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Cached value indicated that idmapped mounts for overlay are not supported"
time="2024-07-27T12:32:26-04:00" level=debug msg="Check for idmapped mounts support "
time="2024-07-27T12:32:26-04:00" level=debug msg="Created container \"afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Container \"afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2\" has work directory \"/var/lib/containers/storage/overlay-containers/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2/userdata\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Container \"afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2\" has run directory \"/run/containers/storage/overlay-containers/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2/userdata\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:26-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2024-07-27T12:32:26-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Pulling image quay.io/libpod/testimage:20210610 (policy: missing)"
time="2024-07-27T12:32:26-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:26-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2024-07-27T12:32:26-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:26-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2024-07-27T12:32:26-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:26-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2024-07-27T12:32:26-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2024-07-27T12:32:26-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2024-07-27T12:32:26-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2024-07-27T12:32:26-04:00" level=debug msg="using systemd mode: false"
time="2024-07-27T12:32:26-04:00" level=debug msg="adding container to pod httpd2"
time="2024-07-27T12:32:26-04:00" level=debug msg="setting container name httpd2-httpd2"
time="2024-07-27T12:32:26-04:00" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\""
time="2024-07-27T12:32:26-04:00" level=info msg="Sysctl net.ipv4.ping_group_range=0 0 ignored in containers.conf, since Network Namespace set to host"
time="2024-07-27T12:32:26-04:00" level=debug msg="Adding mount /proc"
time="2024-07-27T12:32:26-04:00" level=debug msg="Adding mount /dev"
time="2024-07-27T12:32:26-04:00" level=debug msg="Adding mount /dev/pts"
time="2024-07-27T12:32:26-04:00" level=debug msg="Adding mount /dev/mqueue"
time="2024-07-27T12:32:26-04:00" level=debug msg="Adding mount /sys"
time="2024-07-27T12:32:26-04:00" level=debug msg="Adding mount /sys/fs/cgroup"
time="2024-07-27T12:32:26-04:00" level=debug msg="Allocated lock 2 for container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc"
time="2024-07-27T12:32:26-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Created container \"3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Container \"3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc\" has work directory \"/var/lib/containers/storage/overlay-containers/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc/userdata\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Container \"3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc\" has run directory \"/run/containers/storage/overlay-containers/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc/userdata\""
time="2024-07-27T12:32:26-04:00" level=debug msg="Strongconnecting node afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2"
time="2024-07-27T12:32:26-04:00" level=debug msg="Pushed afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 onto stack"
time="2024-07-27T12:32:26-04:00" level=debug msg="Finishing node afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2. Popped afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 off stack"
time="2024-07-27T12:32:26-04:00" level=debug msg="Strongconnecting node 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc"
time="2024-07-27T12:32:26-04:00" level=debug msg="Pushed 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc onto stack"
time="2024-07-27T12:32:26-04:00" level=debug msg="Finishing node 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc. Popped 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc off stack"
time="2024-07-27T12:32:26-04:00" level=debug msg="overlay: mount_data=lowerdir=/var/lib/containers/storage/overlay/l/DVYPS57NAT2PVRUITZZJSWK23Y,upperdir=/var/lib/containers/storage/overlay/83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b/diff,workdir=/var/lib/containers/storage/overlay/83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b/work,nodev,metacopy=on,context=\"system_u:object_r:container_file_t:s0:c459,c881\""
time="2024-07-27T12:32:27-04:00" level=debug msg="Mounted container \"afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2\" at \"/var/lib/containers/storage/overlay/83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b/merged\""
time="2024-07-27T12:32:27-04:00" level=debug msg="Created root filesystem for container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 at /var/lib/containers/storage/overlay/83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b/merged"
time="2024-07-27T12:32:27-04:00" level=debug msg="Made network namespace at /run/netns/netns-3c95832c-65e9-4497-a783-ab962ad96ef4 for container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2"
[DEBUG netavark::network::validation] "Validating network namespace..."
[DEBUG netavark::commands::setup] "Setting up..."
[INFO netavark::firewall] Using iptables firewall driver [DEBUG netavark::network::bridge] Setup network podman-default-kube-network [DEBUG netavark::network::bridge] Container interface name: eth0 with IP addresses [10.89.0.2/24] [DEBUG netavark::network::bridge] Bridge name: podman1 with IP addresses [10.89.0.1/24] [DEBUG netavark::network::core_utils] Setting sysctl value for net.ipv4.ip_forward to 1 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv4/conf/podman1/rp_filter to 2 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv6/conf/eth0/autoconf to 0 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv4/conf/eth0/arp_notify to 1 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv4/conf/eth0/rp_filter to 2 [INFO netavark::network::netlink] Adding route (dest: 0.0.0.0/0 ,gw: 10.89.0.1, metric 100) [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK-4B9D9135B29BA created on table nat [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK_ISOLATION_2 created on table filter [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK_ISOLATION_3 created on table filter [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK_INPUT created on table filter [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK_FORWARD created on table filter [DEBUG netavark::firewall::varktables::helpers] rule -d 10.89.0.0/24 -j ACCEPT created on table nat and chain NETAVARK-4B9D9135B29BA [DEBUG netavark::firewall::varktables::helpers] rule ! 
-d 224.0.0.0/4 -j MASQUERADE created on table nat and chain NETAVARK-4B9D9135B29BA [DEBUG netavark::firewall::varktables::helpers] rule -s 10.89.0.0/24 -j NETAVARK-4B9D9135B29BA created on table nat and chain POSTROUTING [DEBUG netavark::firewall::varktables::helpers] rule -p udp -s 10.89.0.0/24 --dport 53 -j ACCEPT created on table filter and chain NETAVARK_INPUT [DEBUG netavark::firewall::varktables::helpers] rule -m conntrack --ctstate INVALID -j DROP created on table filter and chain NETAVARK_FORWARD [DEBUG netavark::firewall::varktables::helpers] rule -d 10.89.0.0/24 -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT created on table filter and chain NETAVARK_FORWARD [DEBUG netavark::firewall::varktables::helpers] rule -s 10.89.0.0/24 -j ACCEPT created on table filter and chain NETAVARK_FORWARD [DEBUG netavark::firewall::firewalld] Adding firewalld rules for network 10.89.0.0/24 [DEBUG netavark::firewall::firewalld] Adding subnet 10.89.0.0/24 to zone trusted as source [DEBUG netavark::network::core_utils] Setting sysctl value for net.ipv4.conf.podman1.route_localnet to 1 [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK-HOSTPORT-SETMARK created on table nat [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK-HOSTPORT-MASQ created on table nat [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK-DN-4B9D9135B29BA created on table nat [DEBUG netavark::firewall::varktables::helpers] chain NETAVARK-HOSTPORT-DNAT created on table nat [DEBUG netavark::firewall::varktables::helpers] rule -j MARK --set-xmark 0x2000/0x2000 created on table nat and chain NETAVARK-HOSTPORT-SETMARK [DEBUG netavark::firewall::varktables::helpers] rule -j MASQUERADE -m comment --comment 'netavark portfw masq mark' -m mark --mark 0x2000/0x2000 created on table nat and chain NETAVARK-HOSTPORT-MASQ [DEBUG netavark::firewall::varktables::helpers] rule -j NETAVARK-HOSTPORT-SETMARK -s 10.89.0.0/24 -p tcp --dport 15002 created on table nat and chain 
NETAVARK-DN-4B9D9135B29BA [DEBUG netavark::firewall::varktables::helpers] rule -j NETAVARK-HOSTPORT-SETMARK -s 127.0.0.1 -p tcp --dport 15002 created on table nat and chain NETAVARK-DN-4B9D9135B29BA [DEBUG netavark::firewall::varktables::helpers] rule -j DNAT -p tcp --to-destination 10.89.0.2:80 --destination-port 15002 created on table nat and chain NETAVARK-DN-4B9D9135B29BA [DEBUG netavark::firewall::varktables::helpers] rule -j NETAVARK-DN-4B9D9135B29BA -p tcp --dport 15002 -m comment --comment 'dnat name: podman-default-kube-network id: afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2' created on table nat and chain NETAVARK-HOSTPORT-DNAT [DEBUG netavark::firewall::varktables::helpers] rule -j NETAVARK-HOSTPORT-DNAT -m addrtype --dst-type LOCAL created on table nat and chain PREROUTING [DEBUG netavark::firewall::varktables::helpers] rule -j NETAVARK-HOSTPORT-DNAT -m addrtype --dst-type LOCAL created on table nat and chain OUTPUT [DEBUG netavark::dns::aardvark] Spawning aardvark server [DEBUG netavark::dns::aardvark] start aardvark-dns: ["systemd-run", "-q", "--scope", "/usr/libexec/podman/aardvark-dns", "--config", "/run/containers/networks/aardvark-dns", "-p", "53", "run"] [DEBUG netavark::commands::setup] { "podman-default-kube-network": StatusBlock { dns_search_domains: Some( [ "dns.podman", ], ), dns_server_ips: Some( [ 10.89.0.1, ], ), interfaces: Some( { "eth0": NetInterface { mac_address: "26:3b:f3:ac:71:fd", subnets: Some( [ NetAddress { gateway: Some( 10.89.0.1, ), ipnet: 10.89.0.2/24, }, ], ), }, }, ), }, } [DEBUG netavark::commands::setup] "Setup complete" time="2024-07-27T12:32:27-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription" time="2024-07-27T12:32:27-04:00" level=debug msg="Setting Cgroups for container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 to 
machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice:libpod:afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2" time="2024-07-27T12:32:27-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d" time="2024-07-27T12:32:27-04:00" level=debug msg="Workdir \"/\" resolved to host path \"/var/lib/containers/storage/overlay/83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b/merged\"" time="2024-07-27T12:32:27-04:00" level=debug msg="Created OCI spec for container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 at /var/lib/containers/storage/overlay-containers/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2/userdata/config.json" time="2024-07-27T12:32:27-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice for parent machine.slice and name libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799" time="2024-07-27T12:32:27-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice" time="2024-07-27T12:32:27-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice" time="2024-07-27T12:32:27-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog" time="2024-07-27T12:32:27-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 -u afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 -r /usr/bin/crun -b /var/lib/containers/storage/overlay-containers/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2/userdata -p /run/containers/storage/overlay-containers/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2/userdata/pidfile -n 
c904233a5e6a-infra --exit-dir /run/libpod/exits --persist-dir /run/libpod/persist/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 --full-attach -s -l journald --log-level debug --syslog --conmon-pidfile /run/containers/storage/overlay-containers/afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /var/lib/containers/storage --exit-command-arg --runroot --exit-command-arg /run/containers/storage --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg systemd --exit-command-arg --tmpdir --exit-command-arg /run/libpod --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg netavark --exit-command-arg --volumepath --exit-command-arg /var/lib/containers/storage/volumes --exit-command-arg --db-backend --exit-command-arg sqlite --exit-command-arg --transient-store=false --exit-command-arg --runtime --exit-command-arg crun --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --storage-opt --exit-command-arg overlay.mountopt=nodev,metacopy=on --exit-command-arg --events-backend --exit-command-arg journald --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2]" time="2024-07-27T12:32:27-04:00" level=info msg="Running conmon under slice machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice and unitName libpod-conmon-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope" time="2024-07-27T12:32:27-04:00" level=debug msg="Received: 27219" time="2024-07-27T12:32:27-04:00" level=info msg="Got Conmon PID as 27217" time="2024-07-27T12:32:27-04:00" level=debug msg="Created container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 in OCI 
runtime" time="2024-07-27T12:32:27-04:00" level=debug msg="Adding nameserver(s) from network status of '[\"10.89.0.1\"]'" time="2024-07-27T12:32:27-04:00" level=debug msg="Adding search domain(s) from network status of '[\"dns.podman\"]'" time="2024-07-27T12:32:27-04:00" level=debug msg="Starting container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 with command [/catatonit -P]" time="2024-07-27T12:32:27-04:00" level=debug msg="Started container afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2" time="2024-07-27T12:32:27-04:00" level=debug msg="overlay: mount_data=lowerdir=/var/lib/containers/storage/overlay/l/3WDTKE7TBSQ2L4AUK7DIBLPBKR,upperdir=/var/lib/containers/storage/overlay/37385b1296d2430128fb86fd05380b1a877dc672ada5c2c1f396e9da2d20e84b/diff,workdir=/var/lib/containers/storage/overlay/37385b1296d2430128fb86fd05380b1a877dc672ada5c2c1f396e9da2d20e84b/work,nodev,metacopy=on,context=\"system_u:object_r:container_file_t:s0:c459,c881\"" time="2024-07-27T12:32:27-04:00" level=debug msg="Mounted container \"3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc\" at \"/var/lib/containers/storage/overlay/37385b1296d2430128fb86fd05380b1a877dc672ada5c2c1f396e9da2d20e84b/merged\"" time="2024-07-27T12:32:27-04:00" level=debug msg="Created root filesystem for container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc at /var/lib/containers/storage/overlay/37385b1296d2430128fb86fd05380b1a877dc672ada5c2c1f396e9da2d20e84b/merged" time="2024-07-27T12:32:27-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription" time="2024-07-27T12:32:27-04:00" level=debug msg="Setting Cgroups for container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc to machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice:libpod:3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc" time="2024-07-27T12:32:27-04:00" level=debug 
msg="reading hooks from /usr/share/containers/oci/hooks.d" time="2024-07-27T12:32:27-04:00" level=debug msg="Workdir \"/var/www\" resolved to a volume or mount" time="2024-07-27T12:32:27-04:00" level=debug msg="Created OCI spec for container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc at /var/lib/containers/storage/overlay-containers/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc/userdata/config.json" time="2024-07-27T12:32:27-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice for parent machine.slice and name libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799" time="2024-07-27T12:32:27-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice" time="2024-07-27T12:32:27-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice" time="2024-07-27T12:32:27-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog" time="2024-07-27T12:32:27-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc -u 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc -r /usr/bin/crun -b /var/lib/containers/storage/overlay-containers/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc/userdata -p /run/containers/storage/overlay-containers/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc/userdata/pidfile -n httpd2-httpd2 --exit-dir /run/libpod/exits --persist-dir /run/libpod/persist/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc --full-attach -s -l journald --log-level debug --syslog --conmon-pidfile 
/run/containers/storage/overlay-containers/3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /var/lib/containers/storage --exit-command-arg --runroot --exit-command-arg /run/containers/storage --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg systemd --exit-command-arg --tmpdir --exit-command-arg /run/libpod --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg netavark --exit-command-arg --volumepath --exit-command-arg /var/lib/containers/storage/volumes --exit-command-arg --db-backend --exit-command-arg sqlite --exit-command-arg --transient-store=false --exit-command-arg --runtime --exit-command-arg crun --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --storage-opt --exit-command-arg overlay.mountopt=nodev,metacopy=on --exit-command-arg --events-backend --exit-command-arg journald --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc]" time="2024-07-27T12:32:27-04:00" level=info msg="Running conmon under slice machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice and unitName libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope" time="2024-07-27T12:32:27-04:00" level=debug msg="Received: 27224" time="2024-07-27T12:32:27-04:00" level=info msg="Got Conmon PID as 27222" time="2024-07-27T12:32:27-04:00" level=debug msg="Created container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc in OCI runtime" time="2024-07-27T12:32:27-04:00" level=debug msg="Starting container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc with command [/bin/busybox-extras httpd -f -p 80]" 
time="2024-07-27T12:32:27-04:00" level=debug msg="Started container 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc" time="2024-07-27T12:32:27-04:00" level=debug msg="Called kube.PersistentPostRunE(/usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml)" time="2024-07-27T12:32:27-04:00" level=debug msg="Shutting down engines" Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27097]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 0 Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27332]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None Jul 27 12:32:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading. Jul 27 12:32:28 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[27350]: /etc/rc.d/rc.local is not marked executable, skipping. Jul 27 12:32:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27472]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system enabled=True daemon_reload=False daemon_reexec=False no_block=False state=None force=None masked=None Jul 27 12:32:28 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading. Jul 27 12:32:28 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[27492]: /etc/rc.d/rc.local is not marked executable, skipping. Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27614]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice Slice /system/podman-kube. 
░░ Subject: A start job for unit system-podman\x2dkube.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit system-podman\x2dkube.slice has finished successfully. ░░ ░░ The job identifier is 1703. Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Starting A template for running K8s workloads via podman-kube-play... ░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution. ░░ ░░ The job identifier is 1702. Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:29.62383584 -0400 EDT m=+0.033616663 pod stop c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 (image=, name=httpd2) Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope has successfully entered the 'dead' state. 
Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27217]: conmon afcc88467a690a167268 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice/libpod-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope/container/memory.events Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:29.65822663 -0400 EDT m=+0.068007814 container died afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 (image=localhost/podman-pause:5.1.2-1720678294, name=c904233a5e6a-infra, io.buildah.version=1.36.0) Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: Received SIGHUP will refresh servers: 1 Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com aardvark-dns[27210]: No configuration found stopping the sever Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-rf635b93922984e84a00f3e304e18977c.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-rf635b93922984e84a00f3e304e18977c.scope has successfully entered the 'dead' state. 
Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 left promiscuous mode Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /var/lib/containers/storage --runroot /run/containers/storage --log-level debug --cgroup-manager systemd --tmpdir /run/libpod --network-config-dir --network-backend netavark --volumepath /var/lib/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --storage-opt overlay.mountopt=nodev,metacopy=on --events-backend journald --syslog container cleanup afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2)" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Setting custom database backend: \"sqlite\"" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=info msg="Using sqlite as database backend" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using graph driver overlay" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using graph root /var/lib/containers/storage" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using run root /run/containers/storage" 
Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using tmp dir /run/libpod" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using transient store: false" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Cached value indicated that metacopy is being used" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Cached value indicated that native-diff is not being used" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" 
level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Initializing event backend journald" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI 
runtime kata: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\"" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=info msg="Setting parallel job count to 7" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097949.7231] device (podman1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2d3c95832c\x2d65e9\x2d4497\x2da783\x2dab962ad96ef4.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2d3c95832c\x2d65e9\x2d4497\x2da783\x2dab962ad96ef4.mount has successfully entered the 'dead' state. Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.Cf1YTY.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.Cf1YTY.mount has successfully entered the 'dead' state. 
Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:29.944053013 -0400 EDT m=+0.353833701 container cleanup afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 (image=localhost/podman-pause:5.1.2-1720678294, name=c904233a5e6a-infra, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.buildah.version=1.36.0) Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Called cleanup.PersistentPostRunE(/usr/bin/podman --root /var/lib/containers/storage --runroot /run/containers/storage --log-level debug --cgroup-manager systemd --tmpdir /run/libpod --network-config-dir --network-backend netavark --volumepath /var/lib/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --storage-opt overlay.mountopt=nodev,metacopy=on --events-backend journald --syslog container cleanup afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2)" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27629]: time="2024-07-27T12:32:29-04:00" level=debug msg="Shutting down engines" Jul 27 12:32:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2.scope has successfully entered the 'dead' state. Jul 27 12:32:30 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b-merged.mount: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-83cb103a9de7ad353fd0f2688ab5da4f2595a5970f0f38ac37e366fb0c220a1b-merged.mount has successfully entered the 'dead' state. Jul 27 12:32:30 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: time="2024-07-27T12:32:39-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd2-httpd2 in 10 seconds, resorting to SIGKILL" Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has successfully entered the 'dead' state. 
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27222]: conmon 3bb4f888f7fc283d2f9a : container 27224 exited with status 137
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27222]: conmon 3bb4f888f7fc283d2f9a : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice/libpod-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope/container/memory.events
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.653213967 -0400 EDT m=+10.062994886 container died 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.Phcwvg.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit tmp-crun.Phcwvg.mount has successfully entered the 'dead' state.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /var/lib/containers/storage --runroot /run/containers/storage --log-level debug --cgroup-manager systemd --tmpdir /run/libpod --network-config-dir --network-backend netavark --volumepath /var/lib/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --storage-opt overlay.mountopt=nodev,metacopy=on --events-backend journald --syslog container cleanup 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc)"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-37385b1296d2430128fb86fd05380b1a877dc672ada5c2c1f396e9da2d20e84b-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-37385b1296d2430128fb86fd05380b1a877dc672ada5c2c1f396e9da2d20e84b-merged.mount has successfully entered the 'dead' state.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Setting custom database backend: \"sqlite\""
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=info msg="Using sqlite as database backend"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using graph driver overlay"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using graph root /var/lib/containers/storage"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using run root /run/containers/storage"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using tmp dir /run/libpod"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using transient store: false"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Cached value indicated that overlay is supported"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Cached value indicated that overlay is supported"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Cached value indicated that metacopy is being used"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Cached value indicated that native-diff is not being used"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Initializing event backend journald"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=info msg="Setting parallel job count to 7"
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.708577608 -0400 EDT m=+10.118358452 container cleanup 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopping libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope...
░░ Subject: A stop job for unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has begun execution.
░░
░░ The job identifier is 1773.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=info msg="Received shutdown signal \"terminated\", terminating!" PID=27676
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com /usr/bin/podman[27676]: time="2024-07-27T12:32:39-04:00" level=info msg="Invoking shutdown handler \"libpod\"" PID=27676
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has successfully entered the 'dead' state.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopped libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope.
░░ Subject: A stop job for unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit libpod-conmon-3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc.scope has finished.
░░
░░ The job identifier is 1773 and the job result is done.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Removed slice cgroup machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice.
░░ Subject: A stop job for unit machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice has finished.
░░
░░ The job identifier is 1772 and the job result is done.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.791403868 -0400 EDT m=+10.201184564 container remove 3bb4f888f7fc283d2f9a25d9d6091e2fa0698bff0816aef3fb9278b2b03bf9bc (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.826950245 -0400 EDT m=+10.236731093 container remove afcc88467a690a167268abd4061385e7596b316ffb69c8501ef0b7ad2f4754e2 (image=localhost/podman-pause:5.1.2-1720678294, name=c904233a5e6a-infra, pod_id=c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799, io.buildah.version=1.36.0)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice: Failed to open /run/systemd/transient/machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice: No such file or directory
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.837001605 -0400 EDT m=+10.246782294 pod remove c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 (image=, name=httpd2)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Pods stopped:
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Pods removed:
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Error: removing pod c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 cgroup: removing pod c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799 cgroup: Unit machine-libpod_pod_c904233a5e6a411a5b0cd113119363dcc08244961471679127e26fc7f5668799.slice not loaded.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Secrets removed:
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Error: %!s()
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Volumes removed:
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.865904349 -0400 EDT m=+10.275685452 container create 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice cgroup machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice.
░░ Subject: A start job for unit machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice has finished successfully.
░░
░░ The job identifier is 1774.
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.919440365 -0400 EDT m=+10.329221053 container create c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7 (image=localhost/podman-pause:5.1.2-1720678294, name=0333edc697be-infra, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.931626856 -0400 EDT m=+10.341407529 pod create 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9 (image=, name=httpd2)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.96445956 -0400 EDT m=+10.374240537 container create 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3 (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.964950787 -0400 EDT m=+10.374731789 container restart 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:32:39 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:39.937430155 -0400 EDT m=+10.347211342 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container.
░░ Subject: A start job for unit libpod-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666.scope has finished successfully.
░░
░░ The job identifier is 1778.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.038794318 -0400 EDT m=+10.448575223 container init 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.042678877 -0400 EDT m=+10.452459761 container start 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.0594] manager: (podman1): new Bridge device (/org/freedesktop/NetworkManager/Devices/5)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 entered promiscuous mode
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered forwarding state
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth0: link becomes ready
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.0931] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/6)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.0941] device (veth0): carrier: link connected
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.0944] device (podman1): carrier: link connected
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-udevd[27694]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-udevd[27693]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1160] device (podman1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1176] device (podman1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1192] device (podman1): Activation: starting connection 'podman1' (63cadcf3-8c0d-471e-b3d8-39c1ef709808)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1194] device (podman1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1204] device (podman1): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1211] device (podman1): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1218] device (podman1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Starting Network Manager Script Dispatcher Service...
░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit NetworkManager-dispatcher.service has begun execution.
░░
░░ The job identifier is 1783.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started Network Manager Script Dispatcher Service.
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit NetworkManager-dispatcher.service has finished successfully.
░░
░░ The job identifier is 1783.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1635] device (podman1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1638] device (podman1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097960.1644] device (podman1): Activation: successful, device activated.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started /usr/libexec/podman/aardvark-dns --config /run/containers/networks/aardvark-dns -p 53 run.
░░ Subject: A start job for unit run-rc5239bbd4bb64814889097c0b095abdd.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit run-rc5239bbd4bb64814889097c0b095abdd.scope has finished successfully.
░░
░░ The job identifier is 1847.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container.
░░ Subject: A start job for unit libpod-c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7.scope has finished successfully.
░░
░░ The job identifier is 1851.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.287177045 -0400 EDT m=+10.696957973 container init c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7 (image=localhost/podman-pause:5.1.2-1720678294, name=0333edc697be-infra, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.291170114 -0400 EDT m=+10.700950916 container start c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7 (image=localhost/podman-pause:5.1.2-1720678294, name=0333edc697be-infra, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container.
░░ Subject: A start job for unit libpod-6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3.scope has finished successfully.
░░
░░ The job identifier is 1856.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.338536993 -0400 EDT m=+10.748318145 container init 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3 (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.342523818 -0400 EDT m=+10.752304787 container start 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3 (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 2024-07-27 12:32:40.348684184 -0400 EDT m=+10.758464877 pod start 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9 (image=, name=httpd2)
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started A template for running K8s workloads via podman-kube-play.
░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished successfully.
░░
░░ The job identifier is 1702.
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Pod:
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: Container:
Jul 27 12:32:40 ip-10-31-12-229.us-east-1.aws.redhat.com podman[27618]: 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3
Jul 27 12:32:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[27892]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:32:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28000]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28109]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:43 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28217]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd3 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28324]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd3-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:45 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28448]: 2024-07-27 12:32:45.247024401 -0400 EDT m=+0.535824759 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28569]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28676]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28783]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:32:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28868]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/ansible-kubernetes.d/httpd3.yml owner=root group=0 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722097966.1948674-8956-160637284820936/.source.yml _original_basename=.rck8z22o follow=False checksum=357a4dee3ead538c9b0b23b7f6bad0dfb461c402 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[28975]: ansible-containers.podman.podman_play Invoked with state=started kube_file=/etc/containers/ansible-kubernetes.d/httpd3.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice cgroup machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice.
░░ Subject: A start job for unit machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice has finished successfully.
░░
░░ The job identifier is 1861.
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.307705136 -0400 EDT m=+0.074336331 container create a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5 (image=localhost/podman-pause:5.1.2-1720678294, name=d06e81f27e10-infra, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.36.0)
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.314087377 -0400 EDT m=+0.080718475 pod create d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f (image=, name=httpd3)
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.341693521 -0400 EDT m=+0.108324696 container create 91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z)
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.316825714 -0400 EDT m=+0.083456951 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth1 entered promiscuous mode
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth1: link becomes ready
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered forwarding state
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097967.3723] manager: (veth1): new Veth device (/org/freedesktop/NetworkManager/Devices/7)
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097967.3796] device (veth1): carrier: link connected
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-udevd[28995]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope.
░░ Subject: A start job for unit libpod-conmon-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-conmon-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope has finished successfully.
░░
░░ The job identifier is 1866.
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container.
░░ Subject: A start job for unit libpod-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit libpod-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope has finished successfully.
░░
░░ The job identifier is 1871.
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.525380472 -0400 EDT m=+0.292011755 container init a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5 (image=localhost/podman-pause:5.1.2-1720678294, name=d06e81f27e10-infra, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.36.0) Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.532760241 -0400 EDT m=+0.299391458 container start a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5 (image=localhost/podman-pause:5.1.2-1720678294, name=d06e81f27e10-infra, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.36.0) Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope. ░░ Subject: A start job for unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has finished successfully. ░░ ░░ The job identifier is 1876. Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container. ░░ Subject: A start job for unit libpod-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has finished successfully. ░░ ░░ The job identifier is 1881. 
Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.586556373 -0400 EDT m=+0.353187760 container init 91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.590512214 -0400 EDT m=+0.357143418 container start 91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:32:47 ip-10-31-12-229.us-east-1.aws.redhat.com podman[28982]: 2024-07-27 12:32:47.596842145 -0400 EDT m=+0.363473248 pod start d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f (image=, name=httpd3) Jul 27 12:32:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[29173]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None Jul 27 12:32:48 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading. Jul 27 12:32:48 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[29191]: /etc/rc.d/rc.local is not marked executable, skipping. 
Jul 27 12:32:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[29313]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system enabled=True daemon_reload=False daemon_reexec=False no_block=False state=None force=None masked=None Jul 27 12:32:48 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading. Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[29334]: /etc/rc.d/rc.local is not marked executable, skipping. Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[29455]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Starting A template for running K8s workloads via podman-kube-play... ░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution. ░░ ░░ The job identifier is 1886. Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:49.739768845 -0400 EDT m=+0.030946202 pod stop d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f (image=, name=httpd3) Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.T6fkwb.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.T6fkwb.mount has successfully entered the 'dead' state. Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope has successfully entered the 'dead' state. Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:49.766815764 -0400 EDT m=+0.057993143 container died a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5 (image=localhost/podman-pause:5.1.2-1720678294, name=d06e81f27e10-infra, io.buildah.version=1.36.0) Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.v3a1LS.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.v3a1LS.mount has successfully entered the 'dead' state. Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth1 left promiscuous mode Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:49.881245334 -0400 EDT m=+0.172422449 container cleanup a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5 (image=localhost/podman-pause:5.1.2-1720678294, name=d06e81f27e10-infra, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.36.0) Jul 27 12:32:49 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5.scope has successfully entered the 'dead' state. 
Jul 27 12:32:50 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state. Jul 27 12:32:50 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2d0a0f767b\x2d3815\x2d181e\x2d50b8\x2dcc9f05394d40.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2d0a0f767b\x2d3815\x2d181e\x2d50b8\x2dcc9f05394d40.mount has successfully entered the 'dead' state. Jul 27 12:32:50 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-e6ca1f02269c4e27e744017cc37b453bc0c2a10423bb94b4c52428e1176258e3-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-e6ca1f02269c4e27e744017cc37b453bc0c2a10423bb94b4c52428e1176258e3-merged.mount has successfully entered the 'dead' state. Jul 27 12:32:50 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5-userdata-shm.mount has successfully entered the 'dead' state. 
Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: time="2024-07-27T12:32:59-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd3-httpd3 in 10 seconds, resorting to SIGKILL" Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.HCBiMc.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.HCBiMc.mount has successfully entered the 'dead' state. Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has successfully entered the 'dead' state. Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:59.807739519 -0400 EDT m=+10.098916831 container died 91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:59.858840768 -0400 EDT m=+10.150017997 container cleanup 91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage) Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopping libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope... 
░░ Subject: A stop job for unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has begun execution. ░░ ░░ The job identifier is 1957. Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has successfully entered the 'dead' state. Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopped libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope. ░░ Subject: A stop job for unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit libpod-conmon-91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089.scope has finished. ░░ ░░ The job identifier is 1957 and the job result is done. Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Removed slice cgroup machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice. ░░ Subject: A stop job for unit machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice has finished. ░░ ░░ The job identifier is 1956 and the job result is done. 
Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:59.867628064 -0400 EDT m=+10.158805172 pod stop d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f (image=, name=httpd3) Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice: Failed to open /run/systemd/transient/machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice: No such file or directory Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: time="2024-07-27T12:32:59-04:00" level=error msg="Checking if infra needs to be stopped: removing pod d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f cgroup: Unit machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice not loaded." Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:59.905798629 -0400 EDT m=+10.196976060 container remove 91cbca44d77745a073ff57356579198f89820a0789289cf02353011f3e8eb089 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage) Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:59.935227514 -0400 EDT m=+10.226404645 container remove a499fdb33559053866c9b1eddc1b8872c990b2284d827de46d398bcb15acdad5 (image=localhost/podman-pause:5.1.2-1720678294, name=d06e81f27e10-infra, pod_id=d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f, io.buildah.version=1.36.0) Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice: Failed to open 
/run/systemd/transient/machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice: No such file or directory Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:32:59.94422623 -0400 EDT m=+10.235403338 pod remove d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f (image=, name=httpd3) Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Pods stopped: Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Pods removed: Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Error: removing pod d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f cgroup: removing pod d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f cgroup: Unit machine-libpod_pod_d06e81f27e109d8befb434a48b23264b083f955e3817a99c7737a5ad5d5d344f.slice not loaded. Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Secrets removed: Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Error: %!s() Jul 27 12:32:59 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Volumes removed: Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.050504933 -0400 EDT m=+10.341682044 container create 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice cgroup machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice. 
░░ Subject: A start job for unit machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice has finished successfully. ░░ ░░ The job identifier is 1958. Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.091305941 -0400 EDT m=+10.382483053 container create 2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7 (image=localhost/podman-pause:5.1.2-1720678294, name=23fad04f45a5-infra, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.098505007 -0400 EDT m=+10.389682221 pod create 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 (image=, name=httpd3) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.101362242 -0400 EDT m=+10.392539534 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610 Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.128704213 -0400 EDT m=+10.419881321 container create c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, created_by=test/system/build-testimage, io.buildah.version=1.21.0, app=test, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, created_at=2021-06-10T18:55:36Z) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 
12:33:00.129863671 -0400 EDT m=+10.421041006 container restart 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container. ░░ Subject: A start job for unit libpod-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d.scope has finished successfully. ░░ ░░ The job identifier is 1962. Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.175233893 -0400 EDT m=+10.466411435 container init 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.178995069 -0400 EDT m=+10.470172324 container start 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth1 entered promiscuous mode Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097980.2002] manager: (veth1): new Veth device 
(/org/freedesktop/NetworkManager/Devices/8) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth1: link becomes ready Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered forwarding state Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722097980.2042] device (veth1): carrier: link connected Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-udevd[29515]: Network interface NamePolicy= disabled on kernel command line. Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container. ░░ Subject: A start job for unit libpod-2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7.scope has finished successfully. ░░ ░░ The job identifier is 1967. 
Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.345067781 -0400 EDT m=+10.636245127 container init 2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7 (image=localhost/podman-pause:5.1.2-1720678294, name=23fad04f45a5-infra, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.348790804 -0400 EDT m=+10.639968227 container start 2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7 (image=localhost/podman-pause:5.1.2-1720678294, name=23fad04f45a5-infra, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started libcrun container. ░░ Subject: A start job for unit libpod-c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016.scope has finished successfully. ░░ ░░ The job identifier is 1972. 
Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.39985027 -0400 EDT m=+10.691027650 container init c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.40372514 -0400 EDT m=+10.694902385 container start c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 2024-07-27 12:33:00.410518376 -0400 EDT m=+10.701695620 pod start 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 (image=, name=httpd3) Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Pod: Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: Container: Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com podman[29459]: c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started A template for running K8s workloads via podman-kube-play. 
░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished successfully. ░░ ░░ The job identifier is 1886. Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.f54lvk.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.f54lvk.mount has successfully entered the 'dead' state. Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-536d27d99f9004a8cd5fc963cec31b69f39bd13eec1488b17190cf4450215db3-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-536d27d99f9004a8cd5fc963cec31b69f39bd13eec1488b17190cf4450215db3-merged.mount has successfully entered the 'dead' state. 
Jul 27 12:33:00 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[29692]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpxhgnorkqazskrgcapmcpcedjywvlcw ; /usr/bin/python3.9 /var/tmp/ansible-tmp-1722097980.8589134-9008-141686773631681/AnsiballZ_command.py' Jul 27 12:33:01 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[29692]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:01 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[29694]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd1 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:01 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-29702.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 115. 
Jul 27 12:33:01 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[29692]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:33:01 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[29819]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd2 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:01 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[29934]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd3 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:02 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[30049]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwpfcxnepqymoriudyyvuebgiuiabxgn ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722097982.1569126-9030-98283515000115/AnsiballZ_command.py'
Jul 27 12:33:02 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[30049]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:33:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30051]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --user list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd1[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:02 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[30049]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:33:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30161]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd2[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:03 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30271]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd3[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:03 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30381]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15001/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30489]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15002/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30597]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /tmp/lsr_2r2po57x_podman/httpd1-create _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30705]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /tmp/lsr_2r2po57x_podman/httpd2-create _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:05 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[30813]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /tmp/lsr_2r2po57x_podman/httpd3-create _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:07 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31028]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31141]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:33:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31249]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:10 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31358]: ansible-ansible.legacy.dnf Invoked with name=['firewalld'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jul 27 12:33:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31466]: ansible-systemd Invoked with name=firewalld masked=False daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Jul 27 12:33:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31575]: ansible-ansible.legacy.systemd Invoked with name=firewalld state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jul 27 12:33:13 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31684]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['15001-15003/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Jul 27 12:33:14 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31791]: ansible-ansible.legacy.dnf Invoked with name=['python3-libselinux', 'python3-policycoreutils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jul 27 12:33:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[31899]: ansible-ansible.legacy.dnf Invoked with name=['policycoreutils-python-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jul 27 12:33:17 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32007]: ansible-setup Invoked with filter=['ansible_selinux'] gather_subset=['all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jul 27 12:33:18 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32153]: ansible-fedora.linux_system_roles.local_seport Invoked with ports=['15001-15003'] proto=tcp setype=http_port_t state=present local=False ignore_selinux_state=False reload=True
Jul 27 12:33:19 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32260]: ansible-fedora.linux_system_roles.selinux_modules_facts Invoked
Jul 27 12:33:24 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32367]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=False service=None split=None
Jul 27 12:33:24 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32475]: ansible-getent Invoked with database=group key=3001 fail_key=False service=None split=None
Jul 27 12:33:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32583]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32692]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32800]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[32908]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33016]: ansible-ansible.legacy.command Invoked with creates=/var/lib/systemd/linger/podman_basic_user _raw_params=loginctl enable-linger podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None removes=None stdin=None
Jul 27 12:33:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33123]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd1 state=directory owner=podman_basic_user group=3001 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33230]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd1-create state=directory owner=podman_basic_user group=3001 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:28 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[33337]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrtddghvizboghotwabztwswimnwqmcq ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098008.80704-9383-81213612032775/AnsiballZ_podman_image.py'
Jul 27 12:33:28 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[33337]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:33:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-33340.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 119.
Jul 27 12:33:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-33347.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 123.
Jul 27 12:33:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-33354.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 127.
Jul 27 12:33:29 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-33361.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 131.
Jul 27 12:33:29 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[33337]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:33:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33475]: ansible-stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33584]: ansible-file Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d state=directory owner=podman_basic_user group=3001 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33691]: ansible-ansible.legacy.stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33745]: ansible-ansible.legacy.file Invoked with owner=podman_basic_user group=3001 mode=0644 dest=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _original_basename=.q3o678z2 recurse=False state=file path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[33852]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnczxunrardrqfcssdbcsxqkyliifgld ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098011.3888984-9419-159959416239066/AnsiballZ_podman_play.py'
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[33852]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33854]: ansible-containers.podman.podman_play Invoked with state=started debug=True log_level=debug kube_file=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-33861.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 135.
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Created slice cgroup user-libpod_pod_b0265840b5794cd2683857fe98c60ce0418994a3cfd679c18930ffb4543990cc.slice.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 139.
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33854]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /bin/podman play kube --start=true --log-level=debug /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33854]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout:
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33854]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr:
time="2024-07-27T12:33:31-04:00" level=info msg="/bin/podman filtering at log level debug"
time="2024-07-27T12:33:31-04:00" level=debug msg="Called kube.PersistentPreRunE(/bin/podman play kube --start=true --log-level=debug /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml)"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
time="2024-07-27T12:33:31-04:00" level=info msg="Using sqlite as database backend"
time="2024-07-27T12:33:31-04:00" level=debug msg="systemd-logind: Unknown object '/'."
time="2024-07-27T12:33:31-04:00" level=debug msg="Using graph driver overlay"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using graph root /home/podman_basic_user/.local/share/containers/storage"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using run root /run/user/3001/containers"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using static dir /home/podman_basic_user/.local/share/containers/storage/libpod"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using tmp dir /run/user/3001/libpod/tmp"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using volume path /home/podman_basic_user/.local/share/containers/storage/volumes"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using transient store: false"
time="2024-07-27T12:33:31-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
time="2024-07-27T12:33:31-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:33:31-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:33:31-04:00" level=debug msg="Cached value indicated that metacopy is not being used"
time="2024-07-27T12:33:31-04:00" level=debug msg="Cached value indicated that native-diff is usable"
time="2024-07-27T12:33:31-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false"
time="2024-07-27T12:33:31-04:00" level=debug msg="Initializing event backend file"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
time="2024-07-27T12:33:31-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
time="2024-07-27T12:33:31-04:00" level=info msg="Setting parallel job count to 7"
time="2024-07-27T12:33:31-04:00" level=debug msg="Successfully loaded network podman-default-kube-network: &{podman-default-kube-network 71b1a60f73e1e868b6ea683735758103546553888726eb48f91ae1a6ca2d9c5b bridge podman1 2024-07-27 12:32:05.92116415 -0400 EDT [{{{10.89.0.0 ffffff00}} 10.89.0.1 }] [] false false true [] map[] map[] map[driver:host-local]}"
time="2024-07-27T12:33:31-04:00" level=debug msg="Successfully loaded 2 networks"
time="2024-07-27T12:33:31-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:33:31-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:33:31-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720678294\" ..."
time="2024-07-27T12:33:31-04:00" level=debug msg="parsed reference into \"[overlay@/home/podman_basic_user/.local/share/containers/storage+/run/user/3001/containers]@0e807b318beb575f6f204f06ccab43013f496f1efa9354d0b186f422b369f7b0\""
time="2024-07-27T12:33:31-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:33:31-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage ([overlay@/home/podman_basic_user/.local/share/containers/storage+/run/user/3001/containers]@0e807b318beb575f6f204f06ccab43013f496f1efa9354d0b186f422b369f7b0)"
time="2024-07-27T12:33:31-04:00" level=debug msg="exporting opaque data as blob \"sha256:0e807b318beb575f6f204f06ccab43013f496f1efa9354d0b186f422b369f7b0\""
time="2024-07-27T12:33:31-04:00" level=debug msg="Pod using bridge network mode"
time="2024-07-27T12:33:31-04:00" level=debug msg="Created cgroup path user.slice/user-libpod_pod_b0265840b5794cd2683857fe98c60ce0418994a3cfd679c18930ffb4543990cc.slice for parent user.slice and name libpod_pod_b0265840b5794cd2683857fe98c60ce0418994a3cfd679c18930ffb4543990cc"
time="2024-07-27T12:33:31-04:00" level=debug msg="Created cgroup user.slice/user-libpod_pod_b0265840b5794cd2683857fe98c60ce0418994a3cfd679c18930ffb4543990cc.slice"
time="2024-07-27T12:33:31-04:00" level=debug msg="Got pod cgroup as user.slice/user-3001.slice/user@3001.service/user.slice/user-libpod_pod_b0265840b5794cd2683857fe98c60ce0418994a3cfd679c18930ffb4543990cc.slice"
Error: adding pod to state: name "httpd1" is in use: pod already exists
time="2024-07-27T12:33:31-04:00" level=debug msg="Shutting down engines"
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33854]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 125
Jul 27 12:33:31 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[33852]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:33:32 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[33975]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:33:33 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34083]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:33:33 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34191]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:34 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34300]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:35 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34408]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd2 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:35 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34515]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd2-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:36 ip-10-31-12-229.us-east-1.aws.redhat.com podman[34637]: 2024-07-27 12:33:36.834126392 -0400 EDT m=+0.436502131 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:33:37 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34759]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:37 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34868]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[34975]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35029]: ansible-ansible.legacy.file Invoked with owner=root group=0 mode=0644 dest=/etc/containers/ansible-kubernetes.d/httpd2.yml _original_basename=.6xjksi44 recurse=False state=file path=/etc/containers/ansible-kubernetes.d/httpd2.yml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35136]: ansible-containers.podman.podman_play Invoked with state=started debug=True log_level=debug kube_file=/etc/containers/ansible-kubernetes.d/httpd2.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice cgroup machine-libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743.slice.
░░ Subject: A start job for unit machine-libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit machine-libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743.slice has finished successfully.
░░
░░ The job identifier is 1977.
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35136]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35136]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout:
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35136]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr:
time="2024-07-27T12:33:38-04:00" level=info msg="/usr/bin/podman filtering at log level debug"
time="2024-07-27T12:33:38-04:00" level=debug msg="Called kube.PersistentPreRunE(/usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml)"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
time="2024-07-27T12:33:38-04:00" level=info msg="Using sqlite as database backend"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using graph driver overlay"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using graph root /var/lib/containers/storage"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using run root /run/containers/storage"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using tmp dir /run/libpod"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using transient store: false"
time="2024-07-27T12:33:38-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
time="2024-07-27T12:33:38-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:33:38-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:33:38-04:00" level=debug msg="Cached value indicated that metacopy is being used"
time="2024-07-27T12:33:38-04:00" level=debug msg="Cached value indicated that native-diff is not being used"
time="2024-07-27T12:33:38-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
time="2024-07-27T12:33:38-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true"
time="2024-07-27T12:33:38-04:00" level=debug msg="Initializing event backend journald"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
time="2024-07-27T12:33:38-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
time="2024-07-27T12:33:38-04:00" level=info msg="Setting parallel job count to 7"
time="2024-07-27T12:33:38-04:00" level=debug msg="Successfully loaded network podman-default-kube-network: &{podman-default-kube-network 7f20b6d410693ba0d53098bbbd51b58cf4a1522ca416836f34fe03bcda7fa5b0 bridge podman1 2024-07-27 12:30:23.859968267 -0400 EDT [{{{10.89.0.0 ffffff00}} 10.89.0.1 }] [] false false true [] map[] map[] map[driver:host-local]}"
time="2024-07-27T12:33:38-04:00" level=debug msg="Successfully loaded 2 networks"
time="2024-07-27T12:33:38-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:33:38-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:33:38-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720678294\" ..."
time="2024-07-27T12:33:38-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:33:38-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage"
time="2024-07-27T12:33:38-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720678294\" as \"localhost/podman-pause:5.1.2-1720678294\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3)"
time="2024-07-27T12:33:38-04:00" level=debug msg="exporting opaque data as blob \"sha256:cdd133fcb08165701150c9b626930a3f00e504cebf4548d2cf419252395ad3a3\""
time="2024-07-27T12:33:38-04:00" level=debug msg="Pod using bridge network mode"
time="2024-07-27T12:33:38-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743.slice for parent machine.slice and name libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743"
time="2024-07-27T12:33:38-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743.slice"
time="2024-07-27T12:33:38-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_fb72e13a1e887ac47b69f42bbe6428bf552038e72bbaf45c119262c9551f6743.slice"
Error: adding pod to state: name "httpd2" is in use: pod already exists
time="2024-07-27T12:33:38-04:00" level=debug msg="Shutting down engines"
Jul 27 12:33:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35136]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 125
Jul 27 12:33:39 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35256]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:33:40 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35364]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35473]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:33:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35581]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd3 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35688]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman/httpd3-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:43 ip-10-31-12-229.us-east-1.aws.redhat.com podman[35810]: 2024-07-27 12:33:43.815789139 -0400 EDT m=+0.602209546 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:33:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[35931]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36040]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36147]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:33:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36201]: ansible-ansible.legacy.file Invoked with owner=root group=0 mode=0644 dest=/etc/containers/ansible-kubernetes.d/httpd3.yml _original_basename=.x2cp5_lq recurse=False state=file path=/etc/containers/ansible-kubernetes.d/httpd3.yml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:33:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36308]: ansible-containers.podman.podman_play Invoked with state=started kube_file=/etc/containers/ansible-kubernetes.d/httpd3.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27
12:33:45 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Created slice cgroup machine-libpod_pod_63e1553facac05c8c1d67c45f694e1e49e011ff7e2b1577b1506ccc0a9b051eb.slice. ░░ Subject: A start job for unit machine-libpod_pod_63e1553facac05c8c1d67c45f694e1e49e011ff7e2b1577b1506ccc0a9b051eb.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit machine-libpod_pod_63e1553facac05c8c1d67c45f694e1e49e011ff7e2b1577b1506ccc0a9b051eb.slice has finished successfully. ░░ ░░ The job identifier is 1981. Jul 27 12:33:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[36428]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocoigutfqtvekmbpaoupozcpjgvlcsui ; /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098026.4230695-9672-73775375525050/AnsiballZ_command.py' Jul 27 12:33:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[36428]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36430]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd1 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:46 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-36439.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 143. 
Jul 27 12:33:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[36428]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36553]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd2 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36668]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd3 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:47 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[36782]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knavlbtsmmdlbqkgmkzkieaydyhyxqrp ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098027.6707408-9694-113719573933069/AnsiballZ_command.py' Jul 27 12:33:47 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[36782]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36784]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --user list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd1[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:47 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[36782]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[36894]: ansible-ansible.legacy.command Invoked with 
_raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd2[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37004]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd3[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37114]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15001/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37222]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15002/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None 
url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:50 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37330]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15003/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:52 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37545]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37658]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:33:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37766]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:55 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37875]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=False service=None split=None Jul 27 12:33:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[37983]: ansible-getent Invoked with database=group 
key=3001 fail_key=False service=None split=None Jul 27 12:33:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38091]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:57 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38200]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:57 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38308]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:58 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38416]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38524]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[38633]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sotixveccphabexqyjardnewfxenmlpx ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098039.336067-9891-274610954778057/AnsiballZ_systemd.py' Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[38633]: pam_unix(sudo:session): session opened for user 
podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38635]: ansible-systemd Invoked with name=podman-kube@-home-podman_basic_user-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service scope=user state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Reloading. Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopping A template for running K8s workloads via podman-kube-play... ░░ Subject: A stop job for unit UNIT has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has begun execution. ░░ ░░ The job identifier is 147. Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[25790]: conmon fc10fd8b20d610a9a93d : Failed to open cgroups file: /sys/fs/cgroup/user.slice/user-3001.slice/user@3001.service/user.slice/user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice/libpod-fc10fd8b20d610a9a93d43da2f4652d1ccb56dba94dbcedc3c7f62c4cce8d377.scope/container/memory.events Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 left promiscuous mode Jul 27 12:33:59 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: time="2024-07-27T12:34:10-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd1-httpd1 in 10 seconds, resorting to SIGKILL" Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[25796]: conmon 7bf805548a794144af77 : Failed to open cgroups file: 
/sys/fs/cgroup/user.slice/user-3001.slice/user@3001.service/user.slice/user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice/libpod-7bf805548a794144af77c1d1181503babd6ce6686ca4ba623f4458466f8055f2.scope/container/memory.events Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Removed slice cgroup user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 148 and the job result is done. Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice: Failed to open /run/user/3001/systemd/transient/user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice: No such file or directory Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: Pods stopped: Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066 Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: Pods removed: Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: Error: removing pod b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066 cgroup: removing pod b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066 cgroup: Unit user-libpod_pod_b1f83afa77f626592adeaf4b11b8c8fde3fe3cacc77ae51e7c555024910fb066.slice not loaded. 
Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: Secrets removed: Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: Error: %!s() Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com podman[38650]: Volumes removed: Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped A template for running K8s workloads via podman-kube-play. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 147 and the job result is done. Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[38633]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38829]: ansible-stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[38938]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fauxkumjbwoyzuqxlbyaheerehrjmjeb ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098050.7810347-9907-54776336517899/AnsiballZ_podman_play.py' Jul 27 12:34:10 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[38938]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38940]: ansible-containers.podman.podman_play Invoked with state=absent debug=True log_level=debug kube_file=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None 
tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38940]: ansible-containers.podman.podman_play version: 5.1.2, kube file /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-38947.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 149. Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38940]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /bin/podman kube play --down /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38940]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout: Pods stopped: Pods removed: Secrets removed: Volumes removed: Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38940]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr: Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[38940]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 0 Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[38938]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39062]: ansible-file Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[39169]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krohkscrqdrhlnzgzwagbauhcbnuobcu ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098051.6949046-9923-82802512226359/AnsiballZ_command.py' Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[39169]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:34:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39171]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:34:12 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-39172.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 153. 
Jul 27 12:34:12 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[39169]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:34:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39285]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None Jul 27 12:34:13 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39393]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:34:13 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39501]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:34:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39610]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:34:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39718]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None Jul 27 12:34:15 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading. Jul 27 12:34:15 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[39738]: /etc/rc.d/rc.local is not marked executable, skipping. Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopping A template for running K8s workloads via podman-kube-play... 
░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution. ░░ ░░ The job identifier is 1986. Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:16.091284639 -0400 EDT m=+0.032399316 pod stop 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9 (image=, name=httpd2) Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.tvu2SP.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.tvu2SP.mount has successfully entered the 'dead' state. Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7.scope has successfully entered the 'dead' state. Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:16.116459469 -0400 EDT m=+0.057574291 container died c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7 (image=localhost/podman-pause:5.1.2-1720678294, name=0333edc697be-infra, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0) Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.f0Q8Ap.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.f0Q8Ap.mount has successfully entered the 'dead' state. 
Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth0 left promiscuous mode Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:34:16 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:16.236475926 -0400 EDT m=+0.177590469 container cleanup c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7 (image=localhost/podman-pause:5.1.2-1720678294, name=0333edc697be-infra, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:34:17 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-d356ea4dbe9387ef997467eb4a93cba3486e1a2e2fdd32f2474ea26ed0d3341d-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-d356ea4dbe9387ef997467eb4a93cba3486e1a2e2fdd32f2474ea26ed0d3341d-merged.mount has successfully entered the 'dead' state. Jul 27 12:34:17 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2d9bd34587\x2d91a6\x2d138f\x2d1648\x2d5739cb15f2b2.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2d9bd34587\x2d91a6\x2d138f\x2d1648\x2d5739cb15f2b2.mount has successfully entered the 'dead' state. Jul 27 12:34:17 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7-userdata-shm.mount: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: time="2024-07-27T12:34:26-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd2-httpd2 in 10 seconds, resorting to SIGKILL" Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.rzi7mD.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.rzi7mD.mount has successfully entered the 'dead' state. Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3.scope has successfully entered the 'dead' state. 
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27782]: conmon 6e1f5612be551253f2fe : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice/libpod-6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3.scope/container/memory.events Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.149508411 -0400 EDT m=+10.090623235 container died 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3 (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.qMW270.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.qMW270.mount has successfully entered the 'dead' state. Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.197462822 -0400 EDT m=+10.138577488 container cleanup 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3 (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Removed slice cgroup machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice. 
░░ Subject: A stop job for unit machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice has finished.
░░
░░ The job identifier is 1988 and the job result is done.
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.272941688 -0400 EDT m=+10.214056348 container remove 6e1f5612be551253f2fe79871a7d9548668434308bbd846a6277503e79c2dcb3 (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.304687182 -0400 EDT m=+10.245801734 container remove c6e2144414a77982f865350c330feb65ebeafe6b100d99dcba676c3ae3e6dcc7 (image=localhost/podman-pause:5.1.2-1720678294, name=0333edc697be-infra, pod_id=0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice: Failed to open /run/systemd/transient/machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice: No such file or directory
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.314550824 -0400 EDT m=+10.255665364 pod remove 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9 (image=, name=httpd2)
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.318845055 -0400 EDT m=+10.259959855 container kill 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit libpod-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666.scope has successfully entered the 'dead' state.
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[27686]: conmon 63d4d0a11ecde0e4852d : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666.scope/container/memory.events
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.325802449 -0400 EDT m=+10.266917244 container died 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 2024-07-27 12:34:26.401509886 -0400 EDT m=+10.342624436 container remove 63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666 (image=localhost/podman-pause:5.1.2-1720678294, name=ec7e5af3b69e-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service)
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: Pods stopped:
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: Pods removed:
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: Error: removing pod 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9 cgroup: removing pod 0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9 cgroup: Unit machine-libpod_pod_0333edc697be4d98d762999054d7c0ff5cfb31a70f64421cc7923e6327db66a9.slice not loaded.
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: Secrets removed:
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: Error: %!s()
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com podman[39756]: Volumes removed:
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has successfully entered the 'dead' state.
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopped A template for running K8s workloads via podman-kube-play.
░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished.
░░
░░ The job identifier is 1986 and the job result is done.
Jul 27 12:34:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[39921]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-19805a002ba0855c535e4527c9f64000317aa4762aeed0d8dc1432b83c3e954d-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-19805a002ba0855c535e4527c9f64000317aa4762aeed0d8dc1432b83c3e954d-merged.mount has successfully entered the 'dead' state.
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-320f6585d76d041e1a84ae5e0fd228a66df066999c96402d5380545d4543e3f7-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-320f6585d76d041e1a84ae5e0fd228a66df066999c96402d5380545d4543e3f7-merged.mount has successfully entered the 'dead' state.
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666-userdata-shm.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay\x2dcontainers-63d4d0a11ecde0e4852d7ed91937adec3edcbbc368dee21956dd1ffd89e38666-userdata-shm.mount has successfully entered the 'dead' state.
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40030]: ansible-containers.podman.podman_play Invoked with state=absent debug=True log_level=debug kube_file=/etc/containers/ansible-kubernetes.d/httpd2.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40030]: ansible-containers.podman.podman_play version: 5.1.2, kube file /etc/containers/ansible-kubernetes.d/httpd2.yml
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40030]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman kube play --down /etc/containers/ansible-kubernetes.d/httpd2.yml
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40030]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout: Pods stopped: Pods removed: Secrets removed: Volumes removed:
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40030]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr:
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40030]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 0
Jul 27 12:34:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40150]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40257]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:29 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40371]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:34:29 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40479]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40588]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40696]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[40716]: /etc/rc.d/rc.local is not marked executable, skipping.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopping A template for running K8s workloads via podman-kube-play...
░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution.
░░
░░ The job identifier is 1989.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:31.806109207 -0400 EDT m=+0.032479857 pod stop 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 (image=, name=httpd3)
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.2sNu3U.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit tmp-crun.2sNu3U.mount has successfully entered the 'dead' state.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit libpod-2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7.scope has successfully entered the 'dead' state.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:31.831291215 -0400 EDT m=+0.057661969 container died 2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7 (image=localhost/podman-pause:5.1.2-1720678294, name=23fad04f45a5-infra, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service)
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.sKTUiE.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit tmp-crun.sKTUiE.mount has successfully entered the 'dead' state.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-rc5239bbd4bb64814889097c0b095abdd.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit run-rc5239bbd4bb64814889097c0b095abdd.scope has successfully entered the 'dead' state.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: device veth1 left promiscuous mode
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com NetworkManager[614]: [1722098071.8942] device (podman1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Starting Network Manager Script Dispatcher Service...
░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit NetworkManager-dispatcher.service has begun execution.
░░
░░ The job identifier is 1991.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Started Network Manager Script Dispatcher Service.
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit NetworkManager-dispatcher.service has finished successfully.
░░
░░ The job identifier is 1991.
Jul 27 12:34:31 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2d7258b6c7\x2d1195\x2dd974\x2d4d65\x2d0bbed5ea90aa.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit run-netns-netns\x2d7258b6c7\x2d1195\x2dd974\x2d4d65\x2d0bbed5ea90aa.mount has successfully entered the 'dead' state.
Jul 27 12:34:32 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:32.030705553 -0400 EDT m=+0.257076065 container cleanup 2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7 (image=localhost/podman-pause:5.1.2-1720678294, name=23fad04f45a5-infra, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0)
Jul 27 12:34:32 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-03dd982b3a8196841e05e6e56966423c1695a8c3a7112695f21554710e2b184f-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-03dd982b3a8196841e05e6e56966423c1695a8c3a7112695f21554710e2b184f-merged.mount has successfully entered the 'dead' state.
Jul 27 12:34:32 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7-userdata-shm.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay\x2dcontainers-2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7-userdata-shm.mount has successfully entered the 'dead' state.
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: time="2024-07-27T12:34:41-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd3-httpd3 in 10 seconds, resorting to SIGKILL"
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.yc1Tdt.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit tmp-crun.yc1Tdt.mount has successfully entered the 'dead' state.
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit libpod-c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016.scope has successfully entered the 'dead' state.
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:41.870762818 -0400 EDT m=+10.097133605 container died c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service)
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:41.922609186 -0400 EDT m=+10.148979698 container cleanup c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0)
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Removed slice cgroup machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice.
░░ Subject: A stop job for unit machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice has finished.
░░
░░ The job identifier is 2055 and the job result is done.
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state.
Jul 27 12:34:41 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:41.95622688 -0400 EDT m=+10.182597481 container remove c03c9747a3de31bde17b5fc07ff3e52768ce8c38d6ea62a0d1efc0335a30a016 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0)
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:42.010337631 -0400 EDT m=+10.236708265 container remove 2b27832b0d5b8477ee720890ce36e9d6a47ba00f9de9cea525112f09ae499aa7 (image=localhost/podman-pause:5.1.2-1720678294, name=23fad04f45a5-infra, pod_id=23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service)
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice: Failed to open /run/systemd/transient/machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice: No such file or directory
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:42.020069709 -0400 EDT m=+10.246440225 pod remove 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 (image=, name=httpd3)
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:42.024530379 -0400 EDT m=+10.250901034 container kill 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service)
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com conmon[29507]: conmon 32527d92ff1496aec930 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d.scope/container/memory.events
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: libpod-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit libpod-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d.scope has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:42.034084282 -0400 EDT m=+10.260455047 container died 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service)
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 2024-07-27 12:34:42.114386618 -0400 EDT m=+10.340757233 container remove 32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d (image=localhost/podman-pause:5.1.2-1720678294, name=df543e1580ae-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service)
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: Pods stopped:
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: Pods removed:
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: Error: removing pod 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 cgroup: removing pod 23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7 cgroup: Unit machine-libpod_pod_23fad04f45a51991a380a328a6a2f0ffaa2b75ae917fb0b9592e446e71bdf6d7.slice not loaded.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: Secrets removed:
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: Error: %!s()
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com podman[40734]: Volumes removed:
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopped A template for running K8s workloads via podman-kube-play.
░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished.
░░
░░ The job identifier is 1989 and the job result is done.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[40926]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: tmp-crun.aTwSo9.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit tmp-crun.aTwSo9.mount has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-a465fb4025cf473524c2ad165868699f7bd67e9d2eb9a070d215e04c315085e9-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-a465fb4025cf473524c2ad165868699f7bd67e9d2eb9a070d215e04c315085e9-merged.mount has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-815b9adb3f6bd9cf8fedcdf3dbb240a941b852d237bd1509e321d3cffdad3fd4-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-815b9adb3f6bd9cf8fedcdf3dbb240a941b852d237bd1509e321d3cffdad3fd4-merged.mount has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d-userdata-shm.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay\x2dcontainers-32527d92ff1496aec930848e159d7db10ac3d35b3467bd4fc6fd1bb370de549d-userdata-shm.mount has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41035]: ansible-containers.podman.podman_play Invoked with state=absent kube_file=/etc/containers/ansible-kubernetes.d/httpd3.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:34:43 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41035]: ansible-containers.podman.podman_play version: 5.1.2, kube file /etc/containers/ansible-kubernetes.d/httpd3.yml
Jul 27 12:34:43 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:43 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41155]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:43 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41262]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:43 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41376]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=True service=None split=None
Jul 27 12:34:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41484]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:45 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41593]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uygnptkbgppkiebownaqzkxfcetfhfut ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098085.4120166-10146-26933035321417/AnsiballZ_podman_container_info.py'
Jul 27 12:34:45 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41593]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41595]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Jul 27 12:34:45 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-41596.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 157.
Jul 27 12:34:45 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41593]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41710]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thnnloyspczaowblkxcgmqsayxmetozl ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098085.9641516-10154-146661791574890/AnsiballZ_command.py'
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41710]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41712]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-41713.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 161.
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41710]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41826]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uukzxcwqbcekriwwnrdnwdjxwpxsuono ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098086.4144602-10162-69763212571512/AnsiballZ_command.py'
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41826]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41828]: ansible-ansible.legacy.command Invoked with _raw_params=podman secret ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Started podman-41829.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 165.
Jul 27 12:34:46 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[41826]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[41942]: ansible-ansible.legacy.command Invoked with removes=/var/lib/systemd/linger/podman_basic_user _raw_params=loginctl disable-linger podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None stdin=None
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopping User Manager for UID 3001...
░░ Subject: A stop job for unit user@3001.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit user@3001.service has begun execution.
░░
░░ The job identifier is 2056.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Activating special unit Exit the Session...
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopping podman-pause-f6326b37.scope...
░░ Subject: A stop job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has begun execution.
░░
░░ The job identifier is 182.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Removed slice Slice /app/podman-kube.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 177 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Removed slice cgroup user-libpod_pod_b0265840b5794cd2683857fe98c60ce0418994a3cfd679c18930ffb4543990cc.slice.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 181 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped target Main User Target.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 173 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped target Basic System.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 172 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped target Paths.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 179 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped target Sockets.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 187 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped target Timers.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 176 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped Mark boot as successful after the user session has run 2 minutes.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 184 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped Daily Cleanup of User's Temporary Directories.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 186 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com dbus-broker[24566]: Dispatched 2321 messages @ 4(±14)μs / message.
░░ Subject: Dispatched 2321 messages
░░ Defined-By: dbus-broker
░░ Support: https://groups.google.com/forum/#!forum/bus1-devel
░░
░░ This message is printed by dbus-broker when shutting down. It includes metric
░░ information collected during the runtime of dbus-broker.
░░
░░ The message lists the number of dispatched messages
░░ (in this case 2321) as well as the mean time to
░░ handling a single message. The time measurements exclude the time spent on
░░ writing to and reading from the kernel.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopping D-Bus User Message Bus...
░░ Subject: A stop job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has begun execution.
░░
░░ The job identifier is 175.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped Create User's Volatile Files and Directories.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 185 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped D-Bus User Message Bus.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 175 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Stopped podman-pause-f6326b37.scope.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 182 and the job result is done.
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Removed slice Slice /user. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 180 and the job result is done. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: user.slice: Consumed 1.492s CPU time. ░░ Subject: Resources consumed by unit runtime ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit UNIT completed and consumed the indicated resources. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Closed D-Bus User Message Bus Socket. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 178 and the job result is done. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Removed slice User Application Slice. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 188 and the job result is done. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Reached target Shutdown. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 171. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Finished Exit the Session. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 170. 
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[24213]: Reached target Exit the Session. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 169. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: user@3001.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit user@3001.service has successfully entered the 'dead' state. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopped User Manager for UID 3001. ░░ Subject: A stop job for unit user@3001.service has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit user@3001.service has finished. ░░ ░░ The job identifier is 2056 and the job result is done. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: user@3001.service: Consumed 3.081s CPU time. ░░ Subject: Resources consumed by unit runtime ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit user@3001.service completed and consumed the indicated resources. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopping User Runtime Directory /run/user/3001... ░░ Subject: A stop job for unit user-runtime-dir@3001.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit user-runtime-dir@3001.service has begun execution. ░░ ░░ The job identifier is 2057. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: run-user-3001.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-user-3001.mount has successfully entered the 'dead' state. 
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: user-runtime-dir@3001.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit user-runtime-dir@3001.service has successfully entered the 'dead' state. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Stopped User Runtime Directory /run/user/3001. ░░ Subject: A stop job for unit user-runtime-dir@3001.service has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit user-runtime-dir@3001.service has finished. ░░ ░░ The job identifier is 2057 and the job result is done. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Removed slice User Slice of UID 3001. ░░ Subject: A stop job for unit user-3001.slice has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit user-3001.slice has finished. ░░ ░░ The job identifier is 2059 and the job result is done. Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: user-3001.slice: Consumed 3.106s CPU time. ░░ Subject: Resources consumed by unit runtime ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit user-3001.slice completed and consumed the indicated resources. 
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[42051]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xibbghgvwdleaszwjivpcpoxtfdsyfvt ; /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098087.3883593-10180-130411933652340/AnsiballZ_command.py'
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[42051]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42053]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod exists httpd1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:47 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[42051]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42167]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod exists httpd2 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42281]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod exists httpd3 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[42395]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dinqziaphdwtohqdprqjehgnrmkwjgxu ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.9 /var/tmp/ansible-tmp-1722098088.6110847-10202-102724840693194/AnsiballZ_command.py'
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[42395]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42397]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --user list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd1[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:48 ip-10-31-12-229.us-east-1.aws.redhat.com sudo[42395]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42507]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd2[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42617]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd3[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42727]: ansible-stat Invoked with path=/var/lib/systemd/linger/podman_basic_user follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:52 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[42941]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:52 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43055]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:34:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43163]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:34:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43271]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43380]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=False service=None split=None
Jul 27 12:34:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43488]: ansible-getent Invoked with database=group key=3001 fail_key=False service=None split=None
Jul 27 12:34:57 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43596]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:57 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43705]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:58 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43813]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[43921]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44029]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:00 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44136]: ansible-stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:00 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44243]: ansible-file Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:01 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44350]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44458]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44566]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:03 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44675]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44783]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None
Jul 27 12:35:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44892]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:05 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[44999]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:06 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45106]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:06 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45214]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:07 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45323]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45431]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None
Jul 27 12:35:09 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45540]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:09 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45647]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:10 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45754]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=True service=None split=None
Jul 27 12:35:10 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45862]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[45969]: ansible-file Invoked with path=/etc/containers/storage.conf state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46076]: ansible-file Invoked with path=/tmp/lsr_2r2po57x_podman state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:14 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46218]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 27 12:35:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46351]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:17 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46565]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:17 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46678]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:18 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46786]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:18 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[46894]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:20 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47003]: ansible-tempfile Invoked with state=directory prefix=lsr_podman_config_ suffix= path=None
Jul 27 12:35:20 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47110]: ansible-ansible.legacy.command Invoked with _raw_params=tar --ignore-failed-read -c -P -v -p -f /tmp/lsr_podman_config_8qpnebdr/backup.tar /etc/containers/containers.conf.d/50-systemroles.conf /etc/containers/registries.conf.d/50-systemroles.conf /etc/containers/storage.conf /etc/containers/policy.json _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:21 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47218]: ansible-user Invoked with name=user1 state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on ip-10-31-12-229.us-east-1.aws.redhat.com update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jul 27 12:35:21 ip-10-31-12-229.us-east-1.aws.redhat.com useradd[47220]: new group: name=user1, GID=3002
Jul 27 12:35:21 ip-10-31-12-229.us-east-1.aws.redhat.com useradd[47220]: new user: name=user1, UID=3002, GID=3002, home=/home/user1, shell=/bin/bash, from=/dev/pts/0
Jul 27 12:35:21 ip-10-31-12-229.us-east-1.aws.redhat.com rsyslogd[783]: imjournal: journal files changed, reloading... [v8.2310.0-4.el9 try https://www.rsyslog.com/e/0 ]
Jul 27 12:35:23 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47441]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:23 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47554]: ansible-getent Invoked with database=passwd key=user1 fail_key=False service=None split=None
Jul 27 12:35:24 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47662]: ansible-getent Invoked with database=group key=3002 fail_key=False service=None split=None
Jul 27 12:35:24 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47770]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:24 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47879]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[47987]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48095]: ansible-file Invoked with path=/home/user1/.config/containers/containers.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48202]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:26 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48287]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098126.232166-10902-165566967604309/.source.conf dest=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf owner=user1 mode=0644 follow=False _original_basename=toml.j2 checksum=94370d6e765779f1c58daf02f667b8f0b74d91f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48394]: ansible-file Invoked with path=/home/user1/.config/containers/registries.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:27 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48501]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48586]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098127.4847517-10926-205661664857682/.source.conf dest=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf owner=user1 mode=0644 follow=False _original_basename=toml.j2 checksum=dfb9cd7094a81b3d1bb06512cc9b49a09c75639b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48693]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:28 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48800]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:29 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48885]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098128.6270175-10950-253187412711286/.source.conf dest=/home/user1/.config/containers/storage.conf owner=user1 mode=0644 follow=False _original_basename=toml.j2 checksum=d08574b6a1df63dbe1c939ff0bcc7c0b61d03044 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:29 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[48992]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49099]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49206]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49291]: ansible-ansible.legacy.copy Invoked with dest=/home/user1/.config/containers/policy.json owner=user1 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722098130.2069187-10983-118981900026467/.source.json _original_basename=.0jw1ef5a follow=False checksum=6746c079ad563b735fc39f73d4876654b80b0a0d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49398]: ansible-stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49507]: ansible-stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:32 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49616]: ansible-stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:32 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49725]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:34 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[49941]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:35:34 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50054]: ansible-getent Invoked with database=group key=3002 fail_key=False service=None split=None Jul 27 12:35:35 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50162]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:35 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50271]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:35:36 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50379]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None 
chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:35:36 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50487]: ansible-file Invoked with path=/home/user1/.config/containers/containers.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:37 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50594]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Jul 27 12:35:37 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50648]: ansible-ansible.legacy.file Invoked with owner=user1 mode=0644 dest=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:37 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50755]: ansible-file Invoked with path=/home/user1/.config/containers/registries.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:38 ip-10-31-12-229.us-east-1.aws.redhat.com 
python3.9[50862]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Jul 27 12:35:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[50916]: ansible-ansible.legacy.file Invoked with owner=user1 mode=0644 dest=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:39 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51023]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:39 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51130]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Jul 27 12:35:39 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51184]: ansible-ansible.legacy.file Invoked with owner=user1 mode=0644 dest=/home/user1/.config/containers/storage.conf _original_basename=toml.j2 recurse=False state=file path=/home/user1/.config/containers/storage.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:40 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51291]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:35:40 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51398]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51507]: ansible-slurp Invoked with path=/home/user1/.config/containers/policy.json src=/home/user1/.config/containers/policy.json Jul 27 12:35:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51614]: ansible-stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51723]: ansible-stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51832]: ansible-stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:35:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[51941]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True 
get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52157]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52270]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:45 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52378]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52486]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52595]: ansible-file Invoked with path=/etc/containers/containers.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52702]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52787]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098146.98505-11311-157702852656223/.source.conf dest=/etc/containers/containers.conf.d/50-systemroles.conf owner=root mode=0644 follow=False _original_basename=toml.j2 checksum=94370d6e765779f1c58daf02f667b8f0b74d91f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[52894]: ansible-file Invoked with path=/etc/containers/registries.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53001]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:48 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53086]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098148.153968-11335-155670780427611/.source.conf dest=/etc/containers/registries.conf.d/50-systemroles.conf owner=root mode=0644 follow=False _original_basename=toml.j2 checksum=dfb9cd7094a81b3d1bb06512cc9b49a09c75639b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53193]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53300]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53385]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098149.3348846-11359-104175979065363/.source.conf dest=/etc/containers/storage.conf owner=root mode=0644 follow=False _original_basename=toml.j2 checksum=d08574b6a1df63dbe1c939ff0bcc7c0b61d03044 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:50 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53492]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:50 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53599]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:51 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53708]: ansible-slurp Invoked with path=/etc/containers/policy.json src=/etc/containers/policy.json
Jul 27 12:35:51 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53815]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:51 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[53902]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/policy.json owner=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722098151.2868085-11399-254438282606790/.source.json _original_basename=.mk9o5nx4 follow=False checksum=6746c079ad563b735fc39f73d4876654b80b0a0d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:52 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54009]: ansible-stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:52 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54118]: ansible-stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54227]: ansible-stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54336]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:55 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54552]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54665]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54773]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:57 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54882]: ansible-file Invoked with path=/etc/containers/containers.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:57 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[54989]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:58 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55043]: ansible-ansible.legacy.file Invoked with owner=root mode=0644 dest=/etc/containers/containers.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/etc/containers/containers.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:58 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55150]: ansible-file Invoked with path=/etc/containers/registries.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27
12:35:58 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55257]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55311]: ansible-ansible.legacy.file Invoked with owner=root mode=0644 dest=/etc/containers/registries.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/etc/containers/registries.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55418]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55525]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:36:00 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55579]: ansible-ansible.legacy.file Invoked with owner=root mode=0644 dest=/etc/containers/storage.conf _original_basename=toml.j2 recurse=False state=file path=/etc/containers/storage.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:00 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55686]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:01 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55793]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:01 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[55902]: ansible-slurp Invoked with path=/etc/containers/policy.json src=/etc/containers/policy.json
Jul 27 12:36:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56009]: ansible-stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56118]: ansible-stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56227]: ansible-stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:03 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56336]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:03 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56445]: ansible-slurp Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf src=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf
Jul 27 12:36:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56552]: ansible-slurp Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf src=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf
Jul 27 12:36:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56659]: ansible-slurp Invoked with path=/home/user1/.config/containers/storage.conf src=/home/user1/.config/containers/storage.conf
Jul 27 12:36:05 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56766]: ansible-slurp Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf src=/etc/containers/containers.conf.d/50-systemroles.conf
Jul 27 12:36:05 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56873]: ansible-slurp Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf src=/etc/containers/registries.conf.d/50-systemroles.conf
Jul 27 12:36:06 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[56980]: ansible-slurp Invoked with path=/etc/containers/storage.conf src=/etc/containers/storage.conf
Jul 27 12:36:06 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57087]: ansible-file Invoked with state=absent path=/etc/containers/containers.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:06 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57194]: ansible-file Invoked with state=absent path=/etc/containers/registries.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:07 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57301]: ansible-file Invoked with state=absent path=/etc/containers/storage.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:07 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57408]: ansible-file Invoked with state=absent path=/etc/containers/policy.json recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:07 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57515]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57622]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57729]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/storage.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57836]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/policy.json recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:09 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[57943]: ansible-ansible.legacy.command Invoked with _raw_params=tar xfvpP /tmp/lsr_podman_config_8qpnebdr/backup.tar _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:09 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58051]: ansible-file Invoked with state=absent path=/tmp/lsr_podman_config_8qpnebdr recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58193]: ansible-setup
Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 27 12:36:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58302]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:36:14 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58516]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:36:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58629]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None Jul 27 12:36:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58737]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:36:16 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58845]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:36:19 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[58989]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 27 12:36:22 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59122]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:36:24 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59336]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:36:24 
ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59449]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:36:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59557]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:25 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59665]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:30 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59809]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 27 12:36:31 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[59942]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:33 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60156]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:33 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60269]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:36:34 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60377]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:34 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60485]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:36 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60594]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:37 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60702]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60811]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:38 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[60918]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/nopull.container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:36:39 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61003]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098198.5436368-12426-233458295195767/.source.container dest=/etc/containers/systemd/nopull.container owner=root group=0 mode=0644 follow=False _original_basename=systemd.j2 checksum=670d64fc68a9768edb20cad26df2acc703542d85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61217]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:41 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61330]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:42 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61438]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61547]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:44 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61655]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:46 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1835587247-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay-metacopy\x2dcheck1835587247-merged.mount has successfully entered the 'dead' state.
Jul 27 12:36:46 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:36:46 ip-10-31-12-229.us-east-1.aws.redhat.com podman[61780]: 2024-07-27 12:36:46.49069517 -0400 EDT m=+0.134779378 image pull-error this_is_a_bogus_image:latest short-name resolution enforced but cannot prompt without a TTY
Jul 27 12:36:46 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[61893]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62000]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/bogus.container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:36:47 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62085]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098207.0700624-12590-149017880719930/.source.container dest=/etc/containers/systemd/bogus.container owner=root group=0 mode=0644 follow=False _original_basename=systemd.j2 checksum=1d087e679d135214e8ac9ccaf33b2222916efb7f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:49 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62299]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:50 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62412]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:50 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62520]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:52 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62629]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:53 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62737]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:54 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62846]: ansible-systemd Invoked with name=nopull.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Jul 27 12:36:54 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[62954]: ansible-stat Invoked with path=/etc/containers/systemd/nopull.container follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[63170]: ansible-file Invoked with path=/etc/containers/systemd/nopull.container state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:56 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[63277]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Jul 27 12:36:56 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading.
Jul 27 12:36:56 ip-10-31-12-229.us-east-1.aws.redhat.com quadlet-generator[63283]: Warning: bogus.container specifies the image "this_is_a_bogus_image" which not a fully qualified image name. This is not ideal for performance and security reasons. See the podman-pull manpage discussion of short-name-aliases.conf for details.
Jul 27 12:36:56 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[63295]: /etc/rc.d/rc.local is not marked executable, skipping.
Jul 27 12:36:57 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:36:59 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[63638]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:00 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[63751]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:37:00 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[63859]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:37:02 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[63968]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:37:03 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[64076]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:37:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[64185]: ansible-systemd Invoked with name=bogus.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Jul 27 12:37:04 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading.
Jul 27 12:37:04 ip-10-31-12-229.us-east-1.aws.redhat.com quadlet-generator[64193]: Warning: bogus.container specifies the image "this_is_a_bogus_image" which not a fully qualified image name. This is not ideal for performance and security reasons. See the podman-pull manpage discussion of short-name-aliases.conf for details.
Jul 27 12:37:04 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[64205]: /etc/rc.d/rc.local is not marked executable, skipping.
Jul 27 12:37:04 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[64327]: ansible-stat Invoked with path=/etc/containers/systemd/bogus.container follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:37:05 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[64543]: ansible-file Invoked with path=/etc/containers/systemd/bogus.container state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[64650]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Reloading.
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com systemd-rc-local-generator[64668]: /etc/rc.d/rc.local is not marked executable, skipping.
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Starting dnf makecache...
░░ Subject: A start job for unit dnf-makecache.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit dnf-makecache.service has begun execution.
░░
░░ The job identifier is 2060.
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Failed determining last makecache time.
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Beaker Client - RedHatEnterpriseLinux9     7.7 kB/s | 1.5 kB     00:00
Jul 27 12:37:06 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Beaker harness                              71 kB/s | 1.3 kB     00:00
Jul 27 12:37:07 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Copr repo for beakerlib-libraries owned by bgon  33 kB/s | 1.8 kB     00:00
Jul 27 12:37:07 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: CentOS Stream 9 - BaseOS                    45 kB/s | 5.8 kB     00:00
Jul 27 12:37:07 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: CentOS Stream 9 - AppStream                 44 kB/s | 5.9 kB     00:00
Jul 27 12:37:07 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: CentOS Stream 9 - Extras packages           56 kB/s | 6.3 kB     00:00
Jul 27 12:37:07 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Extra Packages for Enterprise Linux 9 openh264  22 kB/s |  993 B     00:00
Jul 27 12:37:07 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Extra Packages for Enterprise Linux 9 - Next -  90 kB/s |  27 kB     00:00
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Copr repo for qa-tools owned by lpol        27 kB/s | 1.8 kB     00:00
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com dnf[64684]: Metadata cache created.
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: dnf-makecache.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit dnf-makecache.service has successfully entered the 'dead' state.
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: Finished dnf makecache.
░░ Subject: A start job for unit dnf-makecache.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit dnf-makecache.service has finished successfully.
░░
░░ The job identifier is 2060.
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[64914]: ansible-user Invoked with name=user_quadlet_basic uid=1111 state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on ip-10-31-12-229.us-east-1.aws.redhat.com update_password=always group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com useradd[64916]: new group: name=user_quadlet_basic, GID=1111
Jul 27 12:37:08 ip-10-31-12-229.us-east-1.aws.redhat.com useradd[64916]: new user: name=user_quadlet_basic, UID=1111, GID=1111, home=/home/user_quadlet_basic, shell=/bin/bash, from=/dev/pts/0
Jul 27 12:37:10 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65136]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:10 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65249]: ansible-getent Invoked with database=passwd key=user_quadlet_basic fail_key=False service=None split=None
Jul 27 12:37:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65357]: ansible-getent Invoked with database=group key=1111 fail_key=False service=None split=None
Jul 27 12:37:11 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65465]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:37:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65574]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:12 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65682]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:13 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[65790]: ansible-ansible.legacy.command Invoked with _raw_params=set -x set -o pipefail exec 1>&2 #podman volume rm --all #podman network prune -f podman volume ls podman network ls podman secret ls podman container ls podman pod ls podman images systemctl list-units | grep quadlet _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:13 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:37:13 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:37:14 ip-10-31-12-229.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:37:15 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[66049]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:16 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[66162]: ansible-getent Invoked with database=group key=1111 fail_key=False service=None split=None
Jul 27 12:37:16 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[66270]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:37:17 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[66379]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:17 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[66487]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:37:19 ip-10-31-12-229.us-east-1.aws.redhat.com python3.9[66595]: ansible-ansible.legacy.command Invoked with _raw_params=journalctl -ex _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None

PLAY RECAP *********************************************************************
managed_node1              : ok=201  changed=6    unreachable=0    failed=2    skipped=297  rescued=2    ignored=0

Saturday 27 July 2024  12:37:19 -0400 (0:00:00.435)       0:00:49.468 *********
===============================================================================
fedora.linux_system_roles.podman : Gather the package facts ------------- 1.13s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:3
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.98s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Ensure container images are present --- 0.98s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.96s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.92s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.92s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.92s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Ensure quadlet file is present ------- 0.89s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
fedora.linux_system_roles.podman : Stop and disable service ------------- 0.83s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
fedora.linux_system_roles.podman : Refresh systemd ---------------------- 0.76s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
fedora.linux_system_roles.podman : Stop and disable service ------------- 0.76s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
fedora.linux_system_roles.podman : Refresh systemd ---------------------- 0.75s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
fedora.linux_system_roles.podman : Ensure quadlet file is present ------- 0.70s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
Create user for testing ------------------------------------------------- 0.70s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:109
Debug3 ------------------------------------------------------------------ 0.57s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:253
fedora.linux_system_roles.podman : Get podman version ------------------- 0.49s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22
fedora.linux_system_roles.podman : Ensure the quadlet directory is present --- 0.48s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
fedora.linux_system_roles.podman : Remove managed resource -------------- 0.47s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
fedora.linux_system_roles.podman : Get user information ----------------- 0.46s
/tmp/tmp.UJtmxR54QA/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2