[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. ansible-playbook [core 2.17.3] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-43F executable location = /usr/local/bin/ansible-playbook python version = 3.12.4 (main, Jun 7 2024, 00:00:00) [GCC 14.1.1 20240607 (Red Hat 14.1.1-5)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults running playbook inside collection fedora.linux_system_roles redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_stratis.yml **************************************************** 1 plays in /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml PLAY [Test stratis pool management] ******************************************** TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:2 Saturday 17 August 2024 19:29:59 -0400 (0:00:00.031) 0:00:00.031 ******* [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
ok: [managed_node2] TASK [Run the role] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:14 Saturday 17 August 2024 19:30:02 -0400 (0:00:03.370) 0:00:03.402 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:30:02 -0400 (0:00:00.078) 0:00:03.480 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:30:02 -0400 (0:00:00.093) 0:00:03.574 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:30:03 -0400 (0:00:00.137) 0:00:03.712 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:30:03 -0400 (0:00:00.108) 0:00:03.820 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:30:03 -0400 (0:00:00.737) 0:00:04.557 ******* ok: [managed_node2] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:30:03 -0400 (0:00:00.079) 0:00:04.637 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:30:03 -0400 (0:00:00.031) 0:00:04.669 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:30:04 -0400 (0:00:00.034) 0:00:04.703 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:30:04 -0400 (0:00:00.117) 0:00:04.820 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-fs libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:30:05 -0400 (0:00:01.768) 0:00:06.589 ******* ok: [managed_node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:30:05 -0400 (0:00:00.022) 0:00:06.611 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:30:05 -0400 (0:00:00.022) 0:00:06.634 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:30:06 -0400 (0:00:00.760) 0:00:07.394 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 17 August 2024 19:30:06 -0400 (0:00:00.144) 0:00:07.539 ******* skipping: [managed_node2] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "false_condition": "repo.packages | intersect(copr_packages) | length > 0", "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 17 August 2024 19:30:07 -0400 (0:00:00.181) 0:00:07.720 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 17 August 2024 19:30:07 -0400 (0:00:00.112) 0:00:07.833 ******* skipping: [managed_node2] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "false_condition": "repo.packages | intersect(copr_packages) | length > 0", "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:30:07 -0400 (0:00:00.121) 0:00:07.954 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:30:08 -0400 (0:00:01.513) 0:00:09.468 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "audit-rules.service": { "name": "audit-rules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.home1.service": { "name": "dbus-org.freedesktop.home1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd.service": { "name": "dhcpcd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dhcpcd@.service": { "name": "dhcpcd@.service", 
"source": "systemd", "state": "unknown", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fsidd.service": { "name": "fsidd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_mod.service": { "name": "modprobe@dm_mod.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@dm_multipath.service": { "name": "modprobe@dm_multipath.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", 
"source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@loop.service": { "name": "modprobe@loop.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "passim.service": { "name": "passim.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ssh-host-keys-migration.service": { "name": "ssh-host-keys-migration.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": 
"systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-battery-check.service": { "name": "systemd-battery-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-bsod.service": { "name": "systemd-bsod.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-confext.service": { "name": "systemd-confext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": 
"stopped", "status": "enabled-runtime" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume.service": { "name": "systemd-hibernate-resume.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd-wait-online@.service": { "name": "systemd-networkd-wait-online@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"systemd-pcrextend@.service": { "name": "systemd-pcrextend@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrlock-file-system.service": { "name": "systemd-pcrlock-file-system.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-code.service": { "name": "systemd-pcrlock-firmware-code.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-firmware-config.service": { "name": "systemd-pcrlock-firmware-config.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-machine-id.service": { "name": "systemd-pcrlock-machine-id.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-make-policy.service": { "name": "systemd-pcrlock-make-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-authority.service": { "name": "systemd-pcrlock-secureboot-authority.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrlock-secureboot-policy.service": { "name": "systemd-pcrlock-secureboot-policy.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-soft-reboot.service": { "name": "systemd-soft-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-storagetm.service": { "name": 
"systemd-storagetm.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-sysext@.service": { "name": "systemd-sysext@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev-early.service": { "name": "systemd-tmpfiles-setup-dev-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup-early.service": { "name": "systemd-tpm2-setup-early.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tpm2-setup.service": { "name": "systemd-tpm2-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, 
"systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:30:11 -0400 (0:00:02.605) 0:00:12.073 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:30:11 -0400 (0:00:00.114) 0:00:12.187 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:30:11 -0400 (0:00:00.035) 0:00:12.222 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.704) 0:00:12.927 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.107) 0:00:13.034 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937269.8527687, "attr_flags": "e", "attributes": [ "extents" ], 
"block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937269.8517687, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937269.8517687, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.463) 0:00:13.498 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.053) 0:00:13.552 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.024) 0:00:13.577 ******* ok: [managed_node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.032) 0:00:13.609 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.033) 0:00:13.643 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:30:12 -0400 (0:00:00.031) 0:00:13.674 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.054) 0:00:13.729 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.023) 0:00:13.752 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.046) 0:00:13.799 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.045) 0:00:13.844 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.021) 0:00:13.866 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.411) 0:00:14.278 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:30:13 -0400 (0:00:00.020) 0:00:14.298 ******* ok: [managed_node2] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:18 Saturday 17 August 2024 19:30:15 -0400 (0:00:02.280) 0:00:16.579 ******* ok: [managed_node2] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Gather package facts] **************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:25 Saturday 17 August 2024 19:30:15 -0400 (0:00:00.050) 0:00:16.629 ******* ok: [managed_node2] => { "ansible_facts": { "packages": { "ModemManager-glib": [ { "arch": "x86_64", "epoch": null, "name": "ModemManager-glib", "release": "3.fc40", "source": "rpm", "version": "1.22.0" } ], "NetworkManager": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager", "release": "1.fc40", "source": "rpm", "version": "1.46.2" } ], "NetworkManager-libnm": [ { "arch": "x86_64", "epoch": 1, "name": "NetworkManager-libnm", "release": "1.fc40", "source": "rpm", "version": "1.46.2" } ], "abattis-cantarell-vf-fonts": [ { "arch": "noarch", "epoch": null, "name": "abattis-cantarell-vf-fonts", "release": "12.fc40", "source": "rpm", "version": "0.301" } ], "alternatives": [ { "arch": "x86_64", "epoch": null, "name": "alternatives", "release": "1.fc40", "source": "rpm", "version": "1.27" } ], "amd-gpu-firmware": [ { "arch": "noarch", "epoch": null, "name": "amd-gpu-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "amd-ucode-firmware": [ { "arch": "noarch", "epoch": null, "name": "amd-ucode-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "aspell": [ { "arch": "x86_64", "epoch": 12, "name": "aspell", "release": "1.fc40", "source": "rpm", "version": "0.60.8.1" } ], "aspell-en": [ { "arch": "x86_64", "epoch": 50, "name": "aspell-en", "release": "10.fc40", "source": "rpm", "version": "2020.12.07" } ], "atheros-firmware": [ { "arch": "noarch", "epoch": null, "name": "atheros-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "audit": [ { "arch": "x86_64", "epoch": null, "name": "audit", "release": "1.fc40", "source": "rpm", "version": "4.0.1" } ], "audit-libs": [ { "arch": "x86_64", "epoch": null, "name": "audit-libs", "release": "1.fc40", "source": "rpm", "version": "4.0.1" } ], "audit-rules": [ { "arch": "x86_64", "epoch": null, "name": "audit-rules", "release": "1.fc40", "source": "rpm", "version": "4.0.1" } ], "authselect": [ { "arch": "x86_64", "epoch": null, "name": "authselect", "release": "5.fc40", "source": "rpm", "version": "1.5.0" } ], "authselect-libs": [ { "arch": "x86_64", "epoch": null, "name": "authselect-libs", "release": "5.fc40", "source": "rpm", "version": "1.5.0" } ], "avahi-libs": [ { "arch": "x86_64", "epoch": null, "name": "avahi-libs", "release": "26.fc40", "source": "rpm", "version": "0.8" } ], "basesystem": [ { "arch": "noarch", "epoch": null, "name": "basesystem", "release": "20.fc40", "source": "rpm", "version": "11" } ], "bash": [ { "arch": "x86_64", "epoch": null, "name": "bash", "release": "3.fc40", "source": "rpm", "version": "5.2.26" } ], "bc": [ { "arch": "x86_64", "epoch": null, "name": "bc", "release": "21.fc40", "source": "rpm", "version": "1.07.1" } ], "beakerlib": [ { "arch": "noarch", "epoch": null, "name": "beakerlib", "release": "1.fc40", "source": "rpm", "version": "1.31.2" } ], "beakerlib-redhat": [ { "arch": "noarch", "epoch": null, "name": "beakerlib-redhat", "release": "35.fc40eng", "source": "rpm", "version": "1" } ], "binutils": [ { "arch": "x86_64", "epoch": null, "name": "binutils", "release": "37.fc40", "source": "rpm", "version": "2.41" } ], "binutils-gold": [ { "arch": "x86_64", "epoch": null, "name": "binutils-gold", "release": "37.fc40", "source": "rpm", "version": "2.41" } ], "bison": [ { "arch": "x86_64", "epoch": null, "name": "bison", "release": "7.fc40", "source": "rpm", 
"version": "3.8.2" } ], "blivet-data": [ { "arch": "noarch", "epoch": 1, "name": "blivet-data", "release": "1.fc40", "source": "rpm", "version": "3.10.0" } ], "bluez": [ { "arch": "x86_64", "epoch": null, "name": "bluez", "release": "1.fc40", "source": "rpm", "version": "5.77" } ], "boost-atomic": [ { "arch": "x86_64", "epoch": null, "name": "boost-atomic", "release": "5.fc40", "source": "rpm", "version": "1.83.0" } ], "boost-filesystem": [ { "arch": "x86_64", "epoch": null, "name": "boost-filesystem", "release": "5.fc40", "source": "rpm", "version": "1.83.0" } ], "boost-system": [ { "arch": "x86_64", "epoch": null, "name": "boost-system", "release": "5.fc40", "source": "rpm", "version": "1.83.0" } ], "boost-thread": [ { "arch": "x86_64", "epoch": null, "name": "boost-thread", "release": "5.fc40", "source": "rpm", "version": "1.83.0" } ], "brcmfmac-firmware": [ { "arch": "noarch", "epoch": null, "name": "brcmfmac-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "btrfs-progs": [ { "arch": "x86_64", "epoch": null, "name": "btrfs-progs", "release": "1.fc40", "source": "rpm", "version": "6.9.2" } ], "bzip2-libs": [ { "arch": "x86_64", "epoch": null, "name": "bzip2-libs", "release": "18.fc40", "source": "rpm", "version": "1.0.8" } ], "c-ares": [ { "arch": "x86_64", "epoch": null, "name": "c-ares", "release": "1.fc40", "source": "rpm", "version": "1.28.1" } ], "ca-certificates": [ { "arch": "noarch", "epoch": null, "name": "ca-certificates", "release": "6.fc40", "source": "rpm", "version": "2023.2.62_v7.0.401" } ], "checkpolicy": [ { "arch": "x86_64", "epoch": null, "name": "checkpolicy", "release": "3.fc40", "source": "rpm", "version": "3.6" } ], "chrony": [ { "arch": "x86_64", "epoch": null, "name": "chrony", "release": "3.fc40", "source": "rpm", "version": "4.5" } ], "cirrus-audio-firmware": [ { "arch": "noarch", "epoch": null, "name": "cirrus-audio-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "clevis": [ { "arch": "x86_64", "epoch": null, "name": "clevis", "release": "2.fc40", "source": "rpm", "version": "20" } ], "clevis-luks": [ { "arch": "x86_64", "epoch": null, "name": "clevis-luks", "release": "2.fc40", "source": "rpm", "version": "20" } ], "clevis-pin-tpm2": [ { "arch": "x86_64", "epoch": null, "name": "clevis-pin-tpm2", "release": "5.fc40", "source": "rpm", "version": "0.5.3" } ], "cloud-init": [ { "arch": "noarch", "epoch": null, "name": "cloud-init", "release": "2.fc40", "source": "rpm", "version": "24.1.4" } ], "cloud-utils-growpart": [ { "arch": "noarch", "epoch": null, "name": "cloud-utils-growpart", "release": "7.fc40", "source": "rpm", "version": "0.33" } ], "cmake-filesystem": [ { "arch": "x86_64", "epoch": null, "name": "cmake-filesystem", "release": "1.fc40", "source": "rpm", "version": "3.28.2" } ], "coreutils": [ { "arch": "x86_64", "epoch": null, "name": "coreutils", "release": "7.fc40", "source": "rpm", "version": "9.4" } ], "coreutils-common": [ { "arch": "x86_64", "epoch": null, "name": "coreutils-common", "release": "7.fc40", "source": "rpm", "version": "9.4" } ], "cpio": [ { "arch": "x86_64", "epoch": null, "name": "cpio", "release": "1.fc40", "source": "rpm", "version": "2.15" } ], "cpp": [ { "arch": "x86_64", "epoch": null, "name": "cpp", "release": "1.fc40", "source": "rpm", "version": "14.2.1" } ], "cracklib": [ { "arch": "x86_64", "epoch": null, "name": "cracklib", "release": "5.fc40", "source": "rpm", "version": "2.9.11" } ], "cracklib-dicts": [ { "arch": "x86_64", "epoch": null, "name": 
"cracklib-dicts", "release": "5.fc40", "source": "rpm", "version": "2.9.11" } ], "crypto-policies": [ { "arch": "noarch", "epoch": null, "name": "crypto-policies", "release": "1.git28d3e2d.fc40", "source": "rpm", "version": "20240725" } ], "crypto-policies-scripts": [ { "arch": "noarch", "epoch": null, "name": "crypto-policies-scripts", "release": "1.git28d3e2d.fc40", "source": "rpm", "version": "20240725" } ], "cryptsetup": [ { "arch": "x86_64", "epoch": null, "name": "cryptsetup", "release": "1.fc40", "source": "rpm", "version": "2.7.4" } ], "cryptsetup-libs": [ { "arch": "x86_64", "epoch": null, "name": "cryptsetup-libs", "release": "1.fc40", "source": "rpm", "version": "2.7.4" } ], "curl": [ { "arch": "x86_64", "epoch": null, "name": "curl", "release": "10.fc40", "source": "rpm", "version": "8.6.0" } ], "cyrus-sasl-lib": [ { "arch": "x86_64", "epoch": null, "name": "cyrus-sasl-lib", "release": "19.fc40", "source": "rpm", "version": "2.1.28" } ], "dbus": [ { "arch": "x86_64", "epoch": 1, "name": "dbus", "release": "3.fc40", "source": "rpm", "version": "1.14.10" } ], "dbus-broker": [ { "arch": "x86_64", "epoch": null, "name": "dbus-broker", "release": "2.fc40", "source": "rpm", "version": "36" } ], "dbus-common": [ { "arch": "noarch", "epoch": 1, "name": "dbus-common", "release": "3.fc40", "source": "rpm", "version": "1.14.10" } ], "dbus-libs": [ { "arch": "x86_64", "epoch": 1, "name": "dbus-libs", "release": "3.fc40", "source": "rpm", "version": "1.14.10" } ], "default-fonts-core-sans": [ { "arch": "noarch", "epoch": null, "name": "default-fonts-core-sans", "release": "13.fc40", "source": "rpm", "version": "4.0" } ], "device-mapper": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper", "release": "1.fc40", "source": "rpm", "version": "1.02.197" } ], "device-mapper-event": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-event", "release": "1.fc40", "source": "rpm", "version": "1.02.197" } ], "device-mapper-event-libs": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-event-libs", "release": "1.fc40", "source": "rpm", "version": "1.02.197" } ], "device-mapper-libs": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-libs", "release": "1.fc40", "source": "rpm", "version": "1.02.197" } ], "device-mapper-multipath": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-multipath", "release": "7.fc40", "source": "rpm", "version": "0.9.7" } ], "device-mapper-multipath-libs": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-multipath-libs", "release": "7.fc40", "source": "rpm", "version": "0.9.7" } ], "device-mapper-persistent-data": [ { "arch": "x86_64", "epoch": null, "name": "device-mapper-persistent-data", "release": "1.fc40", "source": "rpm", "version": "1.0.12" } ], "dhcp-client": [ { "arch": "x86_64", "epoch": 12, "name": "dhcp-client", "release": "13.P1.fc40", "source": "rpm", "version": "4.4.3" } ], "dhcp-common": [ { "arch": "noarch", "epoch": 12, "name": "dhcp-common", "release": "13.P1.fc40", "source": "rpm", "version": "4.4.3" } ], "dhcpcd": [ { "arch": "x86_64", "epoch": null, "name": "dhcpcd", "release": "4.fc40", "source": "rpm", "version": "10.0.6" } ], "diffutils": [ { "arch": "x86_64", "epoch": null, "name": "diffutils", "release": "5.fc40", "source": "rpm", "version": "3.10" } ], "dnf": [ { "arch": "noarch", "epoch": null, "name": "dnf", "release": "1.fc40", "source": "rpm", "version": "4.21.0" } ], "dnf-data": [ { "arch": "noarch", "epoch": null, "name": "dnf-data", "release": "1.fc40", "source": "rpm", "version": 
"4.21.0" } ], "dnf-plugins-core": [ { "arch": "noarch", "epoch": null, "name": "dnf-plugins-core", "release": "1.fc40", "source": "rpm", "version": "4.8.0" } ], "dnf-utils": [ { "arch": "noarch", "epoch": null, "name": "dnf-utils", "release": "1.fc40", "source": "rpm", "version": "4.8.0" } ], "dosfstools": [ { "arch": "x86_64", "epoch": null, "name": "dosfstools", "release": "11.fc40", "source": "rpm", "version": "4.2" } ], "dracut": [ { "arch": "x86_64", "epoch": null, "name": "dracut", "release": "2.fc40", "source": "rpm", "version": "102" } ], "dracut-config-rescue": [ { "arch": "x86_64", "epoch": null, "name": "dracut-config-rescue", "release": "2.fc40", "source": "rpm", "version": "102" } ], "duktape": [ { "arch": "x86_64", "epoch": null, "name": "duktape", "release": "7.fc40", "source": "rpm", "version": "2.7.0" } ], "dyninst": [ { "arch": "x86_64", "epoch": null, "name": "dyninst", "release": "6.fc40", "source": "rpm", "version": "12.3.0" } ], "e2fsprogs": [ { "arch": "x86_64", "epoch": null, "name": "e2fsprogs", "release": "5.fc40", "source": "rpm", "version": "1.47.0" } ], "e2fsprogs-libs": [ { "arch": "x86_64", "epoch": null, "name": "e2fsprogs-libs", "release": "5.fc40", "source": "rpm", "version": "1.47.0" } ], "efivar-libs": [ { "arch": "x86_64", "epoch": null, "name": "efivar-libs", "release": "2.fc40", "source": "rpm", "version": "39" } ], "elfutils-debuginfod-client": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-debuginfod-client", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "elfutils-debuginfod-client-devel": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-debuginfod-client-devel", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "elfutils-default-yama-scope": [ { "arch": "noarch", "epoch": null, "name": "elfutils-default-yama-scope", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "elfutils-devel": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-devel", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "elfutils-libelf": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libelf", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "elfutils-libelf-devel": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libelf-devel", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "elfutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "elfutils-libs", "release": "4.fc40", "source": "rpm", "version": "0.191" } ], "exfatprogs": [ { "arch": "x86_64", "epoch": null, "name": "exfatprogs", "release": "1.fc40", "source": "rpm", "version": "1.2.4" } ], "expat": [ { "arch": "x86_64", "epoch": null, "name": "expat", "release": "1.fc40", "source": "rpm", "version": "2.6.2" } ], "fedora-gpg-keys": [ { "arch": "noarch", "epoch": null, "name": "fedora-gpg-keys", "release": "2", "source": "rpm", "version": "40" } ], "fedora-release": [ { "arch": "noarch", "epoch": null, "name": "fedora-release", "release": "39", "source": "rpm", "version": "40" } ], "fedora-release-common": [ { "arch": "noarch", "epoch": null, "name": "fedora-release-common", "release": "39", "source": "rpm", "version": "40" } ], "fedora-release-identity-basic": [ { "arch": "noarch", "epoch": null, "name": "fedora-release-identity-basic", "release": "39", "source": "rpm", "version": "40" } ], "fedora-repos": [ { "arch": "noarch", "epoch": null, "name": "fedora-repos", "release": "2", "source": "rpm", "version": "40" } ], "file": [ { "arch": "x86_64", "epoch": null, "name": "file", "release": "4.fc40", "source": 
"rpm", "version": "5.45" } ], "file-libs": [ { "arch": "x86_64", "epoch": null, "name": "file-libs", "release": "4.fc40", "source": "rpm", "version": "5.45" } ], "filesystem": [ { "arch": "x86_64", "epoch": null, "name": "filesystem", "release": "8.fc40", "source": "rpm", "version": "3.18" } ], "findutils": [ { "arch": "x86_64", "epoch": 1, "name": "findutils", "release": "9.fc40", "source": "rpm", "version": "4.9.0" } ], "firewalld": [ { "arch": "noarch", "epoch": null, "name": "firewalld", "release": "1.fc40", "source": "rpm", "version": "2.1.3" } ], "firewalld-filesystem": [ { "arch": "noarch", "epoch": null, "name": "firewalld-filesystem", "release": "1.fc40", "source": "rpm", "version": "2.1.3" } ], "flex": [ { "arch": "x86_64", "epoch": null, "name": "flex", "release": "16.fc40", "source": "rpm", "version": "2.6.4" } ], "fonts-filesystem": [ { "arch": "noarch", "epoch": 1, "name": "fonts-filesystem", "release": "14.fc40", "source": "rpm", "version": "2.0.5" } ], "fuse-libs": [ { "arch": "x86_64", "epoch": null, "name": "fuse-libs", "release": "21.fc40", "source": "rpm", "version": "2.9.9" } ], "fwupd": [ { "arch": "x86_64", "epoch": null, "name": "fwupd", "release": "1.fc40", "source": "rpm", "version": "1.9.21" } ], "fwupd-efi": [ { "arch": "x86_64", "epoch": null, "name": "fwupd-efi", "release": "1.fc40", "source": "rpm", "version": "1.6" } ], "fwupd-plugin-modem-manager": [ { "arch": "x86_64", "epoch": null, "name": "fwupd-plugin-modem-manager", "release": "1.fc40", "source": "rpm", "version": "1.9.21" } ], "fwupd-plugin-uefi-capsule-data": [ { "arch": "x86_64", "epoch": null, "name": "fwupd-plugin-uefi-capsule-data", "release": "1.fc40", "source": "rpm", "version": "1.9.21" } ], "gawk": [ { "arch": "x86_64", "epoch": null, "name": "gawk", "release": "3.fc40", "source": "rpm", "version": "5.3.0" } ], "gc": [ { "arch": "x86_64", "epoch": null, "name": "gc", "release": "6.fc40", "source": "rpm", "version": "8.2.2" } ], "gcc": [ { "arch": "x86_64", "epoch": null, "name": "gcc", "release": "1.fc40", "source": "rpm", "version": "14.2.1" } ], "gdbm": [ { "arch": "x86_64", "epoch": 1, "name": "gdbm", "release": "6.fc40", "source": "rpm", "version": "1.23" } ], "gdbm-libs": [ { "arch": "x86_64", "epoch": 1, "name": "gdbm-libs", "release": "6.fc40", "source": "rpm", "version": "1.23" } ], "gdisk": [ { "arch": "x86_64", "epoch": null, "name": "gdisk", "release": "1.fc40", "source": "rpm", "version": "1.0.10" } ], "gettext-envsubst": [ { "arch": "x86_64", "epoch": null, "name": "gettext-envsubst", "release": "2.fc40", "source": "rpm", "version": "0.22.5" } ], "gettext-libs": [ { "arch": "x86_64", "epoch": null, "name": "gettext-libs", "release": "2.fc40", "source": "rpm", "version": "0.22.5" } ], "gettext-runtime": [ { "arch": "x86_64", "epoch": null, "name": "gettext-runtime", "release": "2.fc40", "source": "rpm", "version": "0.22.5" } ], "git": [ { "arch": "x86_64", "epoch": null, "name": "git", "release": "2.fc40", "source": "rpm", "version": "2.45.2" } ], "git-core": [ { "arch": "x86_64", "epoch": null, "name": "git-core", "release": "2.fc40", "source": "rpm", "version": "2.45.2" } ], "git-core-doc": [ { "arch": "noarch", "epoch": null, "name": "git-core-doc", "release": "2.fc40", "source": "rpm", "version": "2.45.2" } ], "glib-networking": [ { "arch": "x86_64", "epoch": null, "name": "glib-networking", "release": "1.fc40", "source": "rpm", "version": "2.80.0" } ], "glib2": [ { "arch": "x86_64", "epoch": null, "name": "glib2", "release": "1.fc40", "source": "rpm", "version": "2.80.3" } 
], "glibc": [ { "arch": "x86_64", "epoch": null, "name": "glibc", "release": "17.fc40", "source": "rpm", "version": "2.39" } ], "glibc-common": [ { "arch": "x86_64", "epoch": null, "name": "glibc-common", "release": "17.fc40", "source": "rpm", "version": "2.39" } ], "glibc-devel": [ { "arch": "x86_64", "epoch": null, "name": "glibc-devel", "release": "17.fc40", "source": "rpm", "version": "2.39" } ], "glibc-gconv-extra": [ { "arch": "x86_64", "epoch": null, "name": "glibc-gconv-extra", "release": "17.fc40", "source": "rpm", "version": "2.39" } ], "glibc-headers-x86": [ { "arch": "noarch", "epoch": null, "name": "glibc-headers-x86", "release": "17.fc40", "source": "rpm", "version": "2.39" } ], "glibc-langpack-en": [ { "arch": "x86_64", "epoch": null, "name": "glibc-langpack-en", "release": "17.fc40", "source": "rpm", "version": "2.39" } ], "gmp": [ { "arch": "x86_64", "epoch": 1, "name": "gmp", "release": "8.fc40", "source": "rpm", "version": "6.2.1" } ], "gnupg2": [ { "arch": "x86_64", "epoch": null, "name": "gnupg2", "release": "1.fc40", "source": "rpm", "version": "2.4.4" } ], "gnutls": [ { "arch": "x86_64", "epoch": null, "name": "gnutls", "release": "1.fc40", "source": "rpm", "version": "3.8.6" } ], "gnutls-dane": [ { "arch": "x86_64", "epoch": null, "name": "gnutls-dane", "release": "1.fc40", "source": "rpm", "version": "3.8.6" } ], "gobject-introspection": [ { "arch": "x86_64", "epoch": null, "name": "gobject-introspection", "release": "1.fc40", "source": "rpm", "version": "1.80.1" } ], "google-noto-fonts-common": [ { "arch": "noarch", "epoch": null, "name": "google-noto-fonts-common", "release": "2.fc40", "source": "rpm", "version": "20240301" } ], "google-noto-sans-mono-vf-fonts": [ { "arch": "noarch", "epoch": null, "name": "google-noto-sans-mono-vf-fonts", "release": "2.fc40", "source": "rpm", "version": "20240301" } ], "google-noto-sans-vf-fonts": [ { "arch": "noarch", "epoch": null, "name": "google-noto-sans-vf-fonts", "release": "2.fc40", "source": "rpm", "version": "20240301" } ], "google-noto-serif-vf-fonts": [ { "arch": "noarch", "epoch": null, "name": "google-noto-serif-vf-fonts", "release": "2.fc40", "source": "rpm", "version": "20240301" } ], "gpg-pubkey": [ { "arch": null, "epoch": null, "name": "gpg-pubkey", "release": "63d04c2c", "source": "rpm", "version": "a15b79cc" } ], "gpgme": [ { "arch": "x86_64", "epoch": null, "name": "gpgme", "release": "3.fc40", "source": "rpm", "version": "1.23.2" } ], "gpm-libs": [ { "arch": "x86_64", "epoch": null, "name": "gpm-libs", "release": "46.fc40", "source": "rpm", "version": "1.20.7" } ], "grep": [ { "arch": "x86_64", "epoch": null, "name": "grep", "release": "7.fc40", "source": "rpm", "version": "3.11" } ], "groff-base": [ { "arch": "x86_64", "epoch": null, "name": "groff-base", "release": "6.fc40", "source": "rpm", "version": "1.23.0" } ], "grub2-common": [ { "arch": "noarch", "epoch": 1, "name": "grub2-common", "release": "123.fc40", "source": "rpm", "version": "2.06" } ], "grub2-pc": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-pc", "release": "123.fc40", "source": "rpm", "version": "2.06" } ], "grub2-pc-modules": [ { "arch": "noarch", "epoch": 1, "name": "grub2-pc-modules", "release": "123.fc40", "source": "rpm", "version": "2.06" } ], "grub2-tools": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-tools", "release": "123.fc40", "source": "rpm", "version": "2.06" } ], "grub2-tools-minimal": [ { "arch": "x86_64", "epoch": 1, "name": "grub2-tools-minimal", "release": "123.fc40", "source": "rpm", "version": "2.06" } ], 
"grubby": [ { "arch": "x86_64", "epoch": null, "name": "grubby", "release": "75.fc40", "source": "rpm", "version": "8.40" } ], "gsettings-desktop-schemas": [ { "arch": "x86_64", "epoch": null, "name": "gsettings-desktop-schemas", "release": "1.fc40", "source": "rpm", "version": "46.1" } ], "gssproxy": [ { "arch": "x86_64", "epoch": null, "name": "gssproxy", "release": "3.fc40", "source": "rpm", "version": "0.9.2" } ], "guile30": [ { "arch": "x86_64", "epoch": null, "name": "guile30", "release": "12.fc40", "source": "rpm", "version": "3.0.7" } ], "gzip": [ { "arch": "x86_64", "epoch": null, "name": "gzip", "release": "1.fc40", "source": "rpm", "version": "1.13" } ], "hostname": [ { "arch": "x86_64", "epoch": null, "name": "hostname", "release": "12.fc40", "source": "rpm", "version": "3.23" } ], "hunspell": [ { "arch": "x86_64", "epoch": null, "name": "hunspell", "release": "7.fc40", "source": "rpm", "version": "1.7.2" } ], "hunspell-en": [ { "arch": "noarch", "epoch": null, "name": "hunspell-en", "release": "9.fc40", "source": "rpm", "version": "0.20201207" } ], "hunspell-en-GB": [ { "arch": "noarch", "epoch": null, "name": "hunspell-en-GB", "release": "9.fc40", "source": "rpm", "version": "0.20201207" } ], "hunspell-en-US": [ { "arch": "noarch", "epoch": null, "name": "hunspell-en-US", "release": "9.fc40", "source": "rpm", "version": "0.20201207" } ], "hunspell-filesystem": [ { "arch": "x86_64", "epoch": null, "name": "hunspell-filesystem", "release": "7.fc40", "source": "rpm", "version": "1.7.2" } ], "ima-evm-utils": [ { "arch": "x86_64", "epoch": null, "name": "ima-evm-utils", "release": "4.fc40", "source": "rpm", "version": "1.5" } ], "inih": [ { "arch": "x86_64", "epoch": null, "name": "inih", "release": "1.fc40", "source": "rpm", "version": "58" } ], "initscripts-service": [ { "arch": "noarch", "epoch": null, "name": "initscripts-service", "release": "1.fc40", "source": "rpm", "version": "10.23" } ], "intel-audio-firmware": [ { "arch": "noarch", "epoch": null, "name": "intel-audio-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "intel-gpu-firmware": [ { "arch": "noarch", "epoch": null, "name": "intel-gpu-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "ipcalc": [ { "arch": "x86_64", "epoch": null, "name": "ipcalc", "release": "9.fc40", "source": "rpm", "version": "1.0.3" } ], "iproute": [ { "arch": "x86_64", "epoch": null, "name": "iproute", "release": "2.fc40", "source": "rpm", "version": "6.7.0" } ], "iptables-libs": [ { "arch": "x86_64", "epoch": null, "name": "iptables-libs", "release": "7.fc40", "source": "rpm", "version": "1.8.10" } ], "iptables-nft": [ { "arch": "x86_64", "epoch": null, "name": "iptables-nft", "release": "7.fc40", "source": "rpm", "version": "1.8.10" } ], "iputils": [ { "arch": "x86_64", "epoch": null, "name": "iputils", "release": "4.fc40", "source": "rpm", "version": "20240117" } ], "jansson": [ { "arch": "x86_64", "epoch": null, "name": "jansson", "release": "9.fc40", "source": "rpm", "version": "2.13.1" } ], "jitterentropy": [ { "arch": "x86_64", "epoch": null, "name": "jitterentropy", "release": "3.fc40", "source": "rpm", "version": "3.5.0" } ], "jose": [ { "arch": "x86_64", "epoch": null, "name": "jose", "release": "1.fc40", "source": "rpm", "version": "14" } ], "jq": [ { "arch": "x86_64", "epoch": null, "name": "jq", "release": "7.fc40", "source": "rpm", "version": "1.7.1" } ], "json-c": [ { "arch": "x86_64", "epoch": null, "name": "json-c", "release": "3.fc40", "source": "rpm", "version": "0.17" } 
], "json-glib": [ { "arch": "x86_64", "epoch": null, "name": "json-glib", "release": "3.fc40", "source": "rpm", "version": "1.8.0" } ], "kbd": [ { "arch": "x86_64", "epoch": null, "name": "kbd", "release": "3.fc40", "source": "rpm", "version": "2.6.4" } ], "kbd-legacy": [ { "arch": "noarch", "epoch": null, "name": "kbd-legacy", "release": "3.fc40", "source": "rpm", "version": "2.6.4" } ], "kbd-misc": [ { "arch": "noarch", "epoch": null, "name": "kbd-misc", "release": "3.fc40", "source": "rpm", "version": "2.6.4" } ], "kernel": [ { "arch": "x86_64", "epoch": null, "name": "kernel", "release": "200.fc40", "source": "rpm", "version": "6.9.12" } ], "kernel-core": [ { "arch": "x86_64", "epoch": null, "name": "kernel-core", "release": "200.fc40", "source": "rpm", "version": "6.9.12" } ], "kernel-devel": [ { "arch": "x86_64", "epoch": null, "name": "kernel-devel", "release": "200.fc40", "source": "rpm", "version": "6.9.12" } ], "kernel-headers": [ { "arch": "x86_64", "epoch": null, "name": "kernel-headers", "release": "200.fc40", "source": "rpm", "version": "6.9.4" } ], "kernel-modules": [ { "arch": "x86_64", "epoch": null, "name": "kernel-modules", "release": "200.fc40", "source": "rpm", "version": "6.9.12" } ], "kernel-modules-core": [ { "arch": "x86_64", "epoch": null, "name": "kernel-modules-core", "release": "200.fc40", "source": "rpm", "version": "6.9.12" } ], "keyutils": [ { "arch": "x86_64", "epoch": null, "name": "keyutils", "release": "3.fc40", "source": "rpm", "version": "1.6.3" } ], "keyutils-libs": [ { "arch": "x86_64", "epoch": null, "name": "keyutils-libs", "release": "3.fc40", "source": "rpm", "version": "1.6.3" } ], "kmod": [ { "arch": "x86_64", "epoch": null, "name": "kmod", "release": "5.fc40", "source": "rpm", "version": "31" } ], "kmod-libs": [ { "arch": "x86_64", "epoch": null, "name": "kmod-libs", "release": "5.fc40", "source": "rpm", "version": "31" } ], "kpartx": [ { "arch": "x86_64", "epoch": null, "name": "kpartx", "release": "7.fc40", "source": "rpm", "version": "0.9.7" } ], "krb5-libs": [ { "arch": "x86_64", "epoch": null, "name": "krb5-libs", "release": "1.fc40", "source": "rpm", "version": "1.21.3" } ], "langpacks-core-en": [ { "arch": "noarch", "epoch": null, "name": "langpacks-core-en", "release": "13.fc40", "source": "rpm", "version": "4.0" } ], "langpacks-en": [ { "arch": "noarch", "epoch": null, "name": "langpacks-en", "release": "13.fc40", "source": "rpm", "version": "4.0" } ], "langpacks-fonts-en": [ { "arch": "noarch", "epoch": null, "name": "langpacks-fonts-en", "release": "13.fc40", "source": "rpm", "version": "4.0" } ], "less": [ { "arch": "x86_64", "epoch": null, "name": "less", "release": "4.fc40", "source": "rpm", "version": "643" } ], "libacl": [ { "arch": "x86_64", "epoch": null, "name": "libacl", "release": "1.fc40", "source": "rpm", "version": "2.3.2" } ], "libaio": [ { "arch": "x86_64", "epoch": null, "name": "libaio", "release": "19.fc40", "source": "rpm", "version": "0.3.111" } ], "libarchive": [ { "arch": "x86_64", "epoch": null, "name": "libarchive", "release": "4.fc40", "source": "rpm", "version": "3.7.2" } ], "libassuan": [ { "arch": "x86_64", "epoch": null, "name": "libassuan", "release": "1.fc40", "source": "rpm", "version": "2.5.7" } ], "libatasmart": [ { "arch": "x86_64", "epoch": null, "name": "libatasmart", "release": "28.fc40", "source": "rpm", "version": "0.19" } ], "libattr": [ { "arch": "x86_64", "epoch": null, "name": "libattr", "release": "3.fc40", "source": "rpm", "version": "2.5.2" } ], "libb2": [ { "arch": "x86_64", "epoch": 
null, "name": "libb2", "release": "11.fc40", "source": "rpm", "version": "0.98.1" } ], "libbasicobjects": [ { "arch": "x86_64", "epoch": null, "name": "libbasicobjects", "release": "56.fc40", "source": "rpm", "version": "0.1.1" } ], "libblkid": [ { "arch": "x86_64", "epoch": null, "name": "libblkid", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "libblockdev": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-btrfs": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-btrfs", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-crypto": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-crypto", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-dm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-dm", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-fs": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-fs", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-loop": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-loop", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-lvm": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-lvm", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-mdraid": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-mdraid", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-mpath": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-mpath", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-nvme": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-nvme", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-part": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-part", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-swap": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-swap", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libblockdev-utils": [ { "arch": "x86_64", "epoch": null, "name": "libblockdev-utils", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "libbpf": [ { "arch": "x86_64", "epoch": 2, "name": "libbpf", "release": "3.fc40", "source": "rpm", "version": "1.2.0" } ], "libbrotli": [ { "arch": "x86_64", "epoch": null, "name": "libbrotli", "release": "3.fc40", "source": "rpm", "version": "1.1.0" } ], "libbytesize": [ { "arch": "x86_64", "epoch": null, "name": "libbytesize", "release": "3.fc40", "source": "rpm", "version": "2.10" } ], "libcap": [ { "arch": "x86_64", "epoch": null, "name": "libcap", "release": "8.fc40", "source": "rpm", "version": "2.69" } ], "libcap-ng": [ { "arch": "x86_64", "epoch": null, "name": "libcap-ng", "release": "4.fc40", "source": "rpm", "version": "0.8.4" } ], "libcbor": [ { "arch": "x86_64", "epoch": null, "name": "libcbor", "release": "1.fc40", "source": "rpm", "version": "0.11.0" } ], "libcollection": [ { "arch": "x86_64", "epoch": null, "name": "libcollection", "release": "56.fc40", "source": "rpm", "version": "0.7.0" } ], "libcom_err": [ { "arch": "x86_64", "epoch": null, "name": "libcom_err", "release": "5.fc40", "source": "rpm", "version": "1.47.0" } ], "libcomps": [ { "arch": "x86_64", "epoch": null, "name": "libcomps", "release": "5.fc40", "source": "rpm", "version": "0.1.20" } ], "libcurl": [ { "arch": "x86_64", "epoch": null, "name": "libcurl", "release": "10.fc40", 
"source": "rpm", "version": "8.6.0" } ], "libdhash": [ { "arch": "x86_64", "epoch": null, "name": "libdhash", "release": "56.fc40", "source": "rpm", "version": "0.5.0" } ], "libdnf": [ { "arch": "x86_64", "epoch": null, "name": "libdnf", "release": "1.fc40", "source": "rpm", "version": "0.73.2" } ], "libeconf": [ { "arch": "x86_64", "epoch": null, "name": "libeconf", "release": "2.fc40", "source": "rpm", "version": "0.6.2" } ], "libedit": [ { "arch": "x86_64", "epoch": null, "name": "libedit", "release": "51.20240517cvs.fc40", "source": "rpm", "version": "3.1" } ], "libev": [ { "arch": "x86_64", "epoch": null, "name": "libev", "release": "11.fc40", "source": "rpm", "version": "4.33" } ], "libevdev": [ { "arch": "x86_64", "epoch": null, "name": "libevdev", "release": "2.fc40", "source": "rpm", "version": "1.13.2" } ], "libevent": [ { "arch": "x86_64", "epoch": null, "name": "libevent", "release": "12.fc40", "source": "rpm", "version": "2.1.12" } ], "libfdisk": [ { "arch": "x86_64", "epoch": null, "name": "libfdisk", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "libffi": [ { "arch": "x86_64", "epoch": null, "name": "libffi", "release": "7.fc40", "source": "rpm", "version": "3.4.4" } ], "libfido2": [ { "arch": "x86_64", "epoch": null, "name": "libfido2", "release": "4.fc40", "source": "rpm", "version": "1.14.0" } ], "libfsverity": [ { "arch": "x86_64", "epoch": null, "name": "libfsverity", "release": "12.fc40", "source": "rpm", "version": "1.4" } ], "libgcc": [ { "arch": "x86_64", "epoch": null, "name": "libgcc", "release": "1.fc40", "source": "rpm", "version": "14.2.1" } ], "libgcrypt": [ { "arch": "x86_64", "epoch": null, "name": "libgcrypt", "release": "3.fc40", "source": "rpm", "version": "1.10.3" } ], "libgomp": [ { "arch": "x86_64", "epoch": null, "name": "libgomp", "release": "1.fc40", "source": "rpm", "version": "14.2.1" } ], "libgpg-error": [ { "arch": "x86_64", "epoch": null, "name": "libgpg-error", "release": "1.fc40", "source": "rpm", "version": "1.49" } ], "libgudev": [ { "arch": "x86_64", "epoch": null, "name": "libgudev", "release": "5.fc40", "source": "rpm", "version": "238" } ], "libgusb": [ { "arch": "x86_64", "epoch": null, "name": "libgusb", "release": "1.fc40", "source": "rpm", "version": "0.4.9" } ], "libidn2": [ { "arch": "x86_64", "epoch": null, "name": "libidn2", "release": "1.fc40", "source": "rpm", "version": "2.3.7" } ], "libini_config": [ { "arch": "x86_64", "epoch": null, "name": "libini_config", "release": "56.fc40", "source": "rpm", "version": "1.3.1" } ], "libjcat": [ { "arch": "x86_64", "epoch": null, "name": "libjcat", "release": "2.fc40", "source": "rpm", "version": "0.2.1" } ], "libjose": [ { "arch": "x86_64", "epoch": null, "name": "libjose", "release": "1.fc40", "source": "rpm", "version": "14" } ], "libkcapi": [ { "arch": "x86_64", "epoch": null, "name": "libkcapi", "release": "10.fc40", "source": "rpm", "version": "1.4.0" } ], "libkcapi-hmaccalc": [ { "arch": "x86_64", "epoch": null, "name": "libkcapi-hmaccalc", "release": "10.fc40", "source": "rpm", "version": "1.4.0" } ], "libksba": [ { "arch": "x86_64", "epoch": null, "name": "libksba", "release": "1.fc40", "source": "rpm", "version": "1.6.6" } ], "libldb": [ { "arch": "x86_64", "epoch": null, "name": "libldb", "release": "1.fc40", "source": "rpm", "version": "2.9.1" } ], "libluksmeta": [ { "arch": "x86_64", "epoch": null, "name": "libluksmeta", "release": "22.fc40", "source": "rpm", "version": "9" } ], "libmaxminddb": [ { "arch": "x86_64", "epoch": null, "name": "libmaxminddb", 
"release": "1.fc40", "source": "rpm", "version": "1.10.0" } ], "libmbim": [ { "arch": "x86_64", "epoch": null, "name": "libmbim", "release": "3.fc40", "source": "rpm", "version": "1.30.0" } ], "libmnl": [ { "arch": "x86_64", "epoch": null, "name": "libmnl", "release": "5.fc40", "source": "rpm", "version": "1.0.5" } ], "libmodulemd": [ { "arch": "x86_64", "epoch": null, "name": "libmodulemd", "release": "12.fc40", "source": "rpm", "version": "2.15.0" } ], "libmount": [ { "arch": "x86_64", "epoch": null, "name": "libmount", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "libmpc": [ { "arch": "x86_64", "epoch": null, "name": "libmpc", "release": "5.fc40", "source": "rpm", "version": "1.3.1" } ], "libndp": [ { "arch": "x86_64", "epoch": null, "name": "libndp", "release": "9.fc40", "source": "rpm", "version": "1.8" } ], "libnetfilter_conntrack": [ { "arch": "x86_64", "epoch": null, "name": "libnetfilter_conntrack", "release": "5.fc40", "source": "rpm", "version": "1.0.9" } ], "libnfnetlink": [ { "arch": "x86_64", "epoch": null, "name": "libnfnetlink", "release": "27.fc40", "source": "rpm", "version": "1.0.1" } ], "libnfsidmap": [ { "arch": "x86_64", "epoch": 1, "name": "libnfsidmap", "release": "0.rc6.fc40", "source": "rpm", "version": "2.6.4" } ], "libnftnl": [ { "arch": "x86_64", "epoch": null, "name": "libnftnl", "release": "5.fc40", "source": "rpm", "version": "1.2.6" } ], "libnghttp2": [ { "arch": "x86_64", "epoch": null, "name": "libnghttp2", "release": "3.fc40", "source": "rpm", "version": "1.59.0" } ], "libnl3": [ { "arch": "x86_64", "epoch": null, "name": "libnl3", "release": "1.fc40", "source": "rpm", "version": "3.10.0" } ], "libnsl2": [ { "arch": "x86_64", "epoch": null, "name": "libnsl2", "release": "1.fc40", "source": "rpm", "version": "2.0.1" } ], "libnvme": [ { "arch": "x86_64", "epoch": null, "name": "libnvme", "release": "1.fc40", "source": "rpm", "version": "1.8" } ], "libpath_utils": [ { "arch": "x86_64", "epoch": null, "name": "libpath_utils", "release": "56.fc40", "source": "rpm", "version": "0.2.1" } ], "libpipeline": [ { "arch": "x86_64", "epoch": null, "name": "libpipeline", "release": "5.fc40", "source": "rpm", "version": "1.5.7" } ], "libpkgconf": [ { "arch": "x86_64", "epoch": null, "name": "libpkgconf", "release": "1.fc40", "source": "rpm", "version": "2.1.1" } ], "libproxy": [ { "arch": "x86_64", "epoch": null, "name": "libproxy", "release": "1.fc40", "source": "rpm", "version": "0.5.5" } ], "libpsl": [ { "arch": "x86_64", "epoch": null, "name": "libpsl", "release": "3.fc40", "source": "rpm", "version": "0.21.5" } ], "libpwquality": [ { "arch": "x86_64", "epoch": null, "name": "libpwquality", "release": "9.fc40", "source": "rpm", "version": "1.4.5" } ], "libqmi": [ { "arch": "x86_64", "epoch": null, "name": "libqmi", "release": "5.fc40", "source": "rpm", "version": "1.34.0" } ], "libqrtr-glib": [ { "arch": "x86_64", "epoch": null, "name": "libqrtr-glib", "release": "5.fc40", "source": "rpm", "version": "1.2.2" } ], "libref_array": [ { "arch": "x86_64", "epoch": null, "name": "libref_array", "release": "56.fc40", "source": "rpm", "version": "0.1.5" } ], "librepo": [ { "arch": "x86_64", "epoch": null, "name": "librepo", "release": "1.fc40", "source": "rpm", "version": "1.18.0" } ], "libreport-filesystem": [ { "arch": "noarch", "epoch": null, "name": "libreport-filesystem", "release": "1.fc40", "source": "rpm", "version": "2.17.15" } ], "libseccomp": [ { "arch": "x86_64", "epoch": null, "name": "libseccomp", "release": "1.fc40", "source": "rpm", 
"version": "2.5.5" } ], "libselinux": [ { "arch": "x86_64", "epoch": null, "name": "libselinux", "release": "4.fc40", "source": "rpm", "version": "3.6" } ], "libselinux-utils": [ { "arch": "x86_64", "epoch": null, "name": "libselinux-utils", "release": "4.fc40", "source": "rpm", "version": "3.6" } ], "libsemanage": [ { "arch": "x86_64", "epoch": null, "name": "libsemanage", "release": "3.fc40", "source": "rpm", "version": "3.6" } ], "libsepol": [ { "arch": "x86_64", "epoch": null, "name": "libsepol", "release": "3.fc40", "source": "rpm", "version": "3.6" } ], "libsmartcols": [ { "arch": "x86_64", "epoch": null, "name": "libsmartcols", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "libsodium": [ { "arch": "x86_64", "epoch": null, "name": "libsodium", "release": "1.fc40", "source": "rpm", "version": "1.0.20" } ], "libsolv": [ { "arch": "x86_64", "epoch": null, "name": "libsolv", "release": "1.fc40", "source": "rpm", "version": "0.7.30" } ], "libsoup3": [ { "arch": "x86_64", "epoch": null, "name": "libsoup3", "release": "3.fc40", "source": "rpm", "version": "3.4.4" } ], "libss": [ { "arch": "x86_64", "epoch": null, "name": "libss", "release": "5.fc40", "source": "rpm", "version": "1.47.0" } ], "libssh": [ { "arch": "x86_64", "epoch": null, "name": "libssh", "release": "5.fc40", "source": "rpm", "version": "0.10.6" } ], "libssh-config": [ { "arch": "noarch", "epoch": null, "name": "libssh-config", "release": "5.fc40", "source": "rpm", "version": "0.10.6" } ], "libsss_certmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_certmap", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "libsss_idmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_idmap", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "libsss_nss_idmap": [ { "arch": "x86_64", "epoch": null, "name": "libsss_nss_idmap", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "libsss_sudo": [ { "arch": "x86_64", "epoch": null, "name": "libsss_sudo", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "libstdc++": [ { "arch": "x86_64", "epoch": null, "name": "libstdc++", "release": "1.fc40", "source": "rpm", "version": "14.2.1" } ], "libtalloc": [ { "arch": "x86_64", "epoch": null, "name": "libtalloc", "release": "1.fc40", "source": "rpm", "version": "2.4.2" } ], "libtasn1": [ { "arch": "x86_64", "epoch": null, "name": "libtasn1", "release": "6.fc40", "source": "rpm", "version": "4.19.0" } ], "libtdb": [ { "arch": "x86_64", "epoch": null, "name": "libtdb", "release": "1.fc40", "source": "rpm", "version": "1.4.10" } ], "libtevent": [ { "arch": "x86_64", "epoch": null, "name": "libtevent", "release": "1.fc40", "source": "rpm", "version": "0.16.1" } ], "libtirpc": [ { "arch": "x86_64", "epoch": null, "name": "libtirpc", "release": "0.fc40", "source": "rpm", "version": "1.3.5" } ], "libtool-ltdl": [ { "arch": "x86_64", "epoch": null, "name": "libtool-ltdl", "release": "10.fc40", "source": "rpm", "version": "2.4.7" } ], "libudisks2": [ { "arch": "x86_64", "epoch": null, "name": "libudisks2", "release": "5.fc40", "source": "rpm", "version": "2.10.1" } ], "libunistring": [ { "arch": "x86_64", "epoch": null, "name": "libunistring", "release": "7.fc40", "source": "rpm", "version": "1.1" } ], "libusb1": [ { "arch": "x86_64", "epoch": null, "name": "libusb1", "release": "2.fc40", "source": "rpm", "version": "1.0.27" } ], "libutempter": [ { "arch": "x86_64", "epoch": null, "name": "libutempter", "release": "13.fc40", "source": "rpm", "version": "1.2.1" } ], "libuuid": [ { 
"arch": "x86_64", "epoch": null, "name": "libuuid", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "libverto": [ { "arch": "x86_64", "epoch": null, "name": "libverto", "release": "8.fc40", "source": "rpm", "version": "0.3.2" } ], "libverto-libev": [ { "arch": "x86_64", "epoch": null, "name": "libverto-libev", "release": "8.fc40", "source": "rpm", "version": "0.3.2" } ], "libxcrypt": [ { "arch": "x86_64", "epoch": null, "name": "libxcrypt", "release": "5.fc40", "source": "rpm", "version": "4.4.36" } ], "libxcrypt-devel": [ { "arch": "x86_64", "epoch": null, "name": "libxcrypt-devel", "release": "5.fc40", "source": "rpm", "version": "4.4.36" } ], "libxkbcommon": [ { "arch": "x86_64", "epoch": null, "name": "libxkbcommon", "release": "2.fc40", "source": "rpm", "version": "1.6.0" } ], "libxml2": [ { "arch": "x86_64", "epoch": null, "name": "libxml2", "release": "1.fc40", "source": "rpm", "version": "2.12.8" } ], "libxmlb": [ { "arch": "x86_64", "epoch": null, "name": "libxmlb", "release": "2.fc40", "source": "rpm", "version": "0.3.19" } ], "libxslt": [ { "arch": "x86_64", "epoch": null, "name": "libxslt", "release": "1.fc40", "source": "rpm", "version": "1.1.42" } ], "libyaml": [ { "arch": "x86_64", "epoch": null, "name": "libyaml", "release": "14.fc40", "source": "rpm", "version": "0.2.5" } ], "libzstd": [ { "arch": "x86_64", "epoch": null, "name": "libzstd", "release": "1.fc40", "source": "rpm", "version": "1.5.6" } ], "libzstd-devel": [ { "arch": "x86_64", "epoch": null, "name": "libzstd-devel", "release": "1.fc40", "source": "rpm", "version": "1.5.6" } ], "linux-firmware": [ { "arch": "noarch", "epoch": null, "name": "linux-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "linux-firmware-whence": [ { "arch": "noarch", "epoch": null, "name": "linux-firmware-whence", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "lmdb-libs": [ { "arch": "x86_64", "epoch": null, "name": "lmdb-libs", "release": "1.fc40", "source": "rpm", "version": "0.9.33" } ], "lsof": [ { "arch": "x86_64", "epoch": null, "name": "lsof", "release": "4.fc40", "source": "rpm", "version": "4.98.0" } ], "lua-libs": [ { "arch": "x86_64", "epoch": null, "name": "lua-libs", "release": "5.fc40", "source": "rpm", "version": "5.4.6" } ], "luksmeta": [ { "arch": "x86_64", "epoch": null, "name": "luksmeta", "release": "22.fc40", "source": "rpm", "version": "9" } ], "lvm2": [ { "arch": "x86_64", "epoch": null, "name": "lvm2", "release": "1.fc40", "source": "rpm", "version": "2.03.23" } ], "lvm2-libs": [ { "arch": "x86_64", "epoch": null, "name": "lvm2-libs", "release": "1.fc40", "source": "rpm", "version": "2.03.23" } ], "lz4-libs": [ { "arch": "x86_64", "epoch": null, "name": "lz4-libs", "release": "6.fc40", "source": "rpm", "version": "1.9.4" } ], "lzo": [ { "arch": "x86_64", "epoch": null, "name": "lzo", "release": "12.fc40", "source": "rpm", "version": "2.10" } ], "m4": [ { "arch": "x86_64", "epoch": null, "name": "m4", "release": "9.fc40", "source": "rpm", "version": "1.4.19" } ], "make": [ { "arch": "x86_64", "epoch": 1, "name": "make", "release": "6.fc40", "source": "rpm", "version": "4.4.1" } ], "man-db": [ { "arch": "x86_64", "epoch": null, "name": "man-db", "release": "6.fc40", "source": "rpm", "version": "2.12.0" } ], "mdadm": [ { "arch": "x86_64", "epoch": null, "name": "mdadm", "release": "8.fc40", "source": "rpm", "version": "4.2" } ], "mokutil": [ { "arch": "x86_64", "epoch": 2, "name": "mokutil", "release": "1.fc40", "source": "rpm", "version": "0.7.1" } ], 
"mpdecimal": [ { "arch": "x86_64", "epoch": null, "name": "mpdecimal", "release": "9.fc40", "source": "rpm", "version": "2.5.1" } ], "mpfr": [ { "arch": "x86_64", "epoch": null, "name": "mpfr", "release": "4.fc40", "source": "rpm", "version": "4.2.1" } ], "mt7xxx-firmware": [ { "arch": "noarch", "epoch": null, "name": "mt7xxx-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "nano": [ { "arch": "x86_64", "epoch": null, "name": "nano", "release": "7.fc40", "source": "rpm", "version": "7.2" } ], "nano-default-editor": [ { "arch": "noarch", "epoch": null, "name": "nano-default-editor", "release": "7.fc40", "source": "rpm", "version": "7.2" } ], "ncurses": [ { "arch": "x86_64", "epoch": null, "name": "ncurses", "release": "12.20240127.fc40", "source": "rpm", "version": "6.4" } ], "ncurses-base": [ { "arch": "noarch", "epoch": null, "name": "ncurses-base", "release": "12.20240127.fc40", "source": "rpm", "version": "6.4" } ], "ncurses-libs": [ { "arch": "x86_64", "epoch": null, "name": "ncurses-libs", "release": "12.20240127.fc40", "source": "rpm", "version": "6.4" } ], "net-tools": [ { "arch": "x86_64", "epoch": null, "name": "net-tools", "release": "0.69.20160912git.fc40", "source": "rpm", "version": "2.0" } ], "nettle": [ { "arch": "x86_64", "epoch": null, "name": "nettle", "release": "6.fc40", "source": "rpm", "version": "3.9.1" } ], "nfs-utils": [ { "arch": "x86_64", "epoch": 1, "name": "nfs-utils", "release": "0.rc6.fc40", "source": "rpm", "version": "2.6.4" } ], "nftables": [ { "arch": "x86_64", "epoch": 1, "name": "nftables", "release": "3.fc40", "source": "rpm", "version": "1.0.9" } ], "nilfs-utils": [ { "arch": "x86_64", "epoch": null, "name": "nilfs-utils", "release": "6.fc40", "source": "rpm", "version": "2.2.9" } ], "npth": [ { "arch": "x86_64", "epoch": null, "name": "npth", "release": "1.fc40", "source": "rpm", "version": "1.7" } ], "nspr": [ { "arch": "x86_64", "epoch": null, "name": "nspr", "release": "24.fc40", "source": "rpm", "version": "4.35.0" } ], "nss": [ { "arch": "x86_64", "epoch": null, "name": "nss", "release": "1.fc40", "source": "rpm", "version": "3.101.0" } ], "nss-softokn": [ { "arch": "x86_64", "epoch": null, "name": "nss-softokn", "release": "1.fc40", "source": "rpm", "version": "3.101.0" } ], "nss-softokn-freebl": [ { "arch": "x86_64", "epoch": null, "name": "nss-softokn-freebl", "release": "1.fc40", "source": "rpm", "version": "3.101.0" } ], "nss-sysinit": [ { "arch": "x86_64", "epoch": null, "name": "nss-sysinit", "release": "1.fc40", "source": "rpm", "version": "3.101.0" } ], "nss-util": [ { "arch": "x86_64", "epoch": null, "name": "nss-util", "release": "1.fc40", "source": "rpm", "version": "3.101.0" } ], "ntfs-3g": [ { "arch": "x86_64", "epoch": 2, "name": "ntfs-3g", "release": "5.fc40", "source": "rpm", "version": "2022.10.3" } ], "ntfs-3g-libs": [ { "arch": "x86_64", "epoch": 2, "name": "ntfs-3g-libs", "release": "5.fc40", "source": "rpm", "version": "2022.10.3" } ], "ntfsprogs": [ { "arch": "x86_64", "epoch": 2, "name": "ntfsprogs", "release": "5.fc40", "source": "rpm", "version": "2022.10.3" } ], "nvidia-gpu-firmware": [ { "arch": "noarch", "epoch": null, "name": "nvidia-gpu-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "nxpwireless-firmware": [ { "arch": "noarch", "epoch": null, "name": "nxpwireless-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "oniguruma": [ { "arch": "x86_64", "epoch": null, "name": "oniguruma", "release": "3.fc40", "source": "rpm", "version": 
"6.9.9" } ], "openldap": [ { "arch": "x86_64", "epoch": null, "name": "openldap", "release": "1.fc40", "source": "rpm", "version": "2.6.7" } ], "openssh": [ { "arch": "x86_64", "epoch": null, "name": "openssh", "release": "1.fc40.4", "source": "rpm", "version": "9.6p1" } ], "openssh-clients": [ { "arch": "x86_64", "epoch": null, "name": "openssh-clients", "release": "1.fc40.4", "source": "rpm", "version": "9.6p1" } ], "openssh-server": [ { "arch": "x86_64", "epoch": null, "name": "openssh-server", "release": "1.fc40.4", "source": "rpm", "version": "9.6p1" } ], "openssl": [ { "arch": "x86_64", "epoch": 1, "name": "openssl", "release": "2.fc40", "source": "rpm", "version": "3.2.1" } ], "openssl-devel": [ { "arch": "x86_64", "epoch": 1, "name": "openssl-devel", "release": "2.fc40", "source": "rpm", "version": "3.2.1" } ], "openssl-libs": [ { "arch": "x86_64", "epoch": 1, "name": "openssl-libs", "release": "2.fc40", "source": "rpm", "version": "3.2.1" } ], "openssl-pkcs11": [ { "arch": "x86_64", "epoch": null, "name": "openssl-pkcs11", "release": "8.fc40", "source": "rpm", "version": "0.4.12" } ], "os-prober": [ { "arch": "x86_64", "epoch": null, "name": "os-prober", "release": "6.fc40", "source": "rpm", "version": "1.81" } ], "p11-kit": [ { "arch": "x86_64", "epoch": null, "name": "p11-kit", "release": "1.fc40", "source": "rpm", "version": "0.25.5" } ], "p11-kit-trust": [ { "arch": "x86_64", "epoch": null, "name": "p11-kit-trust", "release": "1.fc40", "source": "rpm", "version": "0.25.5" } ], "pam": [ { "arch": "x86_64", "epoch": null, "name": "pam", "release": "3.fc40", "source": "rpm", "version": "1.6.1" } ], "pam-libs": [ { "arch": "x86_64", "epoch": null, "name": "pam-libs", "release": "3.fc40", "source": "rpm", "version": "1.6.1" } ], "parted": [ { "arch": "x86_64", "epoch": null, "name": "parted", "release": "4.fc40", "source": "rpm", "version": "3.6" } ], "passim": [ { "arch": "x86_64", "epoch": null, "name": "passim", "release": "1.fc40", "source": "rpm", "version": "0.1.7" } ], "passim-libs": [ { "arch": "x86_64", "epoch": null, "name": "passim-libs", "release": "1.fc40", "source": "rpm", "version": "0.1.7" } ], "pcre2": [ { "arch": "x86_64", "epoch": null, "name": "pcre2", "release": "1.fc40", "source": "rpm", "version": "10.44" } ], "pcre2-syntax": [ { "arch": "noarch", "epoch": null, "name": "pcre2-syntax", "release": "1.fc40", "source": "rpm", "version": "10.44" } ], "pcsc-lite": [ { "arch": "x86_64", "epoch": null, "name": "pcsc-lite", "release": "1.fc40", "source": "rpm", "version": "2.0.3" } ], "pcsc-lite-ccid": [ { "arch": "x86_64", "epoch": null, "name": "pcsc-lite-ccid", "release": "3.fc40", "source": "rpm", "version": "1.5.5" } ], "pcsc-lite-libs": [ { "arch": "x86_64", "epoch": null, "name": "pcsc-lite-libs", "release": "1.fc40", "source": "rpm", "version": "2.0.3" } ], "perl-AutoLoader": [ { "arch": "noarch", "epoch": 0, "name": "perl-AutoLoader", "release": "506.fc40", "source": "rpm", "version": "5.74" } ], "perl-B": [ { "arch": "x86_64", "epoch": 0, "name": "perl-B", "release": "506.fc40", "source": "rpm", "version": "1.88" } ], "perl-Carp": [ { "arch": "noarch", "epoch": null, "name": "perl-Carp", "release": "502.fc40", "source": "rpm", "version": "1.54" } ], "perl-Class-Struct": [ { "arch": "noarch", "epoch": 0, "name": "perl-Class-Struct", "release": "506.fc40", "source": "rpm", "version": "0.68" } ], "perl-Data-Dumper": [ { "arch": "x86_64", "epoch": null, "name": "perl-Data-Dumper", "release": "503.fc40", "source": "rpm", "version": "2.188" } ], "perl-Digest": [ 
{ "arch": "noarch", "epoch": null, "name": "perl-Digest", "release": "502.fc40", "source": "rpm", "version": "1.20" } ], "perl-Digest-MD5": [ { "arch": "x86_64", "epoch": null, "name": "perl-Digest-MD5", "release": "3.fc40", "source": "rpm", "version": "2.59" } ], "perl-DynaLoader": [ { "arch": "x86_64", "epoch": 0, "name": "perl-DynaLoader", "release": "506.fc40", "source": "rpm", "version": "1.54" } ], "perl-Encode": [ { "arch": "x86_64", "epoch": 4, "name": "perl-Encode", "release": "505.fc40", "source": "rpm", "version": "3.21" } ], "perl-Errno": [ { "arch": "x86_64", "epoch": 0, "name": "perl-Errno", "release": "506.fc40", "source": "rpm", "version": "1.37" } ], "perl-Error": [ { "arch": "noarch", "epoch": 1, "name": "perl-Error", "release": "15.fc40", "source": "rpm", "version": "0.17029" } ], "perl-Exporter": [ { "arch": "noarch", "epoch": null, "name": "perl-Exporter", "release": "3.fc40", "source": "rpm", "version": "5.78" } ], "perl-Fcntl": [ { "arch": "x86_64", "epoch": 0, "name": "perl-Fcntl", "release": "506.fc40", "source": "rpm", "version": "1.15" } ], "perl-File-Basename": [ { "arch": "noarch", "epoch": 0, "name": "perl-File-Basename", "release": "506.fc40", "source": "rpm", "version": "2.86" } ], "perl-File-Find": [ { "arch": "noarch", "epoch": 0, "name": "perl-File-Find", "release": "506.fc40", "source": "rpm", "version": "1.43" } ], "perl-File-Path": [ { "arch": "noarch", "epoch": null, "name": "perl-File-Path", "release": "503.fc40", "source": "rpm", "version": "2.18" } ], "perl-File-Temp": [ { "arch": "noarch", "epoch": 1, "name": "perl-File-Temp", "release": "503.fc40", "source": "rpm", "version": "0.231.100" } ], "perl-File-stat": [ { "arch": "noarch", "epoch": 0, "name": "perl-File-stat", "release": "506.fc40", "source": "rpm", "version": "1.13" } ], "perl-FileHandle": [ { "arch": "noarch", "epoch": 0, "name": "perl-FileHandle", "release": "506.fc40", "source": "rpm", "version": "2.05" } ], "perl-Getopt-Long": [ { "arch": "noarch", "epoch": 1, "name": "perl-Getopt-Long", "release": "4.fc40", "source": "rpm", "version": "2.57" } ], "perl-Getopt-Std": [ { "arch": "noarch", "epoch": 0, "name": "perl-Getopt-Std", "release": "506.fc40", "source": "rpm", "version": "1.13" } ], "perl-Git": [ { "arch": "noarch", "epoch": null, "name": "perl-Git", "release": "2.fc40", "source": "rpm", "version": "2.45.2" } ], "perl-HTTP-Tiny": [ { "arch": "noarch", "epoch": null, "name": "perl-HTTP-Tiny", "release": "5.fc40", "source": "rpm", "version": "0.088" } ], "perl-IO": [ { "arch": "x86_64", "epoch": 0, "name": "perl-IO", "release": "506.fc40", "source": "rpm", "version": "1.52" } ], "perl-IO-Socket-IP": [ { "arch": "noarch", "epoch": null, "name": "perl-IO-Socket-IP", "release": "2.fc40", "source": "rpm", "version": "0.42" } ], "perl-IO-Socket-SSL": [ { "arch": "noarch", "epoch": null, "name": "perl-IO-Socket-SSL", "release": "1.fc40", "source": "rpm", "version": "2.085" } ], "perl-IPC-Open3": [ { "arch": "noarch", "epoch": 0, "name": "perl-IPC-Open3", "release": "506.fc40", "source": "rpm", "version": "1.22" } ], "perl-MIME-Base64": [ { "arch": "x86_64", "epoch": null, "name": "perl-MIME-Base64", "release": "503.fc40", "source": "rpm", "version": "3.16" } ], "perl-Mozilla-CA": [ { "arch": "noarch", "epoch": null, "name": "perl-Mozilla-CA", "release": "3.fc40", "source": "rpm", "version": "20231213" } ], "perl-NDBM_File": [ { "arch": "x86_64", "epoch": 0, "name": "perl-NDBM_File", "release": "506.fc40", "source": "rpm", "version": "1.16" } ], "perl-Net-SSLeay": [ { "arch": "x86_64", 
"epoch": null, "name": "perl-Net-SSLeay", "release": "3.fc40", "source": "rpm", "version": "1.94" } ], "perl-POSIX": [ { "arch": "x86_64", "epoch": 0, "name": "perl-POSIX", "release": "506.fc40", "source": "rpm", "version": "2.13" } ], "perl-PathTools": [ { "arch": "x86_64", "epoch": null, "name": "perl-PathTools", "release": "502.fc40", "source": "rpm", "version": "3.89" } ], "perl-Pod-Escapes": [ { "arch": "noarch", "epoch": 1, "name": "perl-Pod-Escapes", "release": "503.fc40", "source": "rpm", "version": "1.07" } ], "perl-Pod-Perldoc": [ { "arch": "noarch", "epoch": null, "name": "perl-Pod-Perldoc", "release": "503.fc40", "source": "rpm", "version": "3.28.01" } ], "perl-Pod-Simple": [ { "arch": "noarch", "epoch": 1, "name": "perl-Pod-Simple", "release": "6.fc40", "source": "rpm", "version": "3.45" } ], "perl-Pod-Usage": [ { "arch": "noarch", "epoch": 4, "name": "perl-Pod-Usage", "release": "504.fc40", "source": "rpm", "version": "2.03" } ], "perl-Scalar-List-Utils": [ { "arch": "x86_64", "epoch": 5, "name": "perl-Scalar-List-Utils", "release": "503.fc40", "source": "rpm", "version": "1.63" } ], "perl-SelectSaver": [ { "arch": "noarch", "epoch": 0, "name": "perl-SelectSaver", "release": "506.fc40", "source": "rpm", "version": "1.02" } ], "perl-Socket": [ { "arch": "x86_64", "epoch": 4, "name": "perl-Socket", "release": "1.fc40", "source": "rpm", "version": "2.038" } ], "perl-Storable": [ { "arch": "x86_64", "epoch": 1, "name": "perl-Storable", "release": "502.fc40", "source": "rpm", "version": "3.32" } ], "perl-Symbol": [ { "arch": "noarch", "epoch": 0, "name": "perl-Symbol", "release": "506.fc40", "source": "rpm", "version": "1.09" } ], "perl-Term-ANSIColor": [ { "arch": "noarch", "epoch": null, "name": "perl-Term-ANSIColor", "release": "504.fc40", "source": "rpm", "version": "5.01" } ], "perl-Term-Cap": [ { "arch": "noarch", "epoch": null, "name": "perl-Term-Cap", "release": "503.fc40", "source": "rpm", "version": "1.18" } ], "perl-TermReadKey": [ { "arch": "x86_64", "epoch": null, "name": "perl-TermReadKey", "release": "21.fc40", "source": "rpm", "version": "2.38" } ], "perl-Text-ParseWords": [ { "arch": "noarch", "epoch": null, "name": "perl-Text-ParseWords", "release": "502.fc40", "source": "rpm", "version": "3.31" } ], "perl-Text-Tabs+Wrap": [ { "arch": "noarch", "epoch": null, "name": "perl-Text-Tabs+Wrap", "release": "1.fc40", "source": "rpm", "version": "2024.001" } ], "perl-Time-Local": [ { "arch": "noarch", "epoch": 2, "name": "perl-Time-Local", "release": "5.fc40", "source": "rpm", "version": "1.350" } ], "perl-URI": [ { "arch": "noarch", "epoch": null, "name": "perl-URI", "release": "1.fc40", "source": "rpm", "version": "5.28" } ], "perl-base": [ { "arch": "noarch", "epoch": 0, "name": "perl-base", "release": "506.fc40", "source": "rpm", "version": "2.27" } ], "perl-constant": [ { "arch": "noarch", "epoch": null, "name": "perl-constant", "release": "503.fc40", "source": "rpm", "version": "1.33" } ], "perl-if": [ { "arch": "noarch", "epoch": 0, "name": "perl-if", "release": "506.fc40", "source": "rpm", "version": "0.61.000" } ], "perl-interpreter": [ { "arch": "x86_64", "epoch": 4, "name": "perl-interpreter", "release": "506.fc40", "source": "rpm", "version": "5.38.2" } ], "perl-lib": [ { "arch": "x86_64", "epoch": 0, "name": "perl-lib", "release": "506.fc40", "source": "rpm", "version": "0.65" } ], "perl-libnet": [ { "arch": "noarch", "epoch": null, "name": "perl-libnet", "release": "503.fc40", "source": "rpm", "version": "3.15" } ], "perl-libs": [ { "arch": "x86_64", 
"epoch": 4, "name": "perl-libs", "release": "506.fc40", "source": "rpm", "version": "5.38.2" } ], "perl-locale": [ { "arch": "noarch", "epoch": 0, "name": "perl-locale", "release": "506.fc40", "source": "rpm", "version": "1.10" } ], "perl-mro": [ { "arch": "x86_64", "epoch": 0, "name": "perl-mro", "release": "506.fc40", "source": "rpm", "version": "1.28" } ], "perl-overload": [ { "arch": "noarch", "epoch": 0, "name": "perl-overload", "release": "506.fc40", "source": "rpm", "version": "1.37" } ], "perl-overloading": [ { "arch": "noarch", "epoch": 0, "name": "perl-overloading", "release": "506.fc40", "source": "rpm", "version": "0.02" } ], "perl-parent": [ { "arch": "noarch", "epoch": 1, "name": "perl-parent", "release": "502.fc40", "source": "rpm", "version": "0.241" } ], "perl-podlators": [ { "arch": "noarch", "epoch": 1, "name": "perl-podlators", "release": "502.fc40", "source": "rpm", "version": "5.01" } ], "perl-vars": [ { "arch": "noarch", "epoch": 0, "name": "perl-vars", "release": "506.fc40", "source": "rpm", "version": "1.05" } ], "pkgconf": [ { "arch": "x86_64", "epoch": null, "name": "pkgconf", "release": "1.fc40", "source": "rpm", "version": "2.1.1" } ], "pkgconf-m4": [ { "arch": "noarch", "epoch": null, "name": "pkgconf-m4", "release": "1.fc40", "source": "rpm", "version": "2.1.1" } ], "pkgconf-pkg-config": [ { "arch": "x86_64", "epoch": null, "name": "pkgconf-pkg-config", "release": "1.fc40", "source": "rpm", "version": "2.1.1" } ], "plymouth": [ { "arch": "x86_64", "epoch": null, "name": "plymouth", "release": "12.fc40", "source": "rpm", "version": "24.004.60" } ], "plymouth-core-libs": [ { "arch": "x86_64", "epoch": null, "name": "plymouth-core-libs", "release": "12.fc40", "source": "rpm", "version": "24.004.60" } ], "plymouth-scripts": [ { "arch": "x86_64", "epoch": null, "name": "plymouth-scripts", "release": "12.fc40", "source": "rpm", "version": "24.004.60" } ], "policycoreutils": [ { "arch": "x86_64", "epoch": null, "name": "policycoreutils", "release": "3.fc40", "source": "rpm", "version": "3.6" } ], "polkit": [ { "arch": "x86_64", "epoch": null, "name": "polkit", "release": "2.fc40", "source": "rpm", "version": "124" } ], "polkit-libs": [ { "arch": "x86_64", "epoch": null, "name": "polkit-libs", "release": "2.fc40", "source": "rpm", "version": "124" } ], "polkit-pkla-compat": [ { "arch": "x86_64", "epoch": null, "name": "polkit-pkla-compat", "release": "28.fc40", "source": "rpm", "version": "0.1" } ], "popt": [ { "arch": "x86_64", "epoch": null, "name": "popt", "release": "6.fc40", "source": "rpm", "version": "1.19" } ], "procps-ng": [ { "arch": "x86_64", "epoch": null, "name": "procps-ng", "release": "3.fc40", "source": "rpm", "version": "4.0.4" } ], "protobuf-c": [ { "arch": "x86_64", "epoch": null, "name": "protobuf-c", "release": "3.fc40", "source": "rpm", "version": "1.5.0" } ], "psmisc": [ { "arch": "x86_64", "epoch": null, "name": "psmisc", "release": "6.fc40", "source": "rpm", "version": "23.6" } ], "publicsuffix-list-dafsa": [ { "arch": "noarch", "epoch": null, "name": "publicsuffix-list-dafsa", "release": "3.fc40", "source": "rpm", "version": "20240107" } ], "python-pip-wheel": [ { "arch": "noarch", "epoch": null, "name": "python-pip-wheel", "release": "1.fc40", "source": "rpm", "version": "23.3.2" } ], "python-unversioned-command": [ { "arch": "noarch", "epoch": null, "name": "python-unversioned-command", "release": "1.fc40", "source": "rpm", "version": "3.12.4" } ], "python3": [ { "arch": "x86_64", "epoch": null, "name": "python3", "release": "1.fc40", 
"source": "rpm", "version": "3.12.4" } ], "python3-attrs": [ { "arch": "noarch", "epoch": null, "name": "python3-attrs", "release": "4.fc40", "source": "rpm", "version": "23.2.0" } ], "python3-audit": [ { "arch": "x86_64", "epoch": null, "name": "python3-audit", "release": "1.fc40", "source": "rpm", "version": "4.0.1" } ], "python3-blivet": [ { "arch": "noarch", "epoch": 1, "name": "python3-blivet", "release": "1.fc40", "source": "rpm", "version": "3.10.0" } ], "python3-blockdev": [ { "arch": "x86_64", "epoch": null, "name": "python3-blockdev", "release": "1.fc40", "source": "rpm", "version": "3.1.1" } ], "python3-bytesize": [ { "arch": "x86_64", "epoch": null, "name": "python3-bytesize", "release": "3.fc40", "source": "rpm", "version": "2.10" } ], "python3-charset-normalizer": [ { "arch": "noarch", "epoch": null, "name": "python3-charset-normalizer", "release": "3.fc40", "source": "rpm", "version": "3.3.2" } ], "python3-configobj": [ { "arch": "noarch", "epoch": null, "name": "python3-configobj", "release": "8.fc40", "source": "rpm", "version": "5.0.8" } ], "python3-configshell": [ { "arch": "noarch", "epoch": 1, "name": "python3-configshell", "release": "6.fc40", "source": "rpm", "version": "1.1.30" } ], "python3-dateutil": [ { "arch": "noarch", "epoch": 1, "name": "python3-dateutil", "release": "13.fc40", "source": "rpm", "version": "2.8.2" } ], "python3-dbus": [ { "arch": "x86_64", "epoch": null, "name": "python3-dbus", "release": "6.fc40", "source": "rpm", "version": "1.3.2" } ], "python3-dbus-client-gen": [ { "arch": "noarch", "epoch": null, "name": "python3-dbus-client-gen", "release": "6.fc40", "source": "rpm", "version": "0.5.1" } ], "python3-dbus-python-client-gen": [ { "arch": "noarch", "epoch": null, "name": "python3-dbus-python-client-gen", "release": "5.fc40", "source": "rpm", "version": "0.8.3" } ], "python3-dbus-signature-pyparsing": [ { "arch": "noarch", "epoch": null, "name": "python3-dbus-signature-pyparsing", "release": "7.fc40", "source": "rpm", "version": "0.4.1" } ], "python3-distro": [ { "arch": "noarch", "epoch": null, "name": "python3-distro", "release": "3.fc40", "source": "rpm", "version": "1.9.0" } ], "python3-dnf": [ { "arch": "noarch", "epoch": null, "name": "python3-dnf", "release": "1.fc40", "source": "rpm", "version": "4.21.0" } ], "python3-dnf-plugins-core": [ { "arch": "noarch", "epoch": null, "name": "python3-dnf-plugins-core", "release": "1.fc40", "source": "rpm", "version": "4.8.0" } ], "python3-firewall": [ { "arch": "noarch", "epoch": null, "name": "python3-firewall", "release": "1.fc40", "source": "rpm", "version": "2.1.3" } ], "python3-gobject-base": [ { "arch": "x86_64", "epoch": null, "name": "python3-gobject-base", "release": "1.fc40", "source": "rpm", "version": "3.48.2" } ], "python3-hawkey": [ { "arch": "x86_64", "epoch": null, "name": "python3-hawkey", "release": "1.fc40", "source": "rpm", "version": "0.73.2" } ], "python3-idna": [ { "arch": "noarch", "epoch": null, "name": "python3-idna", "release": "1.fc40", "source": "rpm", "version": "3.7" } ], "python3-into-dbus-python": [ { "arch": "noarch", "epoch": null, "name": "python3-into-dbus-python", "release": "5.fc40", "source": "rpm", "version": "0.8.2" } ], "python3-jinja2": [ { "arch": "noarch", "epoch": null, "name": "python3-jinja2", "release": "1.fc40", "source": "rpm", "version": "3.1.4" } ], "python3-jsonpatch": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonpatch", "release": "4.fc40", "source": "rpm", "version": "1.33" } ], "python3-jsonpointer": [ { "arch": "noarch", 
"epoch": null, "name": "python3-jsonpointer", "release": "7.fc40", "source": "rpm", "version": "2.3" } ], "python3-jsonschema": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonschema", "release": "3.fc40", "source": "rpm", "version": "4.19.1" } ], "python3-jsonschema-specifications": [ { "arch": "noarch", "epoch": null, "name": "python3-jsonschema-specifications", "release": "3.fc40", "source": "rpm", "version": "2023.11.2" } ], "python3-justbases": [ { "arch": "noarch", "epoch": null, "name": "python3-justbases", "release": "7.fc40", "source": "rpm", "version": "0.15.2" } ], "python3-justbytes": [ { "arch": "noarch", "epoch": null, "name": "python3-justbytes", "release": "5.fc40", "source": "rpm", "version": "0.15.2" } ], "python3-kmod": [ { "arch": "x86_64", "epoch": null, "name": "python3-kmod", "release": "4.fc40", "source": "rpm", "version": "0.9.2" } ], "python3-libcomps": [ { "arch": "x86_64", "epoch": null, "name": "python3-libcomps", "release": "5.fc40", "source": "rpm", "version": "0.1.20" } ], "python3-libdnf": [ { "arch": "x86_64", "epoch": null, "name": "python3-libdnf", "release": "1.fc40", "source": "rpm", "version": "0.73.2" } ], "python3-libmount": [ { "arch": "x86_64", "epoch": null, "name": "python3-libmount", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "python3-libs": [ { "arch": "x86_64", "epoch": null, "name": "python3-libs", "release": "1.fc40", "source": "rpm", "version": "3.12.4" } ], "python3-libselinux": [ { "arch": "x86_64", "epoch": null, "name": "python3-libselinux", "release": "4.fc40", "source": "rpm", "version": "3.6" } ], "python3-libsemanage": [ { "arch": "x86_64", "epoch": null, "name": "python3-libsemanage", "release": "3.fc40", "source": "rpm", "version": "3.6" } ], "python3-lxml": [ { "arch": "x86_64", "epoch": null, "name": "python3-lxml", "release": "7.fc40", "source": "rpm", "version": "5.1.0" } ], "python3-markupsafe": [ { "arch": "x86_64", "epoch": null, "name": "python3-markupsafe", "release": "4.fc40", "source": "rpm", "version": "2.1.3" } ], "python3-netifaces": [ { "arch": "x86_64", "epoch": null, "name": "python3-netifaces", "release": "9.fc40", "source": "rpm", "version": "0.11.0" } ], "python3-nftables": [ { "arch": "x86_64", "epoch": 1, "name": "python3-nftables", "release": "3.fc40", "source": "rpm", "version": "1.0.9" } ], "python3-oauthlib": [ { "arch": "noarch", "epoch": null, "name": "python3-oauthlib", "release": "3.fc40", "source": "rpm", "version": "3.2.2" } ], "python3-packaging": [ { "arch": "noarch", "epoch": null, "name": "python3-packaging", "release": "4.fc40", "source": "rpm", "version": "23.2" } ], "python3-policycoreutils": [ { "arch": "noarch", "epoch": null, "name": "python3-policycoreutils", "release": "3.fc40", "source": "rpm", "version": "3.6" } ], "python3-psutil": [ { "arch": "x86_64", "epoch": null, "name": "python3-psutil", "release": "1.fc40", "source": "rpm", "version": "5.9.8" } ], "python3-pyparsing": [ { "arch": "noarch", "epoch": null, "name": "python3-pyparsing", "release": "2.fc40", "source": "rpm", "version": "3.1.2" } ], "python3-pyparted": [ { "arch": "x86_64", "epoch": 1, "name": "python3-pyparted", "release": "5.fc40", "source": "rpm", "version": "3.13.0" } ], "python3-pyserial": [ { "arch": "noarch", "epoch": null, "name": "python3-pyserial", "release": "8.fc40", "source": "rpm", "version": "3.5" } ], "python3-pysocks": [ { "arch": "noarch", "epoch": null, "name": "python3-pysocks", "release": "22.fc40", "source": "rpm", "version": "1.7.1" } ], "python3-pyudev": [ { 
"arch": "noarch", "epoch": null, "name": "python3-pyudev", "release": "7.fc40", "source": "rpm", "version": "0.24.1" } ], "python3-pyyaml": [ { "arch": "x86_64", "epoch": null, "name": "python3-pyyaml", "release": "14.fc40", "source": "rpm", "version": "6.0.1" } ], "python3-referencing": [ { "arch": "noarch", "epoch": null, "name": "python3-referencing", "release": "3.fc40", "source": "rpm", "version": "0.31.1" } ], "python3-requests": [ { "arch": "noarch", "epoch": null, "name": "python3-requests", "release": "3.fc40", "source": "rpm", "version": "2.31.0" } ], "python3-rpds-py": [ { "arch": "x86_64", "epoch": null, "name": "python3-rpds-py", "release": "1.fc40", "source": "rpm", "version": "0.18.1" } ], "python3-rpm": [ { "arch": "x86_64", "epoch": null, "name": "python3-rpm", "release": "1.fc40", "source": "rpm", "version": "4.19.1.1" } ], "python3-rtslib": [ { "arch": "noarch", "epoch": null, "name": "python3-rtslib", "release": "7.fc40", "source": "rpm", "version": "2.1.76" } ], "python3-setools": [ { "arch": "x86_64", "epoch": null, "name": "python3-setools", "release": "2.fc40", "source": "rpm", "version": "4.5.1" } ], "python3-setuptools": [ { "arch": "noarch", "epoch": null, "name": "python3-setuptools", "release": "3.fc40", "source": "rpm", "version": "69.0.3" } ], "python3-six": [ { "arch": "noarch", "epoch": null, "name": "python3-six", "release": "14.fc40", "source": "rpm", "version": "1.16.0" } ], "python3-systemd": [ { "arch": "x86_64", "epoch": null, "name": "python3-systemd", "release": "9.fc40", "source": "rpm", "version": "235" } ], "python3-typing-extensions": [ { "arch": "noarch", "epoch": null, "name": "python3-typing-extensions", "release": "2.fc40", "source": "rpm", "version": "4.12.2" } ], "python3-unbound": [ { "arch": "x86_64", "epoch": null, "name": "python3-unbound", "release": "1.fc40", "source": "rpm", "version": "1.20.0" } ], "python3-urllib3": [ { "arch": "noarch", "epoch": null, "name": "python3-urllib3", "release": "1.fc40", "source": "rpm", "version": "1.26.19" } ], "python3-urllib3+socks": [ { "arch": "noarch", "epoch": null, "name": "python3-urllib3+socks", "release": "1.fc40", "source": "rpm", "version": "1.26.19" } ], "python3-urwid": [ { "arch": "x86_64", "epoch": null, "name": "python3-urwid", "release": "2.fc40", "source": "rpm", "version": "2.5.3" } ], "python3-wcwidth": [ { "arch": "noarch", "epoch": null, "name": "python3-wcwidth", "release": "2.fc40", "source": "rpm", "version": "0.2.13" } ], "qa-tools": [ { "arch": "noarch", "epoch": null, "name": "qa-tools", "release": "4.fc40", "source": "rpm", "version": "4.1" } ], "quota": [ { "arch": "x86_64", "epoch": 1, "name": "quota", "release": "5.fc40", "source": "rpm", "version": "4.09" } ], "quota-nls": [ { "arch": "noarch", "epoch": 1, "name": "quota-nls", "release": "5.fc40", "source": "rpm", "version": "4.09" } ], "readline": [ { "arch": "x86_64", "epoch": null, "name": "readline", "release": "8.fc40", "source": "rpm", "version": "8.2" } ], "realtek-firmware": [ { "arch": "noarch", "epoch": null, "name": "realtek-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "restraint": [ { "arch": "x86_64", "epoch": null, "name": "restraint", "release": "1.fc40eng", "source": "rpm", "version": "0.4.4" } ], "restraint-rhts": [ { "arch": "x86_64", "epoch": null, "name": "restraint-rhts", "release": "1.fc40eng", "source": "rpm", "version": "0.4.4" } ], "rng-tools": [ { "arch": "x86_64", "epoch": null, "name": "rng-tools", "release": "2.fc40", "source": "rpm", "version": "6.17" } ], 
"rootfiles": [ { "arch": "noarch", "epoch": null, "name": "rootfiles", "release": "36.fc40", "source": "rpm", "version": "8.1" } ], "rpcbind": [ { "arch": "x86_64", "epoch": null, "name": "rpcbind", "release": "4.rc3.fc40", "source": "rpm", "version": "1.2.6" } ], "rpm": [ { "arch": "x86_64", "epoch": null, "name": "rpm", "release": "1.fc40", "source": "rpm", "version": "4.19.1.1" } ], "rpm-build-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-build-libs", "release": "1.fc40", "source": "rpm", "version": "4.19.1.1" } ], "rpm-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-libs", "release": "1.fc40", "source": "rpm", "version": "4.19.1.1" } ], "rpm-plugin-selinux": [ { "arch": "x86_64", "epoch": null, "name": "rpm-plugin-selinux", "release": "1.fc40", "source": "rpm", "version": "4.19.1.1" } ], "rpm-sequoia": [ { "arch": "x86_64", "epoch": null, "name": "rpm-sequoia", "release": "1.fc40", "source": "rpm", "version": "1.7.0" } ], "rpm-sign-libs": [ { "arch": "x86_64", "epoch": null, "name": "rpm-sign-libs", "release": "1.fc40", "source": "rpm", "version": "4.19.1.1" } ], "rsync": [ { "arch": "x86_64", "epoch": null, "name": "rsync", "release": "1.fc40", "source": "rpm", "version": "3.3.0" } ], "rtl-sdr": [ { "arch": "x86_64", "epoch": null, "name": "rtl-sdr", "release": "3.fc40", "source": "rpm", "version": "0.6.0^20230921git1261fbb2" } ], "sed": [ { "arch": "x86_64", "epoch": null, "name": "sed", "release": "1.fc40", "source": "rpm", "version": "4.9" } ], "selinux-policy": [ { "arch": "noarch", "epoch": null, "name": "selinux-policy", "release": "1.fc40", "source": "rpm", "version": "40.26" } ], "selinux-policy-targeted": [ { "arch": "noarch", "epoch": null, "name": "selinux-policy-targeted", "release": "1.fc40", "source": "rpm", "version": "40.26" } ], "setup": [ { "arch": "noarch", "epoch": null, "name": "setup", "release": "2.fc40", "source": "rpm", "version": "2.14.5" } ], "shadow-utils": [ { "arch": "x86_64", "epoch": 2, "name": "shadow-utils", "release": "3.fc40", "source": "rpm", "version": "4.15.1" } ], "shared-mime-info": [ { "arch": "x86_64", "epoch": null, "name": "shared-mime-info", "release": "5.fc40", "source": "rpm", "version": "2.3" } ], "sqlite-libs": [ { "arch": "x86_64", "epoch": null, "name": "sqlite-libs", "release": "2.fc40", "source": "rpm", "version": "3.45.1" } ], "sssd-client": [ { "arch": "x86_64", "epoch": null, "name": "sssd-client", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "sssd-common": [ { "arch": "x86_64", "epoch": null, "name": "sssd-common", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "sssd-kcm": [ { "arch": "x86_64", "epoch": null, "name": "sssd-kcm", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "sssd-nfs-idmap": [ { "arch": "x86_64", "epoch": null, "name": "sssd-nfs-idmap", "release": "1.fc40", "source": "rpm", "version": "2.9.5" } ], "strace": [ { "arch": "x86_64", "epoch": null, "name": "strace", "release": "1.fc40", "source": "rpm", "version": "6.10" } ], "stratis-cli": [ { "arch": "noarch", "epoch": null, "name": "stratis-cli", "release": "1.fc40", "source": "rpm", "version": "3.6.2" } ], "stratisd": [ { "arch": "x86_64", "epoch": null, "name": "stratisd", "release": "1.fc40", "source": "rpm", "version": "3.6.8" } ], "sudo": [ { "arch": "x86_64", "epoch": null, "name": "sudo", "release": "2.p5.fc40", "source": "rpm", "version": "1.9.15" } ], "systemd": [ { "arch": "x86_64", "epoch": null, "name": "systemd", "release": "1.fc40", "source": "rpm", "version": "255.10" } ], 
"systemd-libs": [ { "arch": "x86_64", "epoch": null, "name": "systemd-libs", "release": "1.fc40", "source": "rpm", "version": "255.10" } ], "systemd-networkd": [ { "arch": "x86_64", "epoch": null, "name": "systemd-networkd", "release": "1.fc40", "source": "rpm", "version": "255.10" } ], "systemd-pam": [ { "arch": "x86_64", "epoch": null, "name": "systemd-pam", "release": "1.fc40", "source": "rpm", "version": "255.10" } ], "systemd-resolved": [ { "arch": "x86_64", "epoch": null, "name": "systemd-resolved", "release": "1.fc40", "source": "rpm", "version": "255.10" } ], "systemd-udev": [ { "arch": "x86_64", "epoch": null, "name": "systemd-udev", "release": "1.fc40", "source": "rpm", "version": "255.10" } ], "systemtap": [ { "arch": "x86_64", "epoch": null, "name": "systemtap", "release": "1.fc40", "source": "rpm", "version": "5.1" } ], "systemtap-client": [ { "arch": "x86_64", "epoch": null, "name": "systemtap-client", "release": "1.fc40", "source": "rpm", "version": "5.1" } ], "systemtap-devel": [ { "arch": "x86_64", "epoch": null, "name": "systemtap-devel", "release": "1.fc40", "source": "rpm", "version": "5.1" } ], "systemtap-runtime": [ { "arch": "x86_64", "epoch": null, "name": "systemtap-runtime", "release": "1.fc40", "source": "rpm", "version": "5.1" } ], "tar": [ { "arch": "x86_64", "epoch": 2, "name": "tar", "release": "3.fc40", "source": "rpm", "version": "1.35" } ], "target-restore": [ { "arch": "noarch", "epoch": null, "name": "target-restore", "release": "7.fc40", "source": "rpm", "version": "2.1.76" } ], "targetcli": [ { "arch": "noarch", "epoch": null, "name": "targetcli", "release": "1.fc40", "source": "rpm", "version": "2.1.58" } ], "tbb": [ { "arch": "x86_64", "epoch": null, "name": "tbb", "release": "5.fc40", "source": "rpm", "version": "2021.11.0" } ], "time": [ { "arch": "x86_64", "epoch": null, "name": "time", "release": "23.fc40", "source": "rpm", "version": "1.9" } ], "tiwilink-firmware": [ { "arch": "noarch", "epoch": null, "name": "tiwilink-firmware", "release": "1.fc40", "source": "rpm", "version": "20240709" } ], "tpm2-tools": [ { "arch": "x86_64", "epoch": null, "name": "tpm2-tools", "release": "1.fc40", "source": "rpm", "version": "5.7" } ], "tpm2-tss": [ { "arch": "x86_64", "epoch": null, "name": "tpm2-tss", "release": "1.fc40", "source": "rpm", "version": "4.1.3" } ], "tpm2-tss-fapi": [ { "arch": "x86_64", "epoch": null, "name": "tpm2-tss-fapi", "release": "1.fc40", "source": "rpm", "version": "4.1.3" } ], "tzdata": [ { "arch": "noarch", "epoch": null, "name": "tzdata", "release": "5.fc40", "source": "rpm", "version": "2024a" } ], "udisks2": [ { "arch": "x86_64", "epoch": null, "name": "udisks2", "release": "5.fc40", "source": "rpm", "version": "2.10.1" } ], "unbound-anchor": [ { "arch": "x86_64", "epoch": null, "name": "unbound-anchor", "release": "1.fc40", "source": "rpm", "version": "1.20.0" } ], "unbound-libs": [ { "arch": "x86_64", "epoch": null, "name": "unbound-libs", "release": "1.fc40", "source": "rpm", "version": "1.20.0" } ], "unzip": [ { "arch": "x86_64", "epoch": null, "name": "unzip", "release": "63.fc40", "source": "rpm", "version": "6.0" } ], "userspace-rcu": [ { "arch": "x86_64", "epoch": null, "name": "userspace-rcu", "release": "4.fc40", "source": "rpm", "version": "0.14.0" } ], "util-linux": [ { "arch": "x86_64", "epoch": null, "name": "util-linux", "release": "1.fc40", "source": "rpm", "version": "2.40.1" } ], "util-linux-core": [ { "arch": "x86_64", "epoch": null, "name": "util-linux-core", "release": "1.fc40", "source": "rpm", "version": 
"2.40.1" } ], "vim-common": [ { "arch": "x86_64", "epoch": 2, "name": "vim-common", "release": "1.fc40", "source": "rpm", "version": "9.1.571" } ], "vim-data": [ { "arch": "noarch", "epoch": 2, "name": "vim-data", "release": "1.fc40", "source": "rpm", "version": "9.1.571" } ], "vim-enhanced": [ { "arch": "x86_64", "epoch": 2, "name": "vim-enhanced", "release": "1.fc40", "source": "rpm", "version": "9.1.571" } ], "vim-filesystem": [ { "arch": "noarch", "epoch": 2, "name": "vim-filesystem", "release": "1.fc40", "source": "rpm", "version": "9.1.571" } ], "vim-minimal": [ { "arch": "x86_64", "epoch": 2, "name": "vim-minimal", "release": "1.fc40", "source": "rpm", "version": "9.1.571" } ], "volume_key-libs": [ { "arch": "x86_64", "epoch": null, "name": "volume_key-libs", "release": "21.fc40", "source": "rpm", "version": "0.3.12" } ], "wget2": [ { "arch": "x86_64", "epoch": null, "name": "wget2", "release": "11.fc40", "source": "rpm", "version": "2.1.0" } ], "wget2-libs": [ { "arch": "x86_64", "epoch": null, "name": "wget2-libs", "release": "11.fc40", "source": "rpm", "version": "2.1.0" } ], "wget2-wget": [ { "arch": "x86_64", "epoch": null, "name": "wget2-wget", "release": "11.fc40", "source": "rpm", "version": "2.1.0" } ], "which": [ { "arch": "x86_64", "epoch": null, "name": "which", "release": "41.fc40", "source": "rpm", "version": "2.21" } ], "xfsprogs": [ { "arch": "x86_64", "epoch": null, "name": "xfsprogs", "release": "3.fc40", "source": "rpm", "version": "6.5.0" } ], "xkeyboard-config": [ { "arch": "noarch", "epoch": null, "name": "xkeyboard-config", "release": "1.fc40", "source": "rpm", "version": "2.41" } ], "xxd": [ { "arch": "x86_64", "epoch": 2, "name": "xxd", "release": "1.fc40", "source": "rpm", "version": "9.1.571" } ], "xxhash-libs": [ { "arch": "x86_64", "epoch": null, "name": "xxhash-libs", "release": "2.fc40", "source": "rpm", "version": "0.8.2" } ], "xz": [ { "arch": "x86_64", "epoch": 1, "name": "xz", "release": "3.fc40", "source": "rpm", "version": "5.4.6" } ], "xz-devel": [ { "arch": "x86_64", "epoch": 1, "name": "xz-devel", "release": "3.fc40", "source": "rpm", "version": "5.4.6" } ], "xz-libs": [ { "arch": "x86_64", "epoch": 1, "name": "xz-libs", "release": "3.fc40", "source": "rpm", "version": "5.4.6" } ], "yum": [ { "arch": "noarch", "epoch": null, "name": "yum", "release": "1.fc40", "source": "rpm", "version": "4.21.0" } ], "zchunk-libs": [ { "arch": "x86_64", "epoch": null, "name": "zchunk-libs", "release": "1.fc40", "source": "rpm", "version": "1.5.1" } ], "zip": [ { "arch": "x86_64", "epoch": null, "name": "zip", "release": "40.fc40", "source": "rpm", "version": "3.0" } ], "zlib-ng-compat": [ { "arch": "x86_64", "epoch": null, "name": "zlib-ng-compat", "release": "1.fc40", "source": "rpm", "version": "2.1.7" } ], "zlib-ng-compat-devel": [ { "arch": "x86_64", "epoch": null, "name": "zlib-ng-compat-devel", "release": "1.fc40", "source": "rpm", "version": "2.1.7" } ], "zram-generator": [ { "arch": "x86_64", "epoch": null, "name": "zram-generator", "release": "11.fc40", "source": "rpm", "version": "1.1.2" } ], "zram-generator-defaults": [ { "arch": "noarch", "epoch": null, "name": "zram-generator-defaults", "release": "11.fc40", "source": "rpm", "version": "1.1.2" } ] } }, "changed": false } TASK [Set blivet package name] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:28 Saturday 17 August 2024 19:30:17 -0400 (0:00:01.339) 0:00:17.968 ******* ok: 
[managed_node2] => { "ansible_facts": { "blivet_pkg_name": [ "python3-blivet" ] }, "changed": false } TASK [Set blivet package version] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:32 Saturday 17 August 2024 19:30:17 -0400 (0:00:00.079) 0:00:18.048 ******* ok: [managed_node2] => { "ansible_facts": { "blivet_pkg_version": "3.10.0-1.fc40" }, "changed": false } TASK [Set distribution version] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:36 Saturday 17 August 2024 19:30:17 -0400 (0:00:00.111) 0:00:18.160 ******* ok: [managed_node2] => { "ansible_facts": { "is_fedora": true, "is_rhel10": false, "is_rhel7": false, "is_rhel9": false }, "changed": false } TASK [Get unused disks] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:47 Saturday 17 August 2024 19:30:17 -0400 (0:00:00.113) 0:00:18.273 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed_node2 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Saturday 17 August 2024 19:30:17 -0400 (0:00:00.191) 0:00:18.465 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: util-linux-core TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Saturday 17 August 2024 19:30:19 -0400 (0:00:01.419) 0:00:19.885 ******* ok: [managed_node2] => { "changed": false, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"1048576\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"1048576\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda2\" TYPE=\"part\" SIZE=\"268433341952\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda2\" TYPE=\"part\" SIZE=\"268433341952\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/zram0\" TYPE=\"disk\" SIZE=\"3897556992\" FSTYPE=\"\" LOG-SEC=\"4096\"", "filename [xvda2] is a partition", "filename [xvda1] is a partition", "Disk 
[/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions", "Disk [/dev/zram0] attrs [{'type': 'disk', 'size': '3897556992', 'fstype': '', 'ssize': '4096'}] size is less than requested" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Saturday 17 August 2024 19:30:19 -0400 (0:00:00.723) 0:00:20.608 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Saturday 17 August 2024 19:30:20 -0400 (0:00:00.105) 0:00:20.713 ******* ok: [managed_node2] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Saturday 17 August 2024 19:30:20 -0400 (0:00:00.091) 0:00:20.805 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Saturday 17 August 2024 19:30:20 -0400 (0:00:00.134) 0:00:20.940 ******* ok: [managed_node2] => { "unused_disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ] } TASK [Start stratisd service] ************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:55 Saturday 17 August 2024 19:30:20 -0400 (0:00:00.095) 0:00:21.035 ******* changed: [managed_node2] => { "changed": true, "name": "stratisd", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system.slice dbus.socket local-fs.target systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.storage.stratis3", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease 
cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "Stratis daemon", "DevicePolicy": "auto", "Documentation": "\"man:stratisd(8)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "Environment": "RUST_BACKTRACE=1", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/libexec/stratisd ; argv[]=/usr/libexec/stratisd --log-level debug ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/libexec/stratisd ; argv[]=/usr/libexec/stratisd --log-level debug ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/stratisd.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "stratisd.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "process", "KillSignal": "2", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14724", "LimitNPROCSoft": "14724", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14724", "LimitSIGPENDINGSoft": "14724", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3380731904", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "stratisd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice", "Restart": "on-abort", "RestartKillSignal": "2", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestamp": "Sat 2024-08-17 18:21:41 EDT", "StateChangeTimestampMonotonic": "265144585", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", 
"TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Create one Stratis pool with one volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:60 Saturday 17 August 2024 19:30:21 -0400 (0:00:01.099) 0:00:22.135 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:30:21 -0400 (0:00:00.194) 0:00:22.330 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:30:21 -0400 (0:00:00.196) 0:00:22.526 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.214) 0:00:22.740 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.209) 0:00:22.950 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.088) 0:00:23.039 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.091) 0:00:23.130 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.086) 0:00:23.217 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.089) 0:00:23.306 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.185) 0:00:23.492 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.087) 0:00:23.579 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "name": "foo", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:30:22 -0400 (0:00:00.098) 0:00:23.677 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:30:23 -0400 (0:00:00.140) 0:00:23.818 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:30:23 -0400 (0:00:00.151) 0:00:23.971 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:30:23 -0400 (0:00:00.162) 0:00:24.134 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:30:23 -0400 (0:00:00.189) 0:00:24.323 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:30:23 -0400 (0:00:00.122) 0:00:24.446 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:30:23 -0400 (0:00:00.179) 0:00:24.626 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:30:24 -0400 (0:00:00.088) 0:00:24.714 ******* changed: [managed_node2] => { "actions": [ { "action": "create format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdh", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdc", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sda", 
"fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "create device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "create format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" } ], "packages": [ "stratisd", "stratis-cli", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:30:35 -0400 (0:00:11.362) 0:00:36.076 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:30:35 -0400 (0:00:00.072) 0:00:36.149 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937269.8527687, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937269.8517687, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, 
"isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937269.8517687, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:30:35 -0400 (0:00:00.519) 0:00:36.669 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:30:36 -0400 (0:00:00.792) 0:00:37.462 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:30:36 -0400 (0:00:00.049) 0:00:37.511 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdh", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdc", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "create device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "create format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" } ], "packages": [ "stratisd", "stratis-cli", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": 
"/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:30:36 -0400 (0:00:00.057) 0:00:37.569 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:30:36 -0400 (0:00:00.051) 0:00:37.620 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:30:36 -0400 (0:00:00.048) 0:00:37.668 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:30:37 -0400 (0:00:00.071) 0:00:37.739 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:30:37 -0400 (0:00:00.953) 0:00:38.693 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:30:38 -0400 (0:00:00.895) 0:00:39.588 ******* skipping: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:30:39 -0400 (0:00:00.175) 0:00:39.763 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:30:39 -0400 (0:00:00.846) 0:00:40.610 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:30:40 -0400 (0:00:00.448) 0:00:41.059 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:30:40 -0400 (0:00:00.080) 0:00:41.140 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:73 Saturday 17 August 2024 19:30:42 -0400 (0:00:02.504) 0:00:43.644 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:30:43 -0400 (0:00:00.091) 0:00:43.735 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:30:43 -0400 (0:00:00.093) 0:00:43.829 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:30:43 -0400 (0:00:00.116) 0:00:43.946 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thindata", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thinmeta", "size": "799M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-physical-originsub", "size": "52.1G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-thinpool-pool", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/sda": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, "/dev/sdb": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, "/dev/sdc": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, "/dev/sdd": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, "/dev/sde": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, "/dev/sdf": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, "/dev/sdg": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, "/dev/sdh": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": 
"disk", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, "/dev/sdi": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" }, "/dev/stratis/foo/test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/stratis/foo/test1", "size": "4G", "type": "stratis", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:30:43 -0400 (0:00:00.716) 0:00:44.662 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003191", "end": "2024-08-17 19:30:44.551073", "rc": 0, "start": "2024-08-17 19:30:44.547882" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=16326300-d90c-4644-96f1-30fbfb3c417f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:30:44 -0400 (0:00:00.685) 0:00:45.348 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003225", "end": "2024-08-17 19:30:44.991391", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:30:44.988166" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.437) 0:00:45.785 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 
'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.165) 0:00:45.951 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.046) 0:00:45.997 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.044) 0:00:46.042 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.050) 0:00:46.092 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.170) 0:00:46.263 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.082) 0:00:46.346 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.082) 0:00:46.428 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.087) 
0:00:46.516 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.087) 0:00:46.603 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:30:45 -0400 (0:00:00.060) 0:00:46.664 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:30:46 -0400 (0:00:00.052) 0:00:46.717 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:30:46 -0400 (0:00:00.087) 0:00:46.805 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:30:46 -0400 (0:00:00.044) 0:00:46.849 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:30:46 -0400 (0:00:00.043) 0:00:46.893 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:30:46 -0400 (0:00:00.470) 0:00:47.363 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:30:46 -0400 (0:00:00.115) 0:00:47.479 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.275) 0:00:47.755 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.092) 0:00:47.848 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.099) 0:00:47.947 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.089) 0:00:48.037 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.079) 0:00:48.116 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.081) 
0:00:48.198 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.094) 0:00:48.292 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.162) 0:00:48.454 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.098) 0:00:48.553 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:30:47 -0400 (0:00:00.093) 0:00:48.647 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:30:48 -0400 (0:00:00.098) 0:00:48.746 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:30:48 -0400 (0:00:00.162) 0:00:48.908 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:30:48 -0400 (0:00:00.288) 0:00:49.199 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': 
False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:30:48 -0400 (0:00:00.149) 0:00:49.349 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:30:48 -0400 (0:00:00.204) 0:00:49.554 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:30:48 -0400 (0:00:00.098) 0:00:49.652 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.216) 0:00:49.868 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.265) 0:00:50.134 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.088) 0:00:50.223 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Clear test variables] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.079) 0:00:50.302 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.086) 0:00:50.389 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.188) 0:00:50.577 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } 
skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:30:49 -0400 (0:00:00.097) 0:00:50.674 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:30:50 -0400 (0:00:00.204) 0:00:50.879 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.471232", "end": "2024-08-17 19:30:51.090643", "rc": 0, "start": "2024-08-17 19:30:50.619411" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdc", "size": "20971520 sectors", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdd", "size": "2147483648 sectors", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sde", "size": "2147483648 sectors", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdf", "size": "20971520 sectors", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdg", "size": "2147483648 sectors", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdh", "size": "20971520 sectors", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdi", "size": "20971520 sectors", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" } ], "fs_limit": 100, "name": "foo", "uuid": "720adb20-fdab-4201-b59d-d23c87c33523" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:30:51 -0400 (0:00:01.014) 0:00:51.894 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, 
"partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdc", "size": "20971520 sectors", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdd", "size": "2147483648 sectors", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sde", "size": "2147483648 sectors", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdf", "size": "20971520 sectors", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdg", "size": "2147483648 sectors", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdh", "size": "20971520 sectors", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdi", "size": "20971520 sectors", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" } ], "fs_limit": 100, "name": "foo", "uuid": "720adb20-fdab-4201-b59d-d23c87c33523" } ], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:30:51 -0400 (0:00:00.097) 0:00:51.991 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:30:51 -0400 (0:00:00.117) 0:00:52.109 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:30:51 -0400 (0:00:00.091) 0:00:52.201 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:30:51 -0400 (0:00:00.111) 0:00:52.313 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:30:51 -0400 (0:00:00.086) 0:00:52.399 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:30:51 -0400 (0:00:00.092) 0:00:52.492 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.212) 0:00:52.704 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.131) 0:00:52.835 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.221) 0:00:53.057 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.056) 0:00:53.113 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.089) 0:00:53.203 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.062) 0:00:53.265 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.051) 0:00:53.317 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.043) 0:00:53.360 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory 
permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.081) 0:00:53.442 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.045) 0:00:53.488 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.047) 0:00:53.535 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.045) 0:00:53.581 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:30:52 -0400 (0:00:00.061) 0:00:53.642 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.084) 0:00:53.727 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=16326300-d90c-4644-96f1-30fbfb3c417f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.169) 0:00:53.896 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.094) 0:00:53.991 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.092) 0:00:54.084 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.066) 0:00:54.150 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.055) 0:00:54.206 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.107) 0:00:54.314 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.147) 0:00:54.462 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:30:53 -0400 (0:00:00.174) 0:00:54.636 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937435.227908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1723937435.227908, "dev": 6, "device_type": 64773, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4868, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1723937435.227908, "nlink": 1, "path": "/dev/stratis/foo/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.507) 0:00:55.144 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.096) 0:00:55.241 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.060) 0:00:55.301 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.097) 0:00:55.398 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.058) 0:00:55.457 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.044) 0:00:55.502 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.049) 0:00:55.551 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:30:54 -0400 (0:00:00.044) 0:00:55.596 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:30:56 -0400 (0:00:01.612) 0:00:57.209 ******* skipping: [managed_node2] => 
{ "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:30:56 -0400 (0:00:00.098) 0:00:57.307 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:30:56 -0400 (0:00:00.090) 0:00:57.398 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:30:56 -0400 (0:00:00.169) 0:00:57.567 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.169) 0:00:57.736 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.095) 0:00:57.832 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.131) 0:00:57.963 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.106) 0:00:58.069 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.109) 0:00:58.178 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], 
"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.141) 0:00:58.319 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.136) 0:00:58.455 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:30:57 -0400 (0:00:00.163) 0:00:58.619 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.231) 0:00:58.851 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.150) 0:00:59.002 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.136) 0:00:59.138 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.105) 0:00:59.244 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.095) 0:00:59.339 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 
'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.078) 0:00:59.418 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:30:58 -0400 (0:00:00.177) 0:00:59.595 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:30:59 -0400 (0:00:00.107) 0:00:59.702 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:30:59 -0400 (0:00:00.172) 0:00:59.874 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:30:59 -0400 (0:00:00.118) 0:00:59.993 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:30:59 -0400 (0:00:00.155) 0:01:00.148 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:30:59 -0400 (0:00:00.121) 0:01:00.270 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:30:59 -0400 (0:00:00.092) 0:01:00.362 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:31:00 -0400 (0:00:00.927) 0:01:01.290 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:31:00 -0400 (0:00:00.140) 0:01:01.430 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:31:00 -0400 (0:00:00.116) 0:01:01.547 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:31:00 -0400 (0:00:00.087) 0:01:01.634 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.113) 0:01:01.748 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.118) 0:01:01.867 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.221) 0:01:02.089 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.127) 0:01:02.216 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.125) 0:01:02.341 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was 
False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.080) 0:01:02.422 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.081) 0:01:02.503 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.082) 0:01:02.585 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:31:01 -0400 (0:00:00.083) 0:01:02.668 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.081) 0:01:02.750 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.082) 0:01:02.832 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.080) 0:01:02.913 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.074) 0:01:02.988 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.081) 0:01:03.070 ******* skipping: [managed_node2] 
=> { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.060) 0:01:03.130 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.089) 0:01:03.220 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.044) 0:01:03.265 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.043) 0:01:03.308 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.045) 0:01:03.354 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.057) 0:01:03.411 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.111) 0:01:03.523 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:31:02 -0400 (0:00:00.104) 0:01:03.627 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.157) 0:01:03.785 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.103) 0:01:03.888 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.051) 0:01:03.940 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.056) 0:01:03.997 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.055) 0:01:04.052 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.058) 0:01:04.110 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.095) 0:01:04.206 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.105) 0:01:04.311 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes 
with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.061) 0:01:04.373 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.042) 0:01:04.416 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:76 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.047) 0:01:04.463 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.101) 0:01:04.565 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:31:03 -0400 (0:00:00.104) 0:01:04.670 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.134) 0:01:04.805 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": 
"Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.171) 0:01:04.977 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.059) 0:01:05.036 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.067) 0:01:05.104 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.053) 0:01:05.158 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.120) 0:01:05.278 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.182) 0:01:05.461 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.090) 0:01:05.551 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "name": "foo", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:31:04 -0400 (0:00:00.098) 0:01:05.650 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.091) 0:01:05.741 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.091) 0:01:05.833 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.141) 0:01:05.974 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.115) 0:01:06.090 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.117) 0:01:06.208 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.216) 0:01:06.424 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:31:05 -0400 (0:00:00.106) 0:01:06.531 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/stratis/foo/test1", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" } ], "packages": [ "stratisd", "stratis-cli", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:31:10 -0400 (0:00:04.547) 0:01:11.078 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:31:10 -0400 (0:00:00.154) 0:01:11.233 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937438.7719324, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "817b04bbaec0586cc701642d5c401dd7f77eee19", "ctime": 1723937438.7709322, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937438.7709322, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:31:10 -0400 (0:00:00.445) 0:01:11.678 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output 
is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:31:11 -0400 (0:00:00.050) 0:01:11.729 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:31:11 -0400 (0:00:00.054) 0:01:11.784 ******* ok: [managed_node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/stratis/foo/test1", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" } ], "packages": [ "stratisd", "stratis-cli", "e2fsprogs", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:31:11 -0400 (0:00:00.096) 0:01:11.880 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:31:11 -0400 (0:00:00.092) 0:01:11.973 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:31:11 -0400 (0:00:00.063) 0:01:12.037 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:31:11 -0400 (0:00:00.086) 0:01:12.123 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:31:12 -0400 (0:00:00.899) 0:01:13.022 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": 
"/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:31:12 -0400 (0:00:00.581) 0:01:13.604 ******* skipping: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:31:13 -0400 (0:00:00.132) 0:01:13.736 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:31:13 -0400 (0:00:00.922) 0:01:14.658 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:31:14 -0400 (0:00:00.526) 0:01:15.185 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:31:14 -0400 (0:00:00.171) 0:01:15.356 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:89 Saturday 17 August 2024 19:31:17 -0400 (0:00:02.697) 0:01:18.054 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:31:17 -0400 (0:00:00.132) 0:01:18.187 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:31:17 -0400 (0:00:00.092) 0:01:18.279 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:31:17 -0400 (0:00:00.105) 0:01:18.384 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thindata", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thinmeta", "size": "799M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-physical-originsub", "size": "52.1G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-thinpool-pool", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/sda": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, "/dev/sdb": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, "/dev/sdc": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, "/dev/sdd": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, "/dev/sde": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, "/dev/sdf": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, "/dev/sdg": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, "/dev/sdh": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, "/dev/sdi": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" }, "/dev/stratis/foo/test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/stratis/foo/test1", "size": "4G", "type": "stratis", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", 
"uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:31:18 -0400 (0:00:00.447) 0:01:18.832 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003071", "end": "2024-08-17 19:31:18.475850", "rc": 0, "start": "2024-08-17 19:31:18.472779" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=16326300-d90c-4644-96f1-30fbfb3c417f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:31:18 -0400 (0:00:00.461) 0:01:19.293 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003185", "end": "2024-08-17 19:31:18.948644", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:31:18.945459" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:31:19 -0400 (0:00:00.476) 0:01:19.770 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 
'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:31:19 -0400 (0:00:00.285) 0:01:20.055 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:31:19 -0400 (0:00:00.185) 0:01:20.241 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:31:19 -0400 (0:00:00.200) 0:01:20.441 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:31:19 -0400 (0:00:00.100) 0:01:20.542 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.186) 0:01:20.728 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type 
== 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.110) 0:01:20.839 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.111) 0:01:20.951 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.108) 0:01:21.059 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.103) 0:01:21.163 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.096) 0:01:21.259 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.103) 0:01:21.363 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.072) 0:01:21.435 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.059) 0:01:21.495 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:31:20 -0400 (0:00:00.086) 0:01:21.582 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:31:21 -0400 (0:00:00.520) 0:01:22.103 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:31:21 -0400 (0:00:00.362) 0:01:22.465 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:31:21 -0400 (0:00:00.194) 0:01:22.659 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.116) 0:01:22.775 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.173) 0:01:22.949 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.158) 0:01:23.107 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.133) 0:01:23.240 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.097) 0:01:23.338 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.098) 0:01:23.436 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.143) 0:01:23.579 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:31:22 -0400 (0:00:00.097) 0:01:23.677 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:31:23 -0400 (0:00:00.125) 0:01:23.802 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:31:23 -0400 (0:00:00.110) 0:01:23.912 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:31:23 -0400 (0:00:00.177) 0:01:24.090 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:31:23 -0400 (0:00:00.215) 0:01:24.305 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:31:23 -0400 (0:00:00.123) 0:01:24.429 ******* included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.296) 0:01:24.725 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.100) 0:01:24.825 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] 
****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.225) 0:01:25.050 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.179) 0:01:25.230 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.097) 0:01:25.328 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.091) 0:01:25.419 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:31:24 -0400 (0:00:00.126) 0:01:25.545 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:31:25 -0400 (0:00:00.425) 0:01:25.971 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_vdo_volume", 
"changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:31:25 -0400 (0:00:00.153) 0:01:26.125 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:31:25 -0400 (0:00:00.290) 0:01:26.415 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.374082", "end": "2024-08-17 19:31:26.525375", "rc": 0, "start": "2024-08-17 19:31:26.151293" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdc", "size": "20971520 sectors", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdd", "size": "2147483648 sectors", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sde", "size": "2147483648 sectors", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": 
"/dev/sdf", "size": "20971520 sectors", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdg", "size": "2147483648 sectors", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdh", "size": "20971520 sectors", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdi", "size": "20971520 sectors", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" } ], "fs_limit": 100, "name": "foo", "uuid": "720adb20-fdab-4201-b59d-d23c87c33523" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:31:26 -0400 (0:00:00.984) 0:01:27.399 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdc", "size": "20971520 sectors", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdd", "size": "2147483648 sectors", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sde", "size": "2147483648 sectors", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdf", "size": "20971520 sectors", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdg", "size": "2147483648 sectors", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdh", "size": "20971520 sectors", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdi", "size": "20971520 sectors", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" } ], "fs_limit": 100, "name": "foo", "uuid": "720adb20-fdab-4201-b59d-d23c87c33523" } ], "stopped_pools": [] } }, "changed": false } TASK 
[Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:31:26 -0400 (0:00:00.194) 0:01:27.594 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.153) 0:01:27.748 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.087) 0:01:27.835 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.085) 0:01:27.921 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.135) 0:01:28.057 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.085) 0:01:28.142 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 
'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.161) 0:01:28.304 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.086) 0:01:28.391 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.189) 0:01:28.580 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:31:27 -0400 (0:00:00.087) 0:01:28.668 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.134) 0:01:28.803 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] 
******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.085) 0:01:28.888 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.096) 0:01:28.985 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.094) 0:01:29.079 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.072) 0:01:29.152 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.057) 0:01:29.209 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.051) 0:01:29.261 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.090) 0:01:29.351 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.045) 0:01:29.397 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": 
null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.050) 0:01:29.447 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=16326300-d90c-4644-96f1-30fbfb3c417f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.099) 0:01:29.547 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.074) 0:01:29.622 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:31:28 -0400 (0:00:00.074) 0:01:29.697 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.067) 0:01:29.765 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.048) 0:01:29.813 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.049) 0:01:29.863 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.045) 0:01:29.908 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.044) 0:01:29.953 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937435.227908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1723937435.227908, "dev": 6, "device_type": 64773, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4868, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1723937435.227908, "nlink": 1, "path": "/dev/stratis/foo/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.434) 0:01:30.387 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.051) 0:01:30.439 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.043) 0:01:30.482 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.096) 0:01:30.579 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:31:29 -0400 (0:00:00.089) 0:01:30.669 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:31:30 -0400 (0:00:00.082) 0:01:30.752 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:31:30 -0400 (0:00:00.093) 0:01:30.845 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:31:30 -0400 (0:00:00.120) 0:01:30.966 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:31:31 -0400 (0:00:01.572) 0:01:32.539 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:31:31 -0400 (0:00:00.083) 0:01:32.623 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.078) 0:01:32.701 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.088) 0:01:32.790 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.057) 0:01:32.847 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.047) 0:01:32.895 ******* 
skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.047) 0:01:32.942 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.052) 0:01:32.994 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.161) 0:01:33.156 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.180) 0:01:33.337 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.143) 0:01:33.481 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:31:32 -0400 (0:00:00.166) 0:01:33.647 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.159) 0:01:33.806 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.151) 0:01:33.958 ******* ok: [managed_node2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.109) 0:01:34.067 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.083) 0:01:34.151 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.082) 0:01:34.233 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.086) 0:01:34.320 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.105) 0:01:34.425 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.095) 0:01:34.521 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.062) 0:01:34.584 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:31:33 -0400 (0:00:00.115) 0:01:34.699 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:31:34 -0400 (0:00:00.076) 0:01:34.775 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:31:34 -0400 (0:00:00.054) 0:01:34.830 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:31:34 -0400 (0:00:00.060) 0:01:34.891 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:31:34 -0400 (0:00:00.517) 0:01:35.408 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:31:34 -0400 (0:00:00.173) 0:01:35.582 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.122) 0:01:35.704 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.088) 0:01:35.793 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.120) 0:01:35.914 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.121) 0:01:36.036 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.125) 0:01:36.161 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.122) 0:01:36.284 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.122) 0:01:36.407 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.085) 0:01:36.492 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:31:35 -0400 (0:00:00.152) 0:01:36.645 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.083) 0:01:36.728 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.056) 0:01:36.785 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.049) 0:01:36.834 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", 
"skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.045) 0:01:36.879 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.043) 0:01:36.923 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.044) 0:01:36.967 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.044) 0:01:37.012 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.044) 0:01:37.056 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.044) 0:01:37.100 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.044) 0:01:37.145 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.043) 0:01:37.188 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.044) 0:01:37.233 ******* skipping: [managed_node2] => { "changed": 
false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.085) 0:01:37.318 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.050) 0:01:37.369 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.050) 0:01:37.419 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.066) 0:01:37.486 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.045) 0:01:37.531 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.056) 0:01:37.588 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:31:36 -0400 (0:00:00.075) 0:01:37.663 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.086) 0:01:37.750 ******* skipping: [managed_node2] => { "changed": 
false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.100) 0:01:37.851 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.082) 0:01:37.933 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.066) 0:01:38.000 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.060) 0:01:38.061 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.051) 0:01:38.112 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Add second filesystem to the pool] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:92 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.097) 0:01:38.209 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.111) 0:01:38.321 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.112) 0:01:38.434 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.093) 0:01:38.528 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.114) 0:01:38.643 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:31:37 -0400 (0:00:00.051) 0:01:38.694 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.050) 0:01:38.744 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.047) 0:01:38.792 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.124) 0:01:38.916 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.200) 0:01:39.116 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.112) 0:01:39.229 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "name": "foo", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.054) 0:01:39.283 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.056) 0:01:39.340 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.097) 0:01:39.438 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.123) 0:01:39.562 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:31:38 -0400 (0:00:00.091) 0:01:39.653 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:31:39 -0400 (0:00:00.084) 0:01:39.738 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:31:39 -0400 (0:00:00.104) 0:01:39.842 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:31:39 -0400 (0:00:00.041) 0:01:39.883 ******* changed: [managed_node2] => { "actions": [ { "action": "create device", "device": "/dev/stratis/foo/test2", "fs_type": null }, { "action": "create format", "device": "/dev/stratis/foo/test2", "fs_type": "stratis xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/stratis/foo/test1", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo/test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "mounted" } ], "packages": [ "xfsprogs", "stratis-cli", "stratisd", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { 
"_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:31:49 -0400 (0:00:10.353) 0:01:50.237 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:31:49 -0400 (0:00:00.149) 0:01:50.386 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937438.7719324, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "817b04bbaec0586cc701642d5c401dd7f77eee19", "ctime": 1723937438.7709322, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937438.7709322, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:31:50 -0400 (0:00:00.548) 0:01:50.934 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:31:50 -0400 (0:00:00.620) 0:01:51.555 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:31:51 -0400 (0:00:00.231) 0:01:51.786 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create device", "device": "/dev/stratis/foo/test2", "fs_type": null }, { "action": "create format", "device": "/dev/stratis/foo/test2", "fs_type": "stratis xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/stratis/foo/test1", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo/test2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "mounted" } ], "packages": [ "xfsprogs", "stratis-cli", "stratisd", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:31:51 -0400 (0:00:00.187) 0:01:51.974 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:31:51 -0400 (0:00:00.155) 0:01:52.129 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:31:51 -0400 (0:00:00.095) 0:01:52.225 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:31:51 -0400 (0:00:00.167) 0:01:52.392 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:31:52 -0400 (0:00:01.014) 0:01:53.406 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f" } redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed_node2] => (item={'src': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:31:53 -0400 (0:00:00.992) 0:01:54.399 ******* skipping: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 
'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item={'src': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:31:53 -0400 (0:00:00.163) 0:01:54.563 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:31:54 -0400 (0:00:00.916) 0:01:55.479 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:31:55 -0400 (0:00:00.478) 0:01:55.958 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:31:55 -0400 (0:00:00.060) 0:01:56.019 ******* ok: [managed_node2] TASK [Verify role results] 
***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:108 Saturday 17 August 2024 19:31:57 -0400 (0:00:02.435) 0:01:58.454 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:31:58 -0400 (0:00:00.260) 0:01:58.714 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] 
} ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:31:58 -0400 (0:00:00.146) 0:01:58.861 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:31:58 -0400 (0:00:00.176) 0:01:59.038 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thindata", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-flex-thinmeta", "size": "799M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-physical-originsub", "size": "52.1G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-720adb20fdab4201b59dd23c87c33523-thinpool-pool", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/sda": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, "/dev/sdb": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, "/dev/sdc": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, "/dev/sdd": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, "/dev/sde": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, "/dev/sdf": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, "/dev/sdg": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, "/dev/sdh": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, "/dev/sdi": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" }, "/dev/stratis/foo/test1": { "fstype": 
"xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/stratis/foo/test1", "size": "4G", "type": "stratis", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" }, "/dev/stratis/foo/test2": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test2", "name": "/dev/stratis/foo/test2", "size": "4G", "type": "stratis", "uuid": "3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:31:58 -0400 (0:00:00.525) 0:01:59.563 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004194", "end": "2024-08-17 19:32:00.205305", "rc": 0, "start": "2024-08-17 19:31:59.201111" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=16326300-d90c-4644-96f1-30fbfb3c417f /opt/test1 xfs defaults 0 0 UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:32:00 -0400 (0:00:01.432) 0:02:00.996 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003209", "end": "2024-08-17 19:32:00.627680", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:32:00.624471" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:32:00 -0400 (0:00:00.413) 0:02:01.409 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}, {'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', '_kernel_device': '/dev/dm-6', '_raw_kernel_device': '/dev/dm-6'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:32:00 -0400 (0:00:00.146) 0:02:01.556 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } 
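The _storage_pool_tests fact set just above drives the pool checks that follow: each entry ("members", "volumes") is expanded into an include of the matching test-verify-pool-*.yml file, which is exactly what the next "Verify pool subset" task records. A minimal sketch of that dispatch pattern, with an illustrative loop variable name (not necessarily the one the role uses):

  - name: Verify pool subset (illustrative sketch)
    ansible.builtin.include_tasks: "test-verify-pool-{{ storage_test_pool_subset }}.yml"
    loop: "{{ _storage_pool_tests }}"   # ["members", "volumes"], as set in the task above
    loop_control:
      loop_var: storage_test_pool_subset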
TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:32:00 -0400 (0:00:00.050) 0:02:01.607 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:32:00 -0400 (0:00:00.046) 0:02:01.653 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:32:00 -0400 (0:00:00.045) 0:02:01.699 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.098) 0:02:01.798 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.148) 0:02:01.946 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.082) 0:02:02.028 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.082) 0:02:02.110 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.080) 0:02:02.191 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", 
"skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.055) 0:02:02.246 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.057) 0:02:02.304 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.050) 0:02:02.355 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.045) 0:02:02.401 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:32:01 -0400 (0:00:00.041) 0:02:02.442 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.452) 0:02:02.894 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.073) 0:02:02.967 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.131) 0:02:03.099 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.110) 0:02:03.210 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.045) 0:02:03.256 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.045) 0:02:03.302 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.045) 0:02:03.347 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.044) 0:02:03.392 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:32:02 -0400 
(0:00:00.064) 0:02:03.456 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.095) 0:02:03.552 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:32:02 -0400 (0:00:00.116) 0:02:03.668 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.085) 0:02:03.753 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.081) 0:02:03.835 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.089) 0:02:03.925 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.176) 0:02:04.101 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 
'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', '_kernel_device': '/dev/dm-6', '_raw_kernel_device': '/dev/dm-6'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.173) 0:02:04.274 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.139) 0:02:04.413 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', '_kernel_device': '/dev/dm-6', '_raw_kernel_device': '/dev/dm-6'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.090) 0:02:04.504 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:32:03 -0400 (0:00:00.172) 0:02:04.677 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.129) 0:02:04.806 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.077) 0:02:04.883 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.072) 0:02:04.956 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.088) 0:02:05.045 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.226) 0:02:05.271 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", 
"skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-5", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', '_kernel_device': '/dev/dm-6', '_raw_kernel_device': '/dev/dm-6'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test2", "_kernel_device": "/dev/dm-6", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": 
null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.121) 0:02:05.393 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:32:04 -0400 (0:00:00.268) 0:02:05.661 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.373535", "end": "2024-08-17 19:32:05.768516", "rc": 0, "start": "2024-08-17 19:32:05.394981" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdc", "size": "20971520 sectors", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdd", "size": "2147483648 sectors", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sde", "size": "2147483648 sectors", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdf", "size": "20971520 sectors", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdg", "size": "2147483648 sectors", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdh", "size": "20971520 sectors", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdi", "size": "20971520 sectors", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" } ] }, "filesystems": [ { "name": "test2", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d" }, { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" } ], "fs_limit": 100, "name": "foo", "uuid": "720adb20-fdab-4201-b59d-d23c87c33523" } ], "stopped_pools": [] } TASK [Get information about Stratis] 
******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:32:05 -0400 (0:00:00.927) 0:02:06.589 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "8e3a46ab-eda5-4b79-aed5-26808a0082b4" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "dbb28899-2399-4b36-ae1b-2d4ecaafa038" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdc", "size": "20971520 sectors", "uuid": "fcc15fa5-920d-41fb-a4c8-b8c0bab09e6d" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdd", "size": "2147483648 sectors", "uuid": "c5989df1-30c6-485f-aef2-d71d94020819" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sde", "size": "2147483648 sectors", "uuid": "597ffcc8-ed6c-49aa-8ba7-67db8f46e7b1" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdf", "size": "20971520 sectors", "uuid": "14f51071-e50a-4ecd-bbc8-b9a49033ab8b" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdg", "size": "2147483648 sectors", "uuid": "0896e8bd-7657-40de-9a19-bc0b9f43dc8a" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdh", "size": "20971520 sectors", "uuid": "eb1e3924-45c3-4eeb-a2b0-fbeb6bce5e12" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": false, "path": "/dev/sdi", "size": "20971520 sectors", "uuid": "5f9f5739-5f7c-4fbb-87eb-9aef69095115" } ] }, "filesystems": [ { "name": "test2", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d" }, { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "16326300-d90c-4644-96f1-30fbfb3c417f" } ], "fs_limit": 100, "name": "foo", "uuid": "720adb20-fdab-4201-b59d-d23c87c33523" } ], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.129) 0:02:06.718 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.133) 0:02:06.851 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task 
path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.102) 0:02:06.953 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.093) 0:02:07.047 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.106) 0:02:07.153 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.105) 0:02:07.259 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', '_kernel_device': '/dev/dm-6', '_raw_kernel_device': '/dev/dm-6'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.203) 0:02:07.462 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:32:06 -0400 (0:00:00.130) 0:02:07.593 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:32:07 -0400 (0:00:00.560) 0:02:08.154 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:32:07 -0400 (0:00:00.113) 0:02:08.267 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 
2024 19:32:07 -0400 (0:00:00.131) 0:02:08.399 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:32:07 -0400 (0:00:00.082) 0:02:08.481 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:32:07 -0400 (0:00:00.095) 0:02:08.577 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:32:07 -0400 (0:00:00.083) 0:02:08.660 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.083) 0:02:08.743 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.088) 0:02:08.832 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.123) 0:02:08.956 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.098) 0:02:09.054 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] 
************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.147) 0:02:09.201 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.087) 0:02:09.288 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=16326300-d90c-4644-96f1-30fbfb3c417f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:32:08 -0400 (0:00:00.264) 0:02:09.553 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.166) 0:02:09.720 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.109) 0:02:09.829 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.076) 0:02:09.906 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.050) 0:02:09.957 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.048) 0:02:10.006 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.046) 0:02:10.052 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.051) 0:02:10.103 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937435.227908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1723937435.227908, "dev": 6, "device_type": 64773, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4868, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1723937435.227908, "nlink": 1, "path": "/dev/stratis/foo/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:32:09 -0400 (0:00:00.513) 0:02:10.617 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.095) 0:02:10.713 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.083) 0:02:10.796 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.086) 0:02:10.883 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.063) 0:02:10.946 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.057) 0:02:11.003 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.108) 0:02:11.112 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:32:10 -0400 (0:00:00.047) 0:02:11.160 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:32:11 -0400 (0:00:01.493) 0:02:12.653 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.087) 0:02:12.741 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.092) 0:02:12.833 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.194) 0:02:13.027 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.117) 0:02:13.145 ******* skipping: [managed_node2] => { 
"changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.139) 0:02:13.284 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.107) 0:02:13.392 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.115) 0:02:13.507 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:32:12 -0400 (0:00:00.110) 0:02:13.618 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:32:13 -0400 (0:00:00.136) 0:02:13.755 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:32:13 -0400 (0:00:00.141) 0:02:13.897 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:32:13 -0400 (0:00:00.360) 0:02:14.257 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:32:13 -0400 (0:00:00.191) 0:02:14.449 ******* skipping: [managed_node2] => { "changed": false, "false_condition": 
"_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:32:13 -0400 (0:00:00.207) 0:02:14.657 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.115) 0:02:14.772 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.081) 0:02:14.853 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.083) 0:02:14.937 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.088) 0:02:15.025 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.104) 0:02:15.130 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.094) 0:02:15.225 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.103) 0:02:15.328 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", 
"skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.112) 0:02:15.441 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.098) 0:02:15.540 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:32:14 -0400 (0:00:00.082) 0:02:15.622 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:32:15 -0400 (0:00:00.244) 0:02:15.867 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:32:15 -0400 (0:00:00.664) 0:02:16.532 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.178) 0:02:16.711 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.225) 0:02:16.936 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.114) 0:02:17.051 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.135) 0:02:17.186 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.144) 0:02:17.331 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.120) 0:02:17.451 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:32:16 -0400 (0:00:00.122) 0:02:17.574 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.130) 0:02:17.704 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.089) 0:02:17.793 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.080) 0:02:17.874 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.102) 0:02:17.976 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.174) 0:02:18.151 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK 
[Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.080) 0:02:18.232 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.086) 0:02:18.318 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.084) 0:02:18.402 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.087) 0:02:18.490 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.088) 0:02:18.579 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:32:17 -0400 (0:00:00.086) 0:02:18.665 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.086) 0:02:18.751 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.088) 0:02:18.840 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.086) 0:02:18.926 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", 
"skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.086) 0:02:19.014 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.112) 0:02:19.126 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.096) 0:02:19.223 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.092) 0:02:19.315 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:32:18 -0400 (0:00:00.294) 0:02:19.610 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.143) 0:02:19.753 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.121) 0:02:19.874 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.100) 0:02:19.975 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.086) 0:02:20.061 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.099) 0:02:20.160 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.122) 0:02:20.283 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.114) 0:02:20.398 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.111) 0:02:20.510 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:32:19 -0400 (0:00:00.148) 0:02:20.658 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:32:20 -0400 (0:00:00.394) 0:02:21.053 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:32:20 -0400 (0:00:00.192) 0:02:21.246 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test2", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:32:20 -0400 (0:00:00.198) 0:02:21.445 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:32:20 -0400 (0:00:00.098) 0:02:21.544 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:32:20 -0400 (0:00:00.113) 0:02:21.657 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.103) 0:02:21.761 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.086) 0:02:21.847 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get 
path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.070) 0:02:21.917 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.055) 0:02:21.973 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.057) 0:02:22.031 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.051) 0:02:22.083 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.049) 0:02:22.132 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.099) 0:02:22.232 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.077) 0:02:22.309 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.201) 0:02:22.511 ******* skipping: 
[managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.084) 0:02:22.595 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:32:21 -0400 (0:00:00.064) 0:02:22.660 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.049) 0:02:22.710 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.045) 0:02:22.755 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.044) 0:02:22.800 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937509.304418, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1723937509.304418, "dev": 6, "device_type": 64774, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 4892, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1723937509.304418, "nlink": 1, "path": "/dev/stratis/foo/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.432) 0:02:23.232 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.051) 0:02:23.284 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.045) 0:02:23.330 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.058) 0:02:23.389 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.109) 0:02:23.498 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.076) 0:02:23.575 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:32:22 -0400 (0:00:00.097) 0:02:23.672 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:32:23 -0400 (0:00:00.086) 0:02:23.759 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:32:24 -0400 (0:00:01.744) 0:02:25.503 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:32:24 -0400 (0:00:00.096) 0:02:25.599 ******* skipping: [managed_node2] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:32:24 -0400 (0:00:00.084) 0:02:25.684 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.092) 0:02:25.776 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.046) 0:02:25.822 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.046) 0:02:25.868 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.045) 0:02:25.914 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.046) 0:02:25.960 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.056) 0:02:26.016 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.136) 0:02:26.153 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.131) 0:02:26.284 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.127) 0:02:26.412 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:32:25 -0400 (0:00:00.124) 0:02:26.536 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.244) 0:02:26.780 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.100) 0:02:26.880 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.144) 0:02:27.025 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.114) 0:02:27.140 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.155) 0:02:27.295 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.087) 0:02:27.383 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.068) 0:02:27.452 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.067) 0:02:27.519 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.051) 0:02:27.571 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.044) 0:02:27.615 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:32:26 -0400 (0:00:00.045) 0:02:27.661 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:32:27 -0400 (0:00:00.046) 0:02:27.707 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:32:27 -0400 (0:00:00.526) 0:02:28.233 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:32:27 -0400 (0:00:00.266) 0:02:28.499 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:32:27 -0400 (0:00:00.137) 0:02:28.637 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.094) 0:02:28.732 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.179) 0:02:28.912 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.212) 0:02:29.124 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.171) 0:02:29.296 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.133) 0:02:29.429 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.119) 0:02:29.549 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:32:28 -0400 (0:00:00.092) 0:02:29.641 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } 
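
The run of "skipping: [managed_node2] => { ... "false_condition": "storage_test_volume.thin" ... }" entries in this part of the log comes from tasks guarded by a `when:` clause: the thin-pool size calculations only apply to LVM thin volumes, so for this Stratis volume the condition evaluates to false and ansible-core echoes the failing expression back as "false_condition". The following is a minimal, self-contained sketch of that mechanism only; the play, task name, and variable values are illustrative assumptions, not taken from the storage role's actual task files.

# Illustrative playbook: a task guarded by `when:` that is skipped when the
# condition is false, producing the same "skipping"/"false_condition" shape
# seen in this log. Values under storage_test_volume are assumed.
- hosts: localhost
  gather_facts: false
  vars:
    storage_test_volume:
      type: stratis
      thin: false
  tasks:
    - name: Thin-pool-only step (skipped for a Stratis volume)
      ansible.builtin.debug:
        msg: "This only runs when the volume is an LVM thin volume."
      when: storage_test_volume.thin

Running such a playbook with `ansible-playbook -v` should report the task as "skipping" with "Conditional result was False", matching the pattern of the size-calculation tasks above and below.
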
TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.100) 0:02:29.742 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.113) 0:02:29.855 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.114) 0:02:29.970 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.118) 0:02:30.089 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.055) 0:02:30.144 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.101) 0:02:30.246 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.046) 0:02:30.292 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.044) 0:02:30.337 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.046) 0:02:30.384 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", 
"skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.052) 0:02:30.436 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.075) 0:02:30.511 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.084) 0:02:30.595 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:32:29 -0400 (0:00:00.090) 0:02:30.686 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.080) 0:02:30.767 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.093) 0:02:30.860 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.070) 0:02:30.930 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.081) 0:02:31.012 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] 
***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.064) 0:02:31.076 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.114) 0:02:31.190 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.046) 0:02:31.237 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.052) 0:02:31.290 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.068) 0:02:31.359 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.082) 0:02:31.441 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.082) 0:02:31.524 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.090) 0:02:31.614 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] 
********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:32:30 -0400 (0:00:00.079) 0:02:31.694 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:111 Saturday 17 August 2024 19:32:31 -0400 (0:00:00.089) 0:02:31.783 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:32:31 -0400 (0:00:00.225) 0:02:32.009 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:32:31 -0400 (0:00:00.174) 0:02:32.183 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:32:31 -0400 (0:00:00.418) 0:02:32.601 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.195) 0:02:32.797 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not 
__storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.148) 0:02:32.946 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.108) 0:02:33.054 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.113) 0:02:33.168 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.113) 0:02:33.281 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.247) 0:02:33.529 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:32:32 -0400 (0:00:00.135) 0:02:33.664 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "name": "foo", "state": "absent", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g", "state": "absent" }, { "mount_point": "/opt/test2", "name": "test2", "size": "4g", "state": "absent" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.126) 0:02:33.791 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.116) 0:02:33.907 
******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.118) 0:02:34.025 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.094) 0:02:34.120 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.182) 0:02:34.303 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.150) 0:02:34.453 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:32:33 -0400 (0:00:00.212) 0:02:34.665 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:32:34 -0400 (0:00:00.083) 0:02:34.748 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/stratis/foo/test2", "fs_type": "stratis xfs" }, { "action": "destroy device", "device": "/dev/stratis/foo/test2", "fs_type": null }, { "action": "destroy format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" }, { "action": "destroy device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdh", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdf", "fs_type": "stratis" }, { 
"action": "destroy format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "stratis" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:32:47 -0400 (0:00:13.628) 0:02:48.377 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:32:47 -0400 (0:00:00.148) 0:02:48.526 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937513.5684476, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "45616163064181131c4c960fd2f3411e6ed549ba", "ctime": 1723937513.5674474, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937513.5674474, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1506, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:32:48 -0400 (0:00:00.511) 0:02:49.038 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:32:48 -0400 (0:00:00.525) 0:02:49.563 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:32:48 -0400 (0:00:00.081) 0:02:49.645 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/stratis/foo/test2", "fs_type": "stratis xfs" }, { "action": "destroy device", "device": "/dev/stratis/foo/test2", "fs_type": null }, { "action": "destroy format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" }, { "action": "destroy device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdh", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdd", "fs_type": "stratis" }, 
{ "action": "destroy format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "stratis" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "absent" }, { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK 
[fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:32:49 -0400 (0:00:00.134) 0:02:49.780 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:32:49 -0400 (0:00:00.109) 0:02:49.889 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:32:49 -0400 (0:00:00.117) 0:02:50.007 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed_node2] => (item={'src': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d', 'path': '/opt/test2', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d" } redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed_node2] => (item={'src': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:32:50 -0400 (0:00:00.944) 0:02:50.951 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:32:51 -0400 (0:00:00.890) 0:02:51.841 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:32:51 -0400 (0:00:00.074) 0:02:51.915 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:32:51 -0400 (0:00:00.072) 0:02:51.987 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:32:52 -0400 (0:00:00.851) 0:02:52.839 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": 
"da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:32:52 -0400 (0:00:00.454) 0:02:53.293 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:32:52 -0400 (0:00:00.043) 0:02:53.337 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:130 Saturday 17 August 2024 19:32:55 -0400 (0:00:02.575) 0:02:55.912 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:32:55 -0400 (0:00:00.178) 0:02:56.090 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null }, { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:32:55 -0400 (0:00:00.135) 0:02:56.226 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:32:55 -0400 (0:00:00.202) 0:02:56.428 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": 
"fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:32:56 -0400 (0:00:00.535) 0:02:56.964 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004254", "end": "2024-08-17 19:32:57.689425", "rc": 0, "start": "2024-08-17 19:32:56.685171" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:32:57 -0400 (0:00:01.570) 0:02:58.535 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003225", "end": "2024-08-17 19:32:58.213966", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:32:58.210741" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:32:58 -0400 (0:00:00.615) 0:02:59.150 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'absent', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 
None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f'}, {'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:32:58 -0400 (0:00:00.235) 0:02:59.386 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:32:58 -0400 (0:00:00.109) 0:02:59.495 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:32:58 -0400 (0:00:00.090) 0:02:59.586 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:32:58 -0400 (0:00:00.100) 0:02:59.686 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.183) 0:02:59.870 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.086) 0:02:59.956 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.108) 0:03:00.064 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.079) 0:03:00.144 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.078) 0:03:00.223 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.050) 0:03:00.273 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.054) 0:03:00.328 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task 
path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.101) 0:03:00.430 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.045) 0:03:00.476 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:32:59 -0400 (0:00:00.041) 0:03:00.517 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. 
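The cleanup pass above re-runs fedora.linux_system_roles.storage with the pool and both volumes set to state absent, which is what produces the destroy format/destroy device actions and the removed fstab entries recorded earlier in this log. A sketch of the corresponding invocation, reconstructed from the storage_pools value the role printed above (the play and include_role wrapper are assumptions; only the pool definition is taken from the log):

    # Reconstructed from the storage_pools value shown in this log;
    # the surrounding play wiring is assumed.
    - hosts: managed_node2
      tasks:
        - name: Clean up
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: stratis
                state: absent
                disks: [sda, sdb, sdc, sdd, sde, sdf, sdg, sdh, sdi]
                volumes:
                  - name: test1
                    size: "4g"
                    mount_point: /opt/test1
                    state: absent
                  - name: test2
                    size: "4g"
                    mount_point: /opt/test2
                    state: absent

Removing the Stratis pool this way also drops the /etc/fstab entries for /opt/test1 and /opt/test2, as shown by the "Remove obsolete mounts" task above.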
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.505) 0:03:01.023 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.161) 0:03:01.184 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.117) 0:03:01.302 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.044) 0:03:01.346 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.043) 0:03:01.390 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.044) 0:03:01.434 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.044) 0:03:01.479 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.044) 0:03:01.523 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:33:00 -0400 
(0:00:00.054) 0:03:01.577 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:33:00 -0400 (0:00:00.079) 0:03:01.657 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.146) 0:03:01.804 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.079) 0:03:01.883 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.052) 0:03:01.936 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.064) 0:03:02.001 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.104) 0:03:02.105 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 
'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.064) 0:03:02.170 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.141) 0:03:02.311 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", 
"state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:33:01 -0400 (0:00:00.131) 0:03:02.443 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:33:02 -0400 (0:00:00.271) 0:03:02.717 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] 
************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:33:02 -0400 (0:00:00.194) 0:03:02.912 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:33:02 -0400 (0:00:00.095) 0:03:03.008 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:33:02 -0400 (0:00:00.185) 0:03:03.193 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:33:02 -0400 (0:00:00.113) 0:03:03.307 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:33:02 -0400 (0:00:00.222) 0:03:03.529 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=16326300-d90c-4644-96f1-30fbfb3c417f", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test2", "_mount_id": "UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d", "_raw_device": "/dev/stratis/foo/test2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "mount_user": null, "name": "test2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:33:03 -0400 (0:00:00.193) 0:03:03.722 ******* included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:33:03 -0400 (0:00:00.315) 0:03:04.038 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.376877", "end": "2024-08-17 19:33:04.096397", "rc": 0, "start": "2024-08-17 19:33:03.719520" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.915) 0:03:04.954 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.181) 0:03:05.136 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.116) 0:03:05.252 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.094) 0:03:05.346 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.130) 0:03:05.477 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.094) 0:03:05.572 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:33:04 -0400 (0:00:00.118) 0:03:05.691 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=16326300-d90c-4644-96f1-30fbfb3c417f'}) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test2', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test2', '_raw_device': '/dev/stratis/foo/test2', '_mount_id': 'UUID=3cb80bb8-15f0-4f66-a70c-5db3d3a14e1d'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:33:05 -0400 (0:00:00.431) 0:03:06.122 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:33:05 -0400 (0:00:00.174) 0:03:06.296 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for 
managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.497) 0:03:06.793 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.136) 0:03:06.930 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.195) 0:03:07.125 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.114) 0:03:07.239 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.098) 0:03:07.337 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.117) 0:03:07.455 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.085) 0:03:07.540 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:33:06 -0400 (0:00:00.083) 0:03:07.624 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:33:07 -0400 (0:00:00.122) 0:03:07.746 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:33:07 -0400 (0:00:00.149) 0:03:07.896 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:33:07 -0400 (0:00:00.253) 0:03:08.149 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:33:07 -0400 (0:00:00.132) 0:03:08.282 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:33:07 -0400 (0:00:00.260) 0:03:08.542 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.196) 0:03:08.739 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.211) 0:03:08.950 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.181) 0:03:09.132 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.088) 0:03:09.220 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.129) 0:03:09.350 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.138) 0:03:09.489 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:33:08 -0400 (0:00:00.112) 0:03:09.602 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 
17 August 2024 19:33:09 -0400 (0:00:00.527) 0:03:10.130 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present or storage_test_volume.type == 'disk'", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:33:09 -0400 (0:00:00.114) 0:03:10.244 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:33:09 -0400 (0:00:00.138) 0:03:10.383 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:33:09 -0400 (0:00:00.095) 0:03:10.478 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:33:09 -0400 (0:00:00.121) 0:03:10.599 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:33:10 -0400 (0:00:00.138) 0:03:10.738 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:33:10 -0400 (0:00:00.057) 0:03:10.796 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:33:10 -0400 (0:00:00.052) 0:03:10.848 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:33:11 -0400 (0:00:01.452) 0:03:12.301 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } 
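The encryption checks follow the same guarded pattern: cryptsetup is ensured present unconditionally, but LUKS metadata is collected only when the volume is both present and encrypted, so for this absent, unencrypted Stratis volume every LUKS task is skipped. Below is a minimal sketch of such a guarded probe, assuming "cryptsetup luksDump" as the probe command and "luks_dump" as the register name; neither detail is taken from the role itself.

    - name: Collect LUKS info for this volume (illustrative sketch)
      ansible.builtin.command: cryptsetup luksDump {{ storage_test_volume._device }}
      register: luks_dump        # hypothetical variable name, not from the role
      changed_when: false        # read-only probe, should never report "changed"
      when: storage_test_volume.encryption and _storage_test_volume_present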
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:33:11 -0400 (0:00:00.081) 0:03:12.383 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:33:11 -0400 (0:00:00.077) 0:03:12.460 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:33:11 -0400 (0:00:00.077) 0:03:12.537 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:33:11 -0400 (0:00:00.079) 0:03:12.617 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:33:11 -0400 (0:00:00.078) 0:03:12.695 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.076) 0:03:12.772 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.079) 0:03:12.851 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.080) 0:03:12.932 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, 
"changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.156) 0:03:13.088 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.126) 0:03:13.214 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.089) 0:03:13.304 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.145) 0:03:13.450 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.070) 0:03:13.520 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.049) 0:03:13.569 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:33:12 -0400 (0:00:00.052) 0:03:13.622 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.087) 0:03:13.710 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.096) 0:03:13.806 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.082) 0:03:13.889 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.055) 0:03:13.945 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.056) 0:03:14.001 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.047) 0:03:14.049 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.050) 0:03:14.099 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.082) 0:03:14.182 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.086) 0:03:14.269 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.144) 0:03:14.413 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:33:13 -0400 (0:00:00.100) 0:03:14.514 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.195) 0:03:14.709 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.063) 0:03:14.773 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.140) 0:03:14.914 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.201) 0:03:15.115 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.214) 0:03:15.329 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.142) 0:03:15.472 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.124) 0:03:15.597 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was 
False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:33:14 -0400 (0:00:00.080) 0:03:15.677 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.080) 0:03:15.758 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.082) 0:03:15.840 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.084) 0:03:15.925 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.083) 0:03:16.008 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.078) 0:03:16.087 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.096) 0:03:16.184 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.083) 0:03:16.267 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.220) 0:03:16.488 ******* skipping: [managed_node2] 
=> { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.058) 0:03:16.546 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.046) 0:03:16.593 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.043) 0:03:16.636 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:33:15 -0400 (0:00:00.044) 0:03:16.680 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.050) 0:03:16.731 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.073) 0:03:16.805 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.093) 0:03:16.898 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.126) 0:03:17.025 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.151) 0:03:17.176 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.085) 0:03:17.262 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.082) 0:03:17.344 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.076) 0:03:17.421 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:33:16 -0400 (0:00:00.119) 0:03:17.540 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:33:17 -0400 (0:00:00.167) 0:03:17.708 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:33:17 -0400 (0:00:00.096) 0:03:17.804 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:33:17 -0400 (0:00:00.138) 0:03:17.943 ******* ok: [managed_node2] => { "ansible_facts": { 
"_storage_test_volume_present": null }, "changed": false } TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:33:17 -0400 (0:00:00.092) 0:03:18.036 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:33:17 -0400 (0:00:00.154) 0:03:18.190 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:33:17 -0400 (0:00:00.462) 0:03:18.652 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.130) 0:03:18.783 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test2", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.132) 0:03:18.916 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.059) 0:03:18.975 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.056) 0:03:19.032 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.052) 0:03:19.085 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.043) 0:03:19.128 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.043) 0:03:19.172 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.091) 0:03:19.263 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.046) 0:03:19.310 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.044) 0:03:19.354 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, 
"storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.049) 0:03:19.404 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.100) 0:03:19.504 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.067) 0:03:19.571 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:33:18 -0400 (0:00:00.098) 0:03:19.670 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.078) 0:03:19.749 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.051) 0:03:19.800 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.049) 0:03:19.850 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.043) 0:03:19.893 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.045) 0:03:19.939 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.405) 0:03:20.344 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present or storage_test_volume.type == 'disk'", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.045) 0:03:20.390 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.052) 0:03:20.442 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.104) 0:03:20.546 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.051) 0:03:20.598 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.044) 0:03:20.642 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:33:19 -0400 (0:00:00.048) 0:03:20.691 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", 
"skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:33:20 -0400 (0:00:00.072) 0:03:20.763 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:33:21 -0400 (0:00:01.486) 0:03:22.250 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.083) 0:03:22.334 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.074) 0:03:22.408 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.058) 0:03:22.466 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.055) 0:03:22.522 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.049) 0:03:22.571 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.044) 0:03:22.616 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result 
was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:33:21 -0400 (0:00:00.044) 0:03:22.660 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.044) 0:03:22.705 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.101) 0:03:22.806 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.340) 0:03:23.147 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.071) 0:03:23.219 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.069) 0:03:23.289 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.097) 0:03:23.386 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.106) 0:03:23.492 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was 
False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.082) 0:03:23.575 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:33:22 -0400 (0:00:00.074) 0:03:23.650 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.057) 0:03:23.707 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.049) 0:03:23.757 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.045) 0:03:23.803 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.044) 0:03:23.847 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.058) 0:03:23.906 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.118) 0:03:24.024 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.078) 0:03:24.102 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.122) 0:03:24.225 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.079) 0:03:24.305 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.077) 0:03:24.382 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.095) 0:03:24.477 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.068) 0:03:24.546 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:33:23 -0400 (0:00:00.128) 0:03:24.674 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.120) 0:03:24.795 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.132) 0:03:24.928 ******* skipping: [managed_node2] => { "false_condition": 
"_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.146) 0:03:25.074 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.106) 0:03:25.181 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.052) 0:03:25.233 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.046) 0:03:25.279 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.052) 0:03:25.332 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.081) 0:03:25.413 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.086) 0:03:25.500 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:33:24 -0400 (0:00:00.162) 0:03:25.662 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.082) 0:03:25.745 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.070) 0:03:25.815 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.055) 0:03:25.871 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.064) 0:03:25.935 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.074) 0:03:26.009 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.079) 0:03:26.089 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.096) 0:03:26.186 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.126) 0:03:26.313 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.110) 0:03:26.423 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", 
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.101) 0:03:26.525 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:33:25 -0400 (0:00:00.118) 0:03:26.644 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.116) 0:03:26.760 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.080) 0:03:26.841 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.111) 0:03:26.952 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.212) 0:03:27.165 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.101) 0:03:27.266 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.120) 0:03:27.386 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 
'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.082) 0:03:27.468 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.121) 0:03:27.590 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:33:26 -0400 (0:00:00.112) 0:03:27.702 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:33:27 -0400 (0:00:00.087) 0:03:27.789 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create encrypted Stratis pool] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:136 Saturday 17 August 2024 19:33:27 -0400 (0:00:00.088) 0:03:27.878 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:33:27 -0400 (0:00:00.214) 0:03:28.093 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:33:27 -0400 (0:00:00.252) 0:03:28.345 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:33:27 -0400 (0:00:00.218) 0:03:28.564 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": 
"rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:33:28 -0400 (0:00:00.262) 0:03:28.827 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:33:28 -0400 (0:00:00.327) 0:03:29.155 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:33:28 -0400 (0:00:00.140) 0:03:29.295 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:33:28 -0400 (0:00:00.174) 0:03:29.469 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:33:28 -0400 (0:00:00.162) 0:03:29.632 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:33:29 -0400 (0:00:00.221) 0:03:29.854 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in 
storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:33:29 -0400 (0:00:00.095) 0:03:29.950 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_password": "yabbadabbadoo", "name": "foo", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:33:29 -0400 (0:00:00.152) 0:03:30.103 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:33:29 -0400 (0:00:00.162) 0:03:30.265 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:33:29 -0400 (0:00:00.164) 0:03:30.429 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:33:29 -0400 (0:00:00.160) 0:03:30.589 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:33:30 -0400 (0:00:00.141) 0:03:30.731 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:33:30 -0400 (0:00:00.097) 0:03:30.828 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:33:30 -0400 
(0:00:00.187) 0:03:31.016 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:33:30 -0400 (0:00:00.100) 0:03:31.117 ******* changed: [managed_node2] => { "actions": [ { "action": "create format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdh", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdc", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "create device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "create format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" } ], "packages": [ "stratisd", "e2fsprogs", "xfsprogs", "stratis-cli" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK 
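For reference, the pool and volume definition exercised by this "Manage the pools and volumes to match the specified state" step can be reproduced with a play along the following lines. This is a minimal sketch only: the pool name, disks, encryption settings, and volume values are copied from the "Show storage_pools" output earlier in this run, while the include_role wrapper and task layout are assumptions, since the actual test playbook may structure the call differently.

    # Sketch, not the test's literal source: values taken from the
    # "Show storage_pools" debug output above; the wrapper task is assumed.
    - name: Create encrypted Stratis pool
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: stratis
            disks: [sda, sdb, sdc, sdd, sde, sdf, sdg, sdh, sdi]
            encryption: true
            encryption_password: yabbadabbadoo
            volumes:
              - name: test1
                size: "4g"
                mount_point: /opt/test1

The "mounts" entry reported in the result above corresponds to an /etc/fstab line of the form "UUID=61a2e543-73f3-4af0-900d-c94e8346453c /opt/test1 xfs defaults 0 0".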
[fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:35:01 -0400 (0:01:30.700) 0:05:01.818 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:35:01 -0400 (0:00:00.198) 0:05:02.017 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937570.1678374, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937570.1668375, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937570.1668375, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:35:02 -0400 (0:00:00.694) 0:05:02.711 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:35:02 -0400 (0:00:00.600) 0:05:03.311 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:35:02 -0400 (0:00:00.099) 0:05:03.411 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdh", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdc", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "create device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "create format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" } ], "changed": 
true, "crypts": [], "failed": false, "leaves": [ "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo/test1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" } ], "packages": [ "stratisd", "e2fsprogs", "xfsprogs", "stratis-cli" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:35:02 -0400 (0:00:00.124) 0:05:03.536 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:35:02 -0400 (0:00:00.136) 0:05:03.672 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:35:03 -0400 (0:00:00.135) 0:05:03.808 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:35:03 -0400 (0:00:00.130) 0:05:03.938 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:35:04 -0400 (0:00:01.072) 0:05:05.010 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed_node2] => (item={'src': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:35:04 -0400 (0:00:00.567) 0:05:05.578 ******* skipping: [managed_node2] => (item={'src': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or 
mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:35:04 -0400 (0:00:00.099) 0:05:05.677 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:35:05 -0400 (0:00:00.908) 0:05:06.586 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:35:06 -0400 (0:00:00.504) 0:05:07.090 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:35:06 -0400 (0:00:00.168) 0:05:07.259 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:151 Saturday 17 August 2024 19:35:08 -0400 (0:00:02.431) 0:05:09.691 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:35:09 -0400 (0:00:00.095) 0:05:09.786 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:35:09 -0400 (0:00:00.091) 0:05:09.878 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:35:09 -0400 (0:00:00.132) 0:05:10.011 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-66e703dbb8f144a39bfe322bb4d906fd-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-66e703dbb8f144a39bfe322bb4d906fd-crypt", "size": "10G", "type": "crypt", "uuid": "66e703db-b8f1-44a3-9bfe-322bb4d906fd" }, "/dev/mapper/stratis-1-private-687efc66389c4456a3a271b8c56d0c78-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-687efc66389c4456a3a271b8c56d0c78-crypt", "size": "10G", "type": "crypt", "uuid": "687efc66-389c-4456-a3a2-71b8c56d0c78" }, "/dev/mapper/stratis-1-private-70f197c95f2a4a33968e899c03f460ce-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-70f197c95f2a4a33968e899c03f460ce-crypt", "size": "1024G", "type": "crypt", "uuid": "70f197c9-5f2a-4a33-968e-899c03f460ce" }, "/dev/mapper/stratis-1-private-934229c5b40141f3883a0598eb32e234-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-934229c5b40141f3883a0598eb32e234-crypt", "size": "1024G", "type": "crypt", "uuid": "934229c5-b401-41f3-883a-0598eb32e234" }, "/dev/mapper/stratis-1-private-a45b4fa7e53a400cbf300a0ce81c18fa-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-a45b4fa7e53a400cbf300a0ce81c18fa-crypt", "size": "10G", "type": "crypt", "uuid": "a45b4fa7-e53a-400c-bf30-0a0ce81c18fa" }, "/dev/mapper/stratis-1-private-a5689d99306c442da3cf829fe8bbdb3c-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-a5689d99306c442da3cf829fe8bbdb3c-crypt", "size": "10G", "type": "crypt", "uuid": "a5689d99-306c-442d-a3cf-829fe8bbdb3c" }, "/dev/mapper/stratis-1-private-be5ead6798f74f9caa6563061a783f42-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-be5ead6798f74f9caa6563061a783f42-crypt", "size": "1024G", "type": "crypt", "uuid": "be5ead67-98f7-4f9c-aa65-63061a783f42" }, "/dev/mapper/stratis-1-private-cbad00481ebd4c17a3ddffd33c4ffc82-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-cbad00481ebd4c17a3ddffd33c4ffc82-crypt", "size": "10G", "type": "crypt", "uuid": "cbad0048-1ebd-4c17-a3dd-ffd33c4ffc82" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thindata", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thinmeta", "size": "799M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": 
"/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-physical-originsub", "size": "52.1G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-thinpool-pool", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-ffffc1d441ca4d419b1ee02181596943-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-ffffc1d441ca4d419b1ee02181596943-crypt", "size": "10G", "type": "crypt", "uuid": "ffffc1d4-41ca-4d41-9b1e-e02181596943" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1a0b5ad3-ff8d-4947-9db0-fbd984fbbca1" }, "/dev/sdb": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "6bee3886-5b46-44b7-a606-bc67166de960" }, "/dev/sdc": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "047f4590-d899-4094-bcbb-f6561599c064" }, "/dev/sdd": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "6952d7f7-0ef3-473e-9cde-767713ff9fef" }, "/dev/sde": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "3acbdd92-37cd-4fdf-a4f0-523634af3a07" }, "/dev/sdf": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "426fff1c-6616-4f62-b3da-c14ff29d7389" }, "/dev/sdg": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "c8a3a835-314f-4925-9984-f351d82cb397" }, "/dev/sdh": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "922638d9-f282-4e0f-9007-5a44b7eb643d" }, "/dev/sdi": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "511368db-3247-4b67-b67d-02797b098cf7" }, "/dev/stratis/foo/test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/stratis/foo/test1", "size": "4G", "type": "stratis", "uuid": "61a2e543-73f3-4af0-900d-c94e8346453c" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:35:09 -0400 (0:00:00.564) 0:05:10.575 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003160", "end": "2024-08-17 19:35:10.313712", "rc": 0, "start": "2024-08-17 19:35:10.310552" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # 
# Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=61a2e543-73f3-4af0-900d-c94e8346453c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:35:10 -0400 (0:00:00.579) 0:05:11.155 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003203", "end": "2024-08-17 19:35:10.822286", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:35:10.819083" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:35:10 -0400 (0:00:00.492) 0:05:11.647 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 
'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.308) 0:05:11.956 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.132) 0:05:12.089 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.200) 0:05:12.289 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.068) 0:05:12.357 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.152) 0:05:12.509 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.095) 0:05:12.605 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:35:11 -0400 (0:00:00.082) 0:05:12.688 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] 
************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.108) 0:05:12.797 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.129) 0:05:12.926 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.144) 0:05:13.071 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.096) 0:05:13.168 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.099) 0:05:13.268 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.097) 0:05:13.366 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:35:12 -0400 (0:00:00.073) 0:05:13.439 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:35:13 -0400 (0:00:00.763) 0:05:14.203 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:35:13 -0400 (0:00:00.236) 0:05:14.439 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.304) 0:05:14.744 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.127) 0:05:14.871 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.100) 0:05:14.971 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.103) 0:05:15.074 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.136) 0:05:15.211 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.156) 0:05:15.368 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.137) 0:05:15.505 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:35:14 -0400 (0:00:00.103) 0:05:15.609 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:35:15 -0400 (0:00:00.102) 0:05:15.711 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:35:15 -0400 (0:00:00.117) 0:05:15.829 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:35:15 -0400 (0:00:00.214) 0:05:16.043 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:35:15 -0400 (0:00:00.111) 0:05:16.154 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:35:15 -0400 (0:00:00.208) 0:05:16.363 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 
'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:35:15 -0400 (0:00:00.151) 0:05:16.514 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:35:16 -0400 (0:00:00.336) 0:05:16.854 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': 
None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:35:16 -0400 (0:00:00.166) 0:05:17.021 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:35:16 -0400 (0:00:00.252) 0:05:17.274 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:35:16 -0400 (0:00:00.170) 0:05:17.445 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:35:16 -0400 (0:00:00.161) 0:05:17.606 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No 
items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:35:17 -0400 (0:00:00.107) 0:05:17.715 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:35:17 -0400 (0:00:00.213) 0:05:17.929 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:35:17 -0400 (0:00:00.213) 0:05:18.142 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, 
"type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:35:17 -0400 (0:00:00.124) 0:05:18.266 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:35:17 -0400 (0:00:00.158) 0:05:18.425 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.375852", "end": "2024-08-17 19:35:18.448103", "rc": 0, "start": "2024-08-17 19:35:18.072251" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sda", "size": "20938752 sectors", "uuid": "a45b4fa7-e53a-400c-bf30-0a0ce81c18fa" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdb", "size": "20938752 sectors", "uuid": "ffffc1d4-41ca-4d41-9b1e-e02181596943" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdc", "size": "20938752 sectors", "uuid": "66e703db-b8f1-44a3-9bfe-322bb4d906fd" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdd", "size": "2147450880 sectors", "uuid": "be5ead67-98f7-4f9c-aa65-63061a783f42" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sde", "size": "2147450880 sectors", "uuid": "70f197c9-5f2a-4a33-968e-899c03f460ce" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdf", "size": "20938752 sectors", "uuid": "cbad0048-1ebd-4c17-a3dd-ffd33c4ffc82" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdg", "size": "2147450880 sectors", "uuid": "934229c5-b401-41f3-883a-0598eb32e234" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdh", "size": "20938752 sectors", "uuid": "687efc66-389c-4456-a3a2-71b8c56d0c78" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdi", "size": "20938752 
sectors", "uuid": "a5689d99-306c-442d-a3cf-829fe8bbdb3c" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "61a2e543-73f3-4af0-900d-c94e8346453c" } ], "fs_limit": 100, "name": "foo", "uuid": "e0de4b70-b7c2-4916-8e76-7728235a468a" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:35:18 -0400 (0:00:00.819) 0:05:19.245 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sda", "size": "20938752 sectors", "uuid": "a45b4fa7-e53a-400c-bf30-0a0ce81c18fa" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdb", "size": "20938752 sectors", "uuid": "ffffc1d4-41ca-4d41-9b1e-e02181596943" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdc", "size": "20938752 sectors", "uuid": "66e703db-b8f1-44a3-9bfe-322bb4d906fd" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdd", "size": "2147450880 sectors", "uuid": "be5ead67-98f7-4f9c-aa65-63061a783f42" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sde", "size": "2147450880 sectors", "uuid": "70f197c9-5f2a-4a33-968e-899c03f460ce" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdf", "size": "20938752 sectors", "uuid": "cbad0048-1ebd-4c17-a3dd-ffd33c4ffc82" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdg", "size": "2147450880 sectors", "uuid": "934229c5-b401-41f3-883a-0598eb32e234" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdh", "size": "20938752 sectors", "uuid": "687efc66-389c-4456-a3a2-71b8c56d0c78" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdi", "size": "20938752 sectors", "uuid": "a5689d99-306c-442d-a3cf-829fe8bbdb3c" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "61a2e543-73f3-4af0-900d-c94e8346453c" } ], "fs_limit": 100, "name": "foo", "uuid": "e0de4b70-b7c2-4916-8e76-7728235a468a" } ], 
"stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:35:18 -0400 (0:00:00.099) 0:05:19.344 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:35:18 -0400 (0:00:00.085) 0:05:19.430 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:35:18 -0400 (0:00:00.064) 0:05:19.494 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption_clevis_pin == 'tang'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:35:18 -0400 (0:00:00.071) 0:05:19.566 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:35:19 -0400 (0:00:00.182) 0:05:19.749 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:35:19 -0400 (0:00:00.160) 0:05:19.909 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', 
'_raw_kernel_device': '/dev/dm-14'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:35:19 -0400 (0:00:00.353) 0:05:20.262 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:35:19 -0400 (0:00:00.171) 0:05:20.434 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:35:20 -0400 (0:00:00.519) 0:05:20.953 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:35:20 -0400 (0:00:00.189) 0:05:21.143 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:35:20 -0400 (0:00:00.268) 0:05:21.411 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:35:20 -0400 (0:00:00.143) 0:05:21.555 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:35:21 -0400 (0:00:00.226) 0:05:21.781 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:35:21 -0400 (0:00:00.212) 0:05:21.994 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:35:21 -0400 (0:00:00.135) 0:05:22.132 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:35:21 -0400 (0:00:00.197) 0:05:22.330 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:35:21 -0400 (0:00:00.291) 0:05:22.621 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:35:22 -0400 (0:00:00.191) 0:05:22.813 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:35:22 -0400 (0:00:00.188) 0:05:23.001 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, 
"storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:35:22 -0400 (0:00:00.172) 0:05:23.174 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=61a2e543-73f3-4af0-900d-c94e8346453c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:35:22 -0400 (0:00:00.276) 0:05:23.451 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:35:22 -0400 (0:00:00.190) 0:05:23.642 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:35:23 -0400 (0:00:00.181) 0:05:23.823 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:35:23 -0400 (0:00:00.214) 0:05:24.038 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:35:23 -0400 (0:00:00.150) 0:05:24.188 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:35:23 -0400 (0:00:00.184) 0:05:24.372 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 
August 2024 19:35:23 -0400 (0:00:00.129) 0:05:24.502 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:35:23 -0400 (0:00:00.111) 0:05:24.613 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937700.7387369, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1723937700.7387369, "dev": 6, "device_type": 64782, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5086, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1723937700.7387369, "nlink": 1, "path": "/dev/stratis/foo/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:35:24 -0400 (0:00:00.509) 0:05:25.122 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:35:24 -0400 (0:00:00.247) 0:05:25.370 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:35:24 -0400 (0:00:00.098) 0:05:25.469 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:35:24 -0400 (0:00:00.109) 0:05:25.578 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:35:24 -0400 (0:00:00.106) 0:05:25.685 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:35:25 -0400 (0:00:00.099) 0:05:25.784 
******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:35:25 -0400 (0:00:00.107) 0:05:25.892 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:35:25 -0400 (0:00:00.100) 0:05:25.993 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:35:26 -0400 (0:00:01.532) 0:05:27.525 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:35:26 -0400 (0:00:00.172) 0:05:27.698 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:35:27 -0400 (0:00:00.132) 0:05:27.830 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:35:27 -0400 (0:00:00.169) 0:05:27.999 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:35:27 -0400 (0:00:00.104) 0:05:28.104 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:35:27 -0400 (0:00:00.103) 0:05:28.208 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] 
***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:35:27 -0400 (0:00:00.150) 0:05:28.359 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:35:27 -0400 (0:00:00.303) 0:05:28.663 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:35:28 -0400 (0:00:00.135) 0:05:28.798 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:35:28 -0400 (0:00:00.273) 0:05:29.072 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:35:28 -0400 (0:00:00.207) 0:05:29.279 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:35:28 -0400 (0:00:00.162) 0:05:29.442 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:35:28 -0400 (0:00:00.165) 0:05:29.607 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.220) 0:05:29.827 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about 
RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.144) 0:05:29.972 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.128) 0:05:30.100 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.105) 0:05:30.206 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.104) 0:05:30.310 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.109) 0:05:30.419 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:35:29 -0400 (0:00:00.103) 0:05:30.523 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:35:30 -0400 (0:00:00.235) 0:05:30.759 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:35:30 -0400 (0:00:00.148) 0:05:30.908 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:35:30 -0400 (0:00:00.134) 0:05:31.042 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:35:30 -0400 (0:00:00.169) 0:05:31.212 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:35:30 -0400 (0:00:00.118) 0:05:31.331 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.546) 0:05:31.877 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.092) 0:05:31.970 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.081) 0:05:32.051 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.068) 0:05:32.119 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.141) 0:05:32.261 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.157) 0:05:32.419 ******* skipping: [managed_node2] => { "false_condition": 
"storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:35:31 -0400 (0:00:00.142) 0:05:32.561 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.146) 0:05:32.708 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.362) 0:05:33.071 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.119) 0:05:33.191 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.105) 0:05:33.296 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.073) 0:05:33.370 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.073) 0:05:33.443 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.062) 0:05:33.506 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.055) 0:05:33.561 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.053) 0:05:33.615 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:35:32 -0400 (0:00:00.058) 0:05:33.673 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.055) 0:05:33.728 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.054) 0:05:33.783 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.056) 0:05:33.839 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.054) 0:05:33.893 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.053) 0:05:33.947 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.141) 0:05:34.088 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] 
******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.056) 0:05:34.144 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.060) 0:05:34.205 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.059) 0:05:34.264 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.076) 0:05:34.341 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.053) 0:05:34.395 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.055) 0:05:34.451 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.055) 0:05:34.506 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.054) 0:05:34.561 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional 
result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.055) 0:05:34.617 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:35:33 -0400 (0:00:00.058) 0:05:34.676 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:35:34 -0400 (0:00:00.092) 0:05:34.768 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:35:34 -0400 (0:00:00.108) 0:05:34.877 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:35:34 -0400 (0:00:00.208) 0:05:35.085 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat the previous invocation to verify idempotence] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:154 Saturday 17 August 2024 19:35:34 -0400 (0:00:00.113) 0:05:35.199 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:35:34 -0400 (0:00:00.204) 0:05:35.403 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:35:34 -0400 (0:00:00.234) 0:05:35.638 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:35:35 -0400 
(0:00:00.311) 0:05:35.949 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:35:35 -0400 (0:00:00.438) 0:05:36.388 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:35:35 -0400 (0:00:00.170) 0:05:36.558 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:35:36 -0400 (0:00:00.143) 0:05:36.702 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:35:36 -0400 (0:00:00.204) 0:05:36.907 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:35:36 -0400 (0:00:00.127) 0:05:37.034 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK 
[fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:35:36 -0400 (0:00:00.355) 0:05:37.390 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:35:36 -0400 (0:00:00.158) 0:05:37.549 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_password": "yabbadabbadoo", "name": "foo", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.163) 0:05:37.713 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.129) 0:05:37.842 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.099) 0:05:37.942 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.105) 0:05:38.048 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.162) 0:05:38.210 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 
19:35:37 -0400 (0:00:00.096) 0:05:38.306 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.169) 0:05:38.476 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:35:37 -0400 (0:00:00.044) 0:05:38.521 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/stratis/foo/test1", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "stratisd", "stratis-cli" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:35:43 -0400 (0:00:05.181) 0:05:43.703 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if 
/etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:35:43 -0400 (0:00:00.175) 0:05:43.878 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937704.7607646, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a281d3d99e3c3d4a308ce90e0082bd495d521e03", "ctime": 1723937704.7597647, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937704.7597647, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:35:43 -0400 (0:00:00.653) 0:05:44.532 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:35:44 -0400 (0:00:00.261) 0:05:44.793 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:35:44 -0400 (0:00:00.112) 0:05:44.906 ******* ok: [managed_node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/stratis/foo/test1", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" } ], "packages": [ "e2fsprogs", "xfsprogs", "stratisd", "stratis-cli" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:35:44 -0400 (0:00:00.166) 0:05:45.073 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:35:44 -0400 (0:00:00.172) 0:05:45.245 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:35:44 -0400 (0:00:00.172) 0:05:45.417 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:35:44 -0400 (0:00:00.277) 0:05:45.695 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:35:46 -0400 (0:00:01.154) 0:05:46.849 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed_node2] => (item={'src': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:35:46 -0400 (0:00:00.752) 0:05:47.601 ******* skipping: [managed_node2] => (item={'src': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:35:47 -0400 (0:00:00.258) 0:05:47.860 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:35:48 -0400 (0:00:01.029) 0:05:48.889 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:35:48 -0400 (0:00:00.516) 0:05:49.406 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:35:48 -0400 (0:00:00.080) 0:05:49.486 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:169 Saturday 17 August 2024 19:35:51 -0400 (0:00:02.607) 0:05:52.094 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:35:51 -0400 (0:00:00.194) 0:05:52.289 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:35:51 -0400 (0:00:00.153) 0:05:52.442 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:35:51 -0400 (0:00:00.152) 0:05:52.595 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-66e703dbb8f144a39bfe322bb4d906fd-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-66e703dbb8f144a39bfe322bb4d906fd-crypt", "size": "10G", "type": "crypt", "uuid": "66e703db-b8f1-44a3-9bfe-322bb4d906fd" }, "/dev/mapper/stratis-1-private-687efc66389c4456a3a271b8c56d0c78-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-687efc66389c4456a3a271b8c56d0c78-crypt", "size": "10G", "type": "crypt", "uuid": "687efc66-389c-4456-a3a2-71b8c56d0c78" }, "/dev/mapper/stratis-1-private-70f197c95f2a4a33968e899c03f460ce-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-70f197c95f2a4a33968e899c03f460ce-crypt", "size": "1024G", "type": "crypt", "uuid": "70f197c9-5f2a-4a33-968e-899c03f460ce" }, "/dev/mapper/stratis-1-private-934229c5b40141f3883a0598eb32e234-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-934229c5b40141f3883a0598eb32e234-crypt", "size": "1024G", "type": "crypt", "uuid": "934229c5-b401-41f3-883a-0598eb32e234" }, "/dev/mapper/stratis-1-private-a45b4fa7e53a400cbf300a0ce81c18fa-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-a45b4fa7e53a400cbf300a0ce81c18fa-crypt", "size": "10G", "type": "crypt", "uuid": "a45b4fa7-e53a-400c-bf30-0a0ce81c18fa" }, "/dev/mapper/stratis-1-private-a5689d99306c442da3cf829fe8bbdb3c-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-a5689d99306c442da3cf829fe8bbdb3c-crypt", "size": "10G", "type": "crypt", "uuid": "a5689d99-306c-442d-a3cf-829fe8bbdb3c" }, "/dev/mapper/stratis-1-private-be5ead6798f74f9caa6563061a783f42-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-be5ead6798f74f9caa6563061a783f42-crypt", "size": "1024G", "type": "crypt", "uuid": "be5ead67-98f7-4f9c-aa65-63061a783f42" }, "/dev/mapper/stratis-1-private-cbad00481ebd4c17a3ddffd33c4ffc82-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-cbad00481ebd4c17a3ddffd33c4ffc82-crypt", "size": "10G", "type": "crypt", "uuid": "cbad0048-1ebd-4c17-a3dd-ffd33c4ffc82" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": 
"/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thindata", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-flex-thinmeta", "size": "799M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-physical-originsub", "size": "52.1G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-e0de4b70b7c249168e767728235a468a-thinpool-pool", "size": "50G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-ffffc1d441ca4d419b1ee02181596943-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-ffffc1d441ca4d419b1ee02181596943-crypt", "size": "10G", "type": "crypt", "uuid": "ffffc1d4-41ca-4d41-9b1e-e02181596943" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1a0b5ad3-ff8d-4947-9db0-fbd984fbbca1" }, "/dev/sdb": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "6bee3886-5b46-44b7-a606-bc67166de960" }, "/dev/sdc": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "047f4590-d899-4094-bcbb-f6561599c064" }, "/dev/sdd": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "6952d7f7-0ef3-473e-9cde-767713ff9fef" }, "/dev/sde": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "3acbdd92-37cd-4fdf-a4f0-523634af3a07" }, "/dev/sdf": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "426fff1c-6616-4f62-b3da-c14ff29d7389" }, "/dev/sdg": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "c8a3a835-314f-4925-9984-f351d82cb397" }, "/dev/sdh": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "922638d9-f282-4e0f-9007-5a44b7eb643d" }, "/dev/sdi": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "511368db-3247-4b67-b67d-02797b098cf7" }, "/dev/stratis/foo/test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/stratis/foo/test1", "size": "4G", "type": "stratis", "uuid": "61a2e543-73f3-4af0-900d-c94e8346453c" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": 
"/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:35:52 -0400 (0:00:00.522) 0:05:53.117 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003124", "end": "2024-08-17 19:35:52.798228", "rc": 0, "start": "2024-08-17 19:35:52.795104" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=61a2e543-73f3-4af0-900d-c94e8346453c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:35:52 -0400 (0:00:00.504) 0:05:53.622 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003152", "end": "2024-08-17 19:35:53.324494", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:35:53.321342" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:35:53 -0400 (0:00:00.529) 0:05:54.151 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 
'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:35:53 -0400 (0:00:00.241) 0:05:54.393 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:35:53 -0400 (0:00:00.108) 0:05:54.502 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:35:53 -0400 (0:00:00.096) 0:05:54.598 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.259) 0:05:54.857 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.233) 0:05:55.091 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was 
False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.140) 0:05:55.232 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.118) 0:05:55.350 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.123) 0:05:55.473 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.113) 0:05:55.587 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:35:54 -0400 (0:00:00.102) 0:05:55.689 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:35:55 -0400 (0:00:00.100) 0:05:55.790 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:35:55 -0400 (0:00:00.115) 0:05:55.906 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:35:55 -0400 (0:00:00.118) 0:05:56.024 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:35:55 -0400 (0:00:00.110) 0:05:56.134 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:35:56 -0400 (0:00:00.589) 0:05:56.724 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:35:56 -0400 (0:00:00.181) 0:05:56.906 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:35:56 -0400 (0:00:00.504) 0:05:57.410 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:35:56 -0400 (0:00:00.110) 0:05:57.520 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:35:56 -0400 (0:00:00.117) 0:05:57.637 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.104) 0:05:57.742 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.098) 0:05:57.841 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.137) 0:05:57.978 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.210) 0:05:58.188 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.146) 0:05:58.334 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.109) 0:05:58.444 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.119) 0:05:58.564 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:35:57 -0400 (0:00:00.100) 0:05:58.665 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:35:58 -0400 (0:00:00.101) 0:05:58.766 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:35:58 -0400 (0:00:00.271) 0:05:59.038 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:35:58 -0400 (0:00:00.116) 0:05:59.154 ******* 
included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:35:58 -0400 (0:00:00.285) 0:05:59.440 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:35:58 -0400 (0:00:00.197) 0:05:59.637 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] 
****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:35:59 -0400 (0:00:00.267) 0:05:59.904 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:35:59 -0400 (0:00:00.154) 0:06:00.058 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:35:59 -0400 (0:00:00.076) 0:06:00.135 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:35:59 -0400 (0:00:00.082) 0:06:00.218 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:35:59 -0400 (0:00:00.120) 0:06:00.338 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:36:00 -0400 (0:00:00.372) 0:06:00.711 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) => { "ansible_loop_var": "storage_test_vdo_volume", 
"changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_kernel_device": "/dev/dm-14", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "_raw_kernel_device": "/dev/dm-14", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:36:00 -0400 (0:00:00.136) 0:06:00.848 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:36:00 -0400 (0:00:00.245) 0:06:01.093 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.377006", "end": "2024-08-17 19:36:01.164712", "rc": 0, "start": "2024-08-17 19:36:00.787706" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sda", "size": "20938752 sectors", "uuid": "a45b4fa7-e53a-400c-bf30-0a0ce81c18fa" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdb", "size": "20938752 sectors", "uuid": "ffffc1d4-41ca-4d41-9b1e-e02181596943" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdc", "size": "20938752 sectors", "uuid": "66e703db-b8f1-44a3-9bfe-322bb4d906fd" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdd", "size": "2147450880 sectors", "uuid": "be5ead67-98f7-4f9c-aa65-63061a783f42" }, { "blksizes": 
"base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sde", "size": "2147450880 sectors", "uuid": "70f197c9-5f2a-4a33-968e-899c03f460ce" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdf", "size": "20938752 sectors", "uuid": "cbad0048-1ebd-4c17-a3dd-ffd33c4ffc82" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdg", "size": "2147450880 sectors", "uuid": "934229c5-b401-41f3-883a-0598eb32e234" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdh", "size": "20938752 sectors", "uuid": "687efc66-389c-4456-a3a2-71b8c56d0c78" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdi", "size": "20938752 sectors", "uuid": "a5689d99-306c-442d-a3cf-829fe8bbdb3c" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "61a2e543-73f3-4af0-900d-c94e8346453c" } ], "fs_limit": 100, "name": "foo", "uuid": "e0de4b70-b7c2-4916-8e76-7728235a468a" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:36:01 -0400 (0:00:00.901) 0:06:01.995 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sda", "size": "20938752 sectors", "uuid": "a45b4fa7-e53a-400c-bf30-0a0ce81c18fa" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdb", "size": "20938752 sectors", "uuid": "ffffc1d4-41ca-4d41-9b1e-e02181596943" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdc", "size": "20938752 sectors", "uuid": "66e703db-b8f1-44a3-9bfe-322bb4d906fd" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sdd", "size": "2147450880 sectors", "uuid": "be5ead67-98f7-4f9c-aa65-63061a783f42" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sde", "size": "2147450880 sectors", "uuid": "70f197c9-5f2a-4a33-968e-899c03f460ce" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, 
crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdf", "size": "20938752 sectors", "uuid": "cbad0048-1ebd-4c17-a3dd-ffd33c4ffc82" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdg", "size": "2147450880 sectors", "uuid": "934229c5-b401-41f3-883a-0598eb32e234" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdh", "size": "20938752 sectors", "uuid": "687efc66-389c-4456-a3a2-71b8c56d0c78" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "in_use": false, "key_description": "blivet-foo", "path": "/dev/sdi", "size": "20938752 sectors", "uuid": "a5689d99-306c-442d-a3cf-829fe8bbdb3c" } ] }, "filesystems": [ { "name": "test1", "size": "8388608 sectors", "size_limit": "Not set", "used": "72351744 bytes", "uuid": "61a2e543-73f3-4af0-900d-c94e8346453c" } ], "fs_limit": 100, "name": "foo", "uuid": "e0de4b70-b7c2-4916-8e76-7728235a468a" } ], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:36:01 -0400 (0:00:00.147) 0:06:02.142 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:36:01 -0400 (0:00:00.147) 0:06:02.290 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:36:01 -0400 (0:00:00.103) 0:06:02.393 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption_clevis_pin == 'tang'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:36:01 -0400 (0:00:00.112) 0:06:02.506 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:36:01 -0400 (0:00:00.136) 0:06:02.643 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 
19:36:02 -0400 (0:00:00.102) 0:06:02.745 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', '_kernel_device': '/dev/dm-14', '_raw_kernel_device': '/dev/dm-14'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:36:02 -0400 (0:00:00.297) 0:06:03.043 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:36:02 -0400 (0:00:00.127) 0:06:03.170 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 
August 2024 19:36:02 -0400 (0:00:00.266) 0:06:03.437 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:36:02 -0400 (0:00:00.072) 0:06:03.509 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.264) 0:06:03.774 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.165) 0:06:03.940 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.131) 0:06:04.071 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.106) 0:06:04.178 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.100) 0:06:04.279 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.099) 0:06:04.378 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task 
path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.096) 0:06:04.475 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:36:03 -0400 (0:00:00.091) 0:06:04.567 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:36:04 -0400 (0:00:00.188) 0:06:04.756 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:36:04 -0400 (0:00:00.129) 0:06:04.886 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=61a2e543-73f3-4af0-900d-c94e8346453c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:36:04 -0400 (0:00:00.200) 0:06:05.086 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:36:04 -0400 (0:00:00.143) 0:06:05.230 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:36:04 -0400 (0:00:00.210) 0:06:05.440 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:36:04 -0400 (0:00:00.200) 0:06:05.641 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up 
variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:36:05 -0400 (0:00:00.154) 0:06:05.796 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:36:05 -0400 (0:00:00.111) 0:06:05.907 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:36:05 -0400 (0:00:00.096) 0:06:06.003 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:36:05 -0400 (0:00:00.094) 0:06:06.098 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937700.7387369, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1723937700.7387369, "dev": 6, "device_type": 64782, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 5086, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1723937700.7387369, "nlink": 1, "path": "/dev/stratis/foo/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:36:05 -0400 (0:00:00.502) 0:06:06.600 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.107) 0:06:06.707 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.122) 0:06:06.830 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.209) 0:06:07.040 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.110) 0:06:07.150 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.082) 0:06:07.232 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.072) 0:06:07.305 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:36:06 -0400 (0:00:00.064) 0:06:07.369 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:36:08 -0400 (0:00:01.469) 0:06:08.839 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.162) 0:06:09.001 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.098) 0:06:09.100 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if 
encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.148) 0:06:09.248 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.098) 0:06:09.347 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.099) 0:06:09.447 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.098) 0:06:09.546 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:36:08 -0400 (0:00:00.099) 0:06:09.645 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:36:09 -0400 (0:00:00.180) 0:06:09.825 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:36:09 -0400 (0:00:00.147) 0:06:09.973 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:36:09 -0400 (0:00:00.203) 0:06:10.176 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task 
path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:36:09 -0400 (0:00:00.134) 0:06:10.311 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:36:09 -0400 (0:00:00.135) 0:06:10.447 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:36:09 -0400 (0:00:00.142) 0:06:10.590 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.156) 0:06:10.746 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.151) 0:06:10.898 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.143) 0:06:11.041 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.149) 0:06:11.191 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.175) 0:06:11.366 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.159) 0:06:11.526 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:36:10 -0400 (0:00:00.101) 0:06:11.628 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:36:11 -0400 (0:00:00.097) 0:06:11.725 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:36:11 -0400 (0:00:00.249) 0:06:11.975 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:36:11 -0400 (0:00:00.199) 0:06:12.174 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:36:11 -0400 (0:00:00.197) 0:06:12.371 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:36:12 -0400 (0:00:00.590) 0:06:12.962 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:36:12 -0400 (0:00:00.139) 0:06:13.102 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:36:12 -0400 
(0:00:00.131) 0:06:13.233 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:36:12 -0400 (0:00:00.105) 0:06:13.338 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:36:12 -0400 (0:00:00.163) 0:06:13.502 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:36:12 -0400 (0:00:00.130) 0:06:13.633 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:36:13 -0400 (0:00:00.136) 0:06:13.769 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:36:13 -0400 (0:00:00.134) 0:06:13.904 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:36:13 -0400 (0:00:00.137) 0:06:14.042 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:36:13 -0400 (0:00:00.098) 0:06:14.141 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:36:13 -0400 (0:00:00.274) 0:06:14.415 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:36:13 -0400 (0:00:00.161) 0:06:14.577 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.183) 0:06:14.761 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.126) 0:06:14.887 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.110) 0:06:14.997 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.100) 0:06:15.098 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.103) 0:06:15.202 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.108) 0:06:15.310 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.124) 0:06:15.435 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:36:14 -0400 (0:00:00.142) 0:06:15.578 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin 
pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.183) 0:06:15.761 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.161) 0:06:15.922 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.145) 0:06:16.067 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.199) 0:06:16.267 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.108) 0:06:16.376 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.107) 0:06:16.483 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:36:15 -0400 (0:00:00.140) 0:06:16.624 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.112) 0:06:16.737 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.145) 0:06:16.882 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.099) 0:06:16.982 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.098) 0:06:17.080 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.099) 0:06:17.180 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.105) 0:06:17.286 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.109) 0:06:17.396 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:36:16 -0400 (0:00:00.153) 0:06:17.549 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:36:17 -0400 (0:00:00.157) 0:06:17.707 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:172 Saturday 17 August 2024 19:36:17 -0400 (0:00:00.183) 0:06:17.891 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:36:17 -0400 (0:00:00.362) 0:06:18.254 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:36:17 -0400 (0:00:00.089) 0:06:18.344 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:36:17 -0400 (0:00:00.092) 0:06:18.436 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:36:17 -0400 (0:00:00.209) 0:06:18.646 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.107) 0:06:18.754 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result 
was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.105) 0:06:18.859 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.134) 0:06:18.993 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.127) 0:06:19.120 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.201) 0:06:19.322 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.138) 0:06:19.460 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "name": "foo", "state": "absent", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g", "state": "absent" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.068) 0:06:19.529 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.059) 0:06:19.589 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:36:18 -0400 (0:00:00.061) 0:06:19.650 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not 
defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:36:19 -0400 (0:00:00.057) 0:06:19.707 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:36:19 -0400 (0:00:00.056) 0:06:19.764 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:36:19 -0400 (0:00:00.056) 0:06:19.820 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:36:19 -0400 (0:00:00.101) 0:06:19.922 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:36:19 -0400 (0:00:00.043) 0:06:19.965 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" }, { "action": "destroy device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdh", "fs_type": "stratis" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, 
"encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:36:31 -0400 (0:00:12.439) 0:06:32.405 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:36:31 -0400 (0:00:00.177) 0:06:32.582 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937704.7607646, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a281d3d99e3c3d4a308ce90e0082bd495d521e03", "ctime": 1723937704.7597647, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937704.7597647, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1436, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:36:32 -0400 (0:00:00.538) 0:06:33.121 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK 
[fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:36:32 -0400 (0:00:00.486) 0:06:33.607 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:36:33 -0400 (0:00:00.118) 0:06:33.726 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/stratis/foo/test1", "fs_type": "stratis xfs" }, { "action": "destroy device", "device": "/dev/stratis/foo/test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sde", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdg", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdd", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdi", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdf", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sdh", "fs_type": "stratis" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, 
"size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:36:33 -0400 (0:00:00.115) 0:06:33.841 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:36:33 -0400 (0:00:00.127) 0:06:33.969 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:36:33 -0400 (0:00:00.109) 0:06:34.078 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed_node2] => (item={'src': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:36:33 -0400 (0:00:00.482) 0:06:34.561 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:36:34 -0400 (0:00:00.844) 0:06:35.405 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:36:34 -0400 (0:00:00.074) 0:06:35.479 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:36:34 -0400 (0:00:00.075) 0:06:35.555 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:36:35 -0400 (0:00:00.852) 0:06:36.408 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:36:36 -0400 (0:00:00.459) 0:06:36.867 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:36:36 -0400 (0:00:00.046) 0:06:36.914 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:187 Saturday 17 August 2024 19:36:38 -0400 (0:00:02.502) 0:06:39.417 ******* included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:36:38 -0400 (0:00:00.186) 0:06:39.603 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:36:38 -0400 (0:00:00.087) 0:06:39.691 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:36:39 -0400 (0:00:00.076) 0:06:39.768 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:36:39 -0400 (0:00:00.426) 0:06:40.194 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004204", "end": "2024-08-17 19:36:40.829885", "rc": 0, "start": "2024-08-17 19:36:39.825681" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:36:40 -0400 (0:00:01.418) 0:06:41.612 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003078", "end": "2024-08-17 19:36:41.251992", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:36:41.248914" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:36:41 -0400 (0:00:00.421) 0:06:42.034 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'absent', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 
'UUID=61a2e543-73f3-4af0-900d-c94e8346453c'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:36:41 -0400 (0:00:00.126) 0:06:42.160 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:36:41 -0400 (0:00:00.091) 0:06:42.251 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:36:41 -0400 (0:00:00.082) 0:06:42.333 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:36:41 -0400 (0:00:00.094) 0:06:42.428 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:36:41 -0400 (0:00:00.202) 0:06:42.630 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.182) 0:06:42.813 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.085) 0:06:42.899 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.094) 0:06:42.993 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", 
"skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.094) 0:06:43.088 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.093) 0:06:43.181 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.128) 0:06:43.310 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.112) 0:06:43.423 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.095) 0:06:43.519 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:36:42 -0400 (0:00:00.075) 0:06:43.594 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.471) 0:06:44.066 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.070) 0:06:44.136 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.102) 0:06:44.239 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.052) 0:06:44.291 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.053) 0:06:44.344 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.138) 0:06:44.483 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.053) 0:06:44.537 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.052) 0:06:44.590 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID 
active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.054) 0:06:44.644 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:36:43 -0400 (0:00:00.053) 0:06:44.698 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.053) 0:06:44.752 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.052) 0:06:44.804 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.054) 0:06:44.859 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.056) 0:06:44.916 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.101) 0:06:45.018 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': 
None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.064) 0:06:45.082 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.101) 0:06:45.184 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 
'_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.163) 0:06:45.348 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:36:44 -0400 (0:00:00.270) 0:06:45.618 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.112) 0:06:45.731 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.052) 0:06:45.784 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.061) 0:06:45.845 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, 
"changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.090) 0:06:45.935 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.199) 0:06:46.135 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/stratis/foo/test1", "_mount_id": "UUID=61a2e543-73f3-4af0-900d-c94e8346453c", "_raw_device": "/dev/stratis/foo/test1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.134) 0:06:46.269 ******* included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:36:45 -0400 (0:00:00.192) 0:06:46.462 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.373490", "end": "2024-08-17 19:36:46.530922", "rc": 0, "start": "2024-08-17 19:36:46.157432" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:36:46 -0400 (0:00:00.929) 0:06:47.392 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:36:46 -0400 (0:00:00.120) 0:06:47.513 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:36:46 -0400 (0:00:00.068) 0:06:47.581 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.158) 0:06:47.740 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.055) 0:06:47.796 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.056) 0:06:47.853 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.055) 0:06:47.908 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/stratis/foo/test1', '_raw_device': '/dev/stratis/foo/test1', '_mount_id': 'UUID=61a2e543-73f3-4af0-900d-c94e8346453c'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.098) 0:06:48.007 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.125) 0:06:48.133 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.443) 0:06:48.577 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/stratis/foo/test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:36:47 -0400 (0:00:00.099) 0:06:48.677 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.202) 0:06:48.879 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.138) 0:06:49.018 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.081) 0:06:49.099 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.150) 0:06:49.250 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.061) 0:06:49.312 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.089) 0:06:49.402 ******* skipping: [managed_node2] => { "changed": 
false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.099) 0:06:49.502 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.102) 0:06:49.604 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:36:48 -0400 (0:00:00.064) 0:06:49.668 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.076) 0:06:49.745 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.152) 0:06:49.898 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.099) 0:06:49.997 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.102) 0:06:50.099 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.075) 0:06:50.175 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.059) 0:06:50.234 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.056) 0:06:50.291 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.079) 0:06:50.370 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:36:49 -0400 (0:00:00.083) 0:06:50.454 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.570) 0:06:51.025 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present or storage_test_volume.type == 'disk'", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.095) 0:06:51.120 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.071) 0:06:51.191 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.077) 0:06:51.268 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.076) 0:06:51.344 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.096) 0:06:51.440 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.090) 0:06:51.531 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:36:50 -0400 (0:00:00.099) 0:06:51.631 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:36:52 -0400 (0:00:01.497) 0:06:53.128 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:36:52 -0400 (0:00:00.093) 0:06:53.221 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:36:52 -0400 (0:00:00.091) 0:06:53.313 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:36:52 -0400 
(0:00:00.098) 0:06:53.411 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:36:52 -0400 (0:00:00.094) 0:06:53.506 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:36:52 -0400 (0:00:00.115) 0:06:53.621 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:36:53 -0400 (0:00:00.104) 0:06:53.726 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:36:53 -0400 (0:00:00.243) 0:06:53.970 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:36:53 -0400 (0:00:00.158) 0:06:54.128 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:36:53 -0400 (0:00:00.190) 0:06:54.319 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:36:53 -0400 (0:00:00.166) 0:06:54.485 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:36:53 -0400 (0:00:00.112) 0:06:54.598 ******* 
skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.143) 0:06:54.741 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.133) 0:06:54.875 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.066) 0:06:54.942 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.056) 0:06:54.999 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.053) 0:06:55.053 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.059) 0:06:55.113 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.114) 0:06:55.227 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.079) 0:06:55.307 ******* skipping: 
[managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.091) 0:06:55.398 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.172) 0:06:55.571 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.056) 0:06:55.627 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:36:54 -0400 (0:00:00.054) 0:06:55.682 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.053) 0:06:55.735 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.079) 0:06:55.815 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.115) 0:06:55.931 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.094) 0:06:56.025 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is 
undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.071) 0:06:56.096 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.082) 0:06:56.178 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.162) 0:06:56.341 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:36:55 -0400 (0:00:00.221) 0:06:56.562 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:36:56 -0400 (0:00:00.176) 0:06:56.739 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:36:56 -0400 (0:00:00.176) 0:06:56.916 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:36:56 -0400 (0:00:00.174) 0:06:57.090 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:36:56 -0400 (0:00:00.190) 0:06:57.280 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:36:56 -0400 (0:00:00.348) 0:06:57.629 ******* skipping: [managed_node2] => { 
"changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.112) 0:06:57.742 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.102) 0:06:57.845 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.095) 0:06:57.940 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.128) 0:06:58.068 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.122) 0:06:58.191 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.175) 0:06:58.366 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.181) 0:06:58.548 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:36:57 -0400 (0:00:00.109) 0:06:58.658 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.075) 
0:06:58.734 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.063) 0:06:58.797 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.054) 0:06:58.851 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.054) 0:06:58.906 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.077) 0:06:58.983 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.111) 0:06:59.094 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.229) 0:06:59.324 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.100) 0:06:59.424 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.100) 0:06:59.525 ******* skipping: [managed_node2] 
=> { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:36:58 -0400 (0:00:00.111) 0:06:59.636 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.118) 0:06:59.754 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.101) 0:06:59.856 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.102) 0:06:59.958 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.129) 0:07:00.087 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.138) 0:07:00.226 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.103) 0:07:00.330 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create one Stratis pool on one disk] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:190 Saturday 17 August 2024 19:36:59 -0400 (0:00:00.125) 0:07:00.455 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : 
Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:37:00 -0400 (0:00:00.328) 0:07:00.783 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:37:00 -0400 (0:00:00.389) 0:07:01.173 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:37:00 -0400 (0:00:00.229) 0:07:01.402 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:37:00 -0400 (0:00:00.298) 0:07:01.700 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:37:01 -0400 (0:00:00.179) 0:07:01.880 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:37:01 -0400 (0:00:00.142) 
0:07:02.023 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:37:01 -0400 (0:00:00.128) 0:07:02.152 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:37:01 -0400 (0:00:00.161) 0:07:02.314 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:37:01 -0400 (0:00:00.272) 0:07:02.586 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.138) 0:07:02.725 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": "sda", "name": "foo", "type": "stratis" } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.149) 0:07:02.875 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.166) 0:07:03.041 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.140) 0:07:03.182 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.122) 0:07:03.304 ******* skipping: [managed_node2] => { "changed": false, 
"false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.111) 0:07:03.415 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:37:02 -0400 (0:00:00.214) 0:07:03.630 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:37:03 -0400 (0:00:00.185) 0:07:03.815 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:37:03 -0400 (0:00:00.082) 0:07:03.897 ******* changed: [managed_node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo" ], "mounts": [], "packages": [ "stratisd", "stratis-cli", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:37:06 -0400 (0:00:03.307) 0:07:07.205 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:37:06 -0400 (0:00:00.243) 0:07:07.448 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937793.7743778, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, 
"charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937793.773378, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937793.773378, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:37:07 -0400 (0:00:00.561) 0:07:08.010 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:37:07 -0400 (0:00:00.447) 0:07:08.458 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:37:07 -0400 (0:00:00.060) 0:07:08.519 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo" ], "mounts": [], "packages": [ "stratisd", "stratis-cli", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:37:07 -0400 (0:00:00.064) 0:07:08.584 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": 
false, "state": "present", "type": "stratis", "volumes": [] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:37:07 -0400 (0:00:00.059) 0:07:08.643 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.059) 0:07:08.703 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.067) 0:07:08.770 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.056) 0:07:08.826 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.069) 0:07:08.895 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.133) 0:07:09.029 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.053) 0:07:09.082 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": 
false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.521) 0:07:09.603 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:37:08 -0400 (0:00:00.087) 0:07:09.691 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:199 Saturday 17 August 2024 19:37:11 -0400 (0:00:02.512) 0:07:12.204 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:37:11 -0400 (0:00:00.206) 0:07:12.410 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:37:11 -0400 (0:00:00.172) 0:07:12.582 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:37:12 -0400 (0:00:00.143) 0:07:12.726 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thindata", "size": "9.5G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thinmeta", "size": "6M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-physical-originsub", "size": "10G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-thinpool-pool", "size": "9.5G", "type": "stratis", "uuid": "" }, "/dev/sda": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "bf555001-88d9-4083-8c2f-ac44cd551693" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 
2024 19:37:12 -0400 (0:00:00.587) 0:07:13.313 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003167", "end": "2024-08-17 19:37:12.996269", "rc": 0, "start": "2024-08-17 19:37:12.993102" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:37:13 -0400 (0:00:00.536) 0:07:13.850 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003174", "end": "2024-08-17 19:37:13.591538", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:37:13.588364" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:37:13 -0400 (0:00:00.590) 0:07:14.440 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': []}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.346) 0:07:14.786 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.154) 0:07:14.941 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.145) 0:07:15.087 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.140) 0:07:15.227 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.222) 0:07:15.449 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.099) 0:07:15.548 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:37:14 -0400 (0:00:00.080) 0:07:15.629 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.093) 0:07:15.723 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.092) 0:07:15.815 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] 
**************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.095) 0:07:15.910 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.099) 0:07:16.010 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.098) 0:07:16.109 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.200) 0:07:16.309 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:37:15 -0400 (0:00:00.266) 0:07:16.575 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.504) 0:07:17.080 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.084) 0:07:17.164 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.142) 0:07:17.306 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.056) 0:07:17.363 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.055) 0:07:17.418 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.053) 0:07:17.472 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.052) 0:07:17.524 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.053) 0:07:17.578 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:37:16 -0400 
(0:00:00.053) 0:07:17.632 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:37:16 -0400 (0:00:00.053) 0:07:17.685 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.052) 0:07:17.737 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.053) 0:07:17.791 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.117) 0:07:17.909 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.058) 0:07:17.967 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.109) 0:07:18.076 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.042) 0:07:18.119 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.104) 0:07:18.223 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check 
member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.042) 0:07:18.266 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.158) 0:07:18.424 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.104) 0:07:18.529 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.052) 0:07:18.581 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.047) 0:07:18.629 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:37:17 -0400 (0:00:00.057) 0:07:18.686 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:37:18 -0400 (0:00:00.219) 0:07:18.905 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:37:18 -0400 (0:00:00.080) 0:07:18.986 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:37:18 -0400 (0:00:00.225) 0:07:19.212 ******* ok: [managed_node2] => { "changed": false, "cmd": [ 
"stratis", "report" ], "delta": "0:00:00.374318", "end": "2024-08-17 19:37:19.265253", "rc": 0, "start": "2024-08-17 19:37:18.890935" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "bf555001-88d9-4083-8c2f-ac44cd551693" } ] }, "filesystems": [], "fs_limit": 100, "name": "foo", "uuid": "b0ea45b8-c188-46a0-bdd7-55f2e2cfd03b" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.877) 0:07:20.089 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "bf555001-88d9-4083-8c2f-ac44cd551693" } ] }, "filesystems": [], "fs_limit": 100, "name": "foo", "uuid": "b0ea45b8-c188-46a0-bdd7-55f2e2cfd03b" } ], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.120) 0:07:20.209 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.105) 0:07:20.315 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.059) 0:07:20.375 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.056) 0:07:20.432 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.086) 0:07:20.518 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.075) 0:07:20.593 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:37:19 -0400 (0:00:00.079) 0:07:20.673 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.077) 0:07:20.751 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Add the second disk to the pool] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:202 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.212) 0:07:20.963 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.176) 0:07:21.140 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.086) 0:07:21.227 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.128) 0:07:21.355 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.253) 0:07:21.609 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:37:20 -0400 (0:00:00.066) 0:07:21.675 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.068) 0:07:21.744 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.057) 0:07:21.801 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.057) 0:07:21.859 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.119) 0:07:21.979 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.193) 0:07:22.172 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb" ], "name": "foo", "type": "stratis" } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** 
task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.104) 0:07:22.277 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.155) 0:07:22.432 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:37:21 -0400 (0:00:00.214) 0:07:22.646 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:37:22 -0400 (0:00:00.164) 0:07:22.810 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:37:22 -0400 (0:00:00.120) 0:07:22.931 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:37:22 -0400 (0:00:00.114) 0:07:23.045 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:37:22 -0400 (0:00:00.188) 0:07:23.233 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:37:22 -0400 (0:00:00.086) 0:07:23.320 ******* changed: [managed_node2] => { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "add container member", "device": "/dev/sdb", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/stratis/foo", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", 
"/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "stratis-cli", "stratisd", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:37:25 -0400 (0:00:03.151) 0:07:26.471 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:37:25 -0400 (0:00:00.075) 0:07:26.547 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937793.7743778, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937793.773378, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937793.773378, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:37:26 -0400 (0:00:00.458) 0:07:27.005 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:37:26 -0400 (0:00:00.419) 0:07:27.425 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:37:26 -0400 (0:00:00.043) 0:07:27.468 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "add container member", "device": "/dev/sdb", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/stratis/foo", 
"/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "stratis-cli", "stratisd", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:37:26 -0400 (0:00:00.133) 0:07:27.601 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:37:26 -0400 (0:00:00.060) 0:07:27.662 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.060) 0:07:27.723 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.071) 0:07:27.795 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.055) 0:07:27.851 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.071) 0:07:27.923 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the 
list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.076) 0:07:27.999 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.085) 0:07:28.084 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.458) 0:07:28.542 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:37:27 -0400 (0:00:00.080) 0:07:28.623 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:211 Saturday 17 August 2024 19:37:30 -0400 (0:00:02.578) 0:07:31.201 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:37:30 -0400 (0:00:00.144) 0:07:31.345 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ] } TASK [Print out volume 
information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:37:30 -0400 (0:00:00.254) 0:07:31.600 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:37:31 -0400 (0:00:00.195) 0:07:31.796 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thindata", "size": "9.5G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-flex-thinmeta", "size": "6M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-physical-originsub", "size": "10G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-b0ea45b8c18846a0bdd755f2e2cfd03b-thinpool-pool", "size": "9.5G", "type": "stratis", "uuid": "" }, "/dev/sda": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "bf555001-88d9-4083-8c2f-ac44cd551693" }, "/dev/sdb": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "474920a2-bdbd-4a8f-8c05-c4df35dd6c9b" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": 
"partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:37:31 -0400 (0:00:00.589) 0:07:32.386 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003274", "end": "2024-08-17 19:37:32.087239", "rc": 0, "start": "2024-08-17 19:37:32.083965" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:37:32 -0400 (0:00:00.520) 0:07:32.906 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003044", "end": "2024-08-17 19:37:32.577881", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:37:32.574837" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:37:32 -0400 (0:00:00.563) 0:07:33.470 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': []}) TASK [Set _storage_pool_tests] ************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.269) 0:07:33.739 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.105) 0:07:33.845 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.113) 0:07:33.959 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.168) 0:07:34.128 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.233) 0:07:34.361 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.105) 0:07:34.467 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.041) 0:07:34.508 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.114) 0:07:34.623 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:37:33 -0400 (0:00:00.055) 0:07:34.678 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.053) 0:07:34.732 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.053) 0:07:34.785 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.054) 0:07:34.840 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.055) 0:07:34.895 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.042) 0:07:34.938 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.486) 0:07:35.424 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:37:34 -0400 (0:00:00.191) 0:07:35.616 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.270) 0:07:35.887 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.096) 0:07:35.984 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.096) 0:07:36.081 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.102) 0:07:36.183 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.199) 0:07:36.383 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.097) 0:07:36.480 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:37:35 -0400 (0:00:00.098) 0:07:36.579 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.174) 0:07:36.753 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.111) 0:07:36.864 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.163) 0:07:37.028 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.136) 0:07:37.164 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.140) 0:07:37.304 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.209) 0:07:37.514 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:37:36 -0400 (0:00:00.077) 0:07:37.591 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:37:37 
-0400 (0:00:00.300) 0:07:37.892 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:37:37 -0400 (0:00:00.086) 0:07:37.978 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:37:37 -0400 (0:00:00.356) 0:07:38.335 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:37:37 -0400 (0:00:00.169) 0:07:38.504 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:37:37 -0400 (0:00:00.075) 0:07:38.580 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:37:37 -0400 (0:00:00.086) 0:07:38.666 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:37:38 -0400 (0:00:00.167) 0:07:38.834 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:37:38 -0400 (0:00:00.290) 0:07:39.124 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:37:38 -0400 (0:00:00.083) 0:07:39.208 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:37:38 -0400 (0:00:00.258) 0:07:39.466 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.374504", "end": "2024-08-17 19:37:39.576975", "rc": 0, "start": "2024-08-17 19:37:39.202471" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "bf555001-88d9-4083-8c2f-ac44cd551693" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "474920a2-bdbd-4a8f-8c05-c4df35dd6c9b" } ] }, "filesystems": [], "fs_limit": 100, "name": "foo", "uuid": "b0ea45b8-c188-46a0-bdd7-55f2e2cfd03b" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:37:39 -0400 (0:00:00.954) 0:07:40.420 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sda", "size": "20971520 sectors", "uuid": "bf555001-88d9-4083-8c2f-ac44cd551693" }, { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: None", "in_use": true, "path": "/dev/sdb", "size": "20971520 sectors", "uuid": "474920a2-bdbd-4a8f-8c05-c4df35dd6c9b" } ] }, "filesystems": [], "fs_limit": 100, "name": "foo", "uuid": "b0ea45b8-c188-46a0-bdd7-55f2e2cfd03b" } ], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:37:39 -0400 (0:00:00.157) 0:07:40.578 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.141) 0:07:40.719 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.154) 0:07:40.874 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 
August 2024 19:37:40 -0400 (0:00:00.062) 0:07:40.936 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.058) 0:07:40.995 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.063) 0:07:41.058 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.074) 0:07:41.133 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.079) 0:07:41.212 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:214 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.107) 0:07:41.320 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.267) 0:07:41.587 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:37:40 -0400 (0:00:00.105) 0:07:41.693 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:37:41 -0400 (0:00:00.134) 0:07:41.827 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) 
=> { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:37:41 -0400 (0:00:00.237) 0:07:42.064 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:37:41 -0400 (0:00:00.242) 0:07:42.307 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:37:41 -0400 (0:00:00.100) 0:07:42.407 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:37:41 -0400 (0:00:00.105) 0:07:42.513 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:37:41 -0400 (0:00:00.107) 0:07:42.621 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:37:42 -0400 (0:00:00.216) 0:07:42.837 ******* skipping: [managed_node2] => { "changed": 
false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:37:42 -0400 (0:00:00.132) 0:07:42.969 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "name": "foo", "state": "absent", "type": "stratis", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g", "state": "absent" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:37:42 -0400 (0:00:00.158) 0:07:43.128 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:37:42 -0400 (0:00:00.185) 0:07:43.313 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:37:42 -0400 (0:00:00.162) 0:07:43.476 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:37:42 -0400 (0:00:00.137) 0:07:43.613 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:37:43 -0400 (0:00:00.171) 0:07:43.784 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:37:43 -0400 (0:00:00.176) 0:07:43.961 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:37:43 -0400 (0:00:00.261) 0:07:44.223 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:37:43 -0400 (0:00:00.232) 0:07:44.455 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:37:47 -0400 (0:00:03.779) 0:07:48.235 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:37:47 -0400 (0:00:00.134) 0:07:48.370 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937793.7743778, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": 
"040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937793.773378, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937793.773378, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:37:48 -0400 (0:00:00.561) 0:07:48.931 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:37:48 -0400 (0:00:00.516) 0:07:49.447 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:37:48 -0400 (0:00:00.137) 0:07:49.585 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "stratis" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.145) 0:07:49.731 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.138) 0:07:49.869 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.123) 0:07:49.993 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.149) 0:07:50.143 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.140) 
0:07:50.283 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.188) 0:07:50.472 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:37:49 -0400 (0:00:00.168) 0:07:50.640 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:37:50 -0400 (0:00:00.124) 0:07:50.765 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:37:50 -0400 (0:00:00.545) 0:07:51.310 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:37:50 -0400 (0:00:00.044) 0:07:51.355 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:229 Saturday 17 August 2024 19:37:52 -0400 (0:00:02.253) 0:07:53.609 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:37:53 -0400 (0:00:00.120) 0:07:53.730 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi" ], "encryption": 
false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [ { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:37:53 -0400 (0:00:00.086) 0:07:53.816 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:37:53 -0400 (0:00:00.077) 0:07:53.893 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:37:53 -0400 (0:00:00.466) 0:07:54.360 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003341", "end": "2024-08-17 19:37:54.021614", "rc": 0, "start": "2024-08-17 19:37:54.018273" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:37:54 -0400 (0:00:00.460) 0:07:54.820 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003074", "end": "2024-08-17 19:37:54.461543", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:37:54.458469" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:37:54 -0400 (0:00:00.453) 0:07:55.274 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda', 'sdb', 'sdc', 'sdd', 'sde', 'sdf', 'sdg', 'sdh', 'sdi'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'absent', 'type': 'stratis', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}]}) TASK [Set _storage_pool_tests] ************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:37:54 -0400 (0:00:00.124) 0:07:55.399 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:37:54 -0400 (0:00:00.057) 0:07:55.456 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:37:54 -0400 (0:00:00.120) 0:07:55.577 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:37:54 -0400 (0:00:00.055) 0:07:55.632 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.116) 0:07:55.749 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.053) 0:07:55.803 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.042) 0:07:55.845 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.054) 0:07:55.900 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.054) 0:07:55.954 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.052) 0:07:56.007 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.053) 0:07:56.061 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.053) 0:07:56.114 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.065) 0:07:56.180 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:37:55 -0400 (0:00:00.060) 0:07:56.241 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.44.203 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.493) 0:07:56.734 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.214) 0:07:56.948 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.189) 0:07:57.138 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.095) 0:07:57.234 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.098) 0:07:57.332 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.135) 0:07:57.468 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.092) 0:07:57.560 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:37:56 -0400 (0:00:00.099) 0:07:57.660 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.100) 0:07:57.761 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.097) 0:07:57.859 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.094) 0:07:57.953 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.100) 0:07:58.054 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.099) 0:07:58.154 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.104) 0:07:58.258 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.274) 0:07:58.532 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 
'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:37:57 -0400 (0:00:00.066) 0:07:58.599 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.111) 0:07:58.711 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": 
null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.104) 0:07:58.816 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.214) 0:07:59.030 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.134) 0:07:59.165 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.067) 0:07:59.233 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.054) 0:07:59.287 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.071) 0:07:59.359 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 
2024 19:37:58 -0400 (0:00:00.115) 0:07:59.474 ******* skipping: [managed_node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "", "_mount_id": "", "_raw_device": "", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "stratis", "vdo_pool_size": null } } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:37:58 -0400 (0:00:00.083) 0:07:59.557 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:37:59 -0400 (0:00:00.212) 0:07:59.770 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "stratis", "report" ], "delta": "0:00:00.374080", "end": "2024-08-17 19:37:59.780493", "rc": 0, "start": "2024-08-17 19:37:59.406413" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:37:59 -0400 (0:00:00.801) 0:08:00.571 ******* 
ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.167) 0:08:00.738 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.101) 0:08:00.839 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.102) 0:08:00.942 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.102) 0:08:01.044 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.102) 0:08:01.147 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.125) 0:08:01.272 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'absent', 'type': 'stratis', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 
'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '', '_raw_device': '', '_mount_id': ''}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 17 August 2024 19:38:00 -0400 (0:00:00.262) 0:08:01.535 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 17 August 2024 19:38:01 -0400 (0:00:00.179) 0:08:01.714 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 => (item=mount) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 => (item=fstab) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 => (item=fs) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 => (item=device) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 => (item=encryption) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 => (item=md) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 => (item=size) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 17 August 2024 19:38:01 -0400 (0:00:00.703) 0:08:02.417 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 17 August 2024 19:38:01 -0400 (0:00:00.123) 0:08:02.541 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 17 August 2024 19:38:01 -0400 (0:00:00.159) 0:08:02.700 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and (storage_test_volume.mount_user or storage_test_volume.mount_group or storage_test_volume.mount_mode)", "skip_reason": 
"Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 17 August 2024 19:38:02 -0400 (0:00:00.124) 0:08:02.824 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 17 August 2024 19:38:02 -0400 (0:00:00.184) 0:08:03.009 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_user", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 17 August 2024 19:38:02 -0400 (0:00:00.164) 0:08:03.173 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_group", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 17 August 2024 19:38:02 -0400 (0:00:00.150) 0:08:03.324 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.mount_point and storage_test_volume.mount_mode", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 17 August 2024 19:38:02 -0400 (0:00:00.175) 0:08:03.499 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 17 August 2024 19:38:02 -0400 (0:00:00.156) 0:08:03.656 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 17 August 2024 19:38:03 -0400 (0:00:00.142) 0:08:03.798 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 17 August 2024 19:38:03 -0400 (0:00:00.113) 0:08:03.912 ******* ok: 
[managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 17 August 2024 19:38:03 -0400 (0:00:00.141) 0:08:04.054 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 17 August 2024 19:38:03 -0400 (0:00:00.191) 0:08:04.246 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 17 August 2024 19:38:03 -0400 (0:00:00.130) 0:08:04.376 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 17 August 2024 19:38:03 -0400 (0:00:00.152) 0:08:04.528 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 17 August 2024 19:38:04 -0400 (0:00:00.330) 0:08:04.858 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 17 August 2024 19:38:04 -0400 (0:00:00.129) 0:08:04.988 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 17 August 2024 19:38:04 -0400 (0:00:00.144) 0:08:05.132 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [Verify fs 
label] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 17 August 2024 19:38:04 -0400 (0:00:00.118) 0:08:05.251 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type != \"stratis\"", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 17 August 2024 19:38:04 -0400 (0:00:00.108) 0:08:05.360 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.555) 0:08:05.916 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present or storage_test_volume.type == 'disk'", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.197) 0:08:06.113 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.189) 0:08:06.303 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.089) 0:08:06.392 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "stratis" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.073) 0:08:06.466 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.054) 0:08:06.520 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.055) 0:08:06.576 ******* skipping: [managed_node2] 
=> { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 17 August 2024 19:38:05 -0400 (0:00:00.055) 0:08:06.631 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 17 August 2024 19:38:07 -0400 (0:00:01.441) 0:08:08.072 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.066) 0:08:08.138 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.148) 0:08:08.287 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.087) 0:08:08.375 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.093) 0:08:08.468 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.073) 0:08:08.541 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.069) 0:08:08.611 ******* skipping: [managed_node2] => { "changed": false, 
"false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 17 August 2024 19:38:07 -0400 (0:00:00.057) 0:08:08.668 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.054) 0:08:08.723 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.099) 0:08:08.823 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.138) 0:08:08.962 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.134) 0:08:09.096 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.112) 0:08:09.209 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.099) 0:08:09.308 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.062) 0:08:09.370 ******* skipping: [managed_node2] => { "changed": false, 
"false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.086) 0:08:09.456 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.053) 0:08:09.510 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 17 August 2024 19:38:08 -0400 (0:00:00.167) 0:08:09.678 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.068) 0:08:09.746 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.055) 0:08:09.801 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.054) 0:08:09.856 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.055) 0:08:09.912 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.079) 0:08:09.992 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID 
chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.082) 0:08:10.074 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.102) 0:08:10.177 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.135) 0:08:10.312 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.130) 0:08:10.442 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 17 August 2024 19:38:09 -0400 (0:00:00.184) 0:08:10.627 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.100) 0:08:10.728 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.148) 0:08:10.877 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.149) 0:08:11.026 ******* skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.186) 0:08:11.213 ******* 
skipping: [managed_node2] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.074) 0:08:11.288 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.128) 0:08:11.416 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.139) 0:08:11.556 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 17 August 2024 19:38:10 -0400 (0:00:00.133) 0:08:11.689 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.064) 0:08:11.753 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.071) 0:08:11.825 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.055) 0:08:11.881 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.055) 0:08:11.936 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task 
path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.053) 0:08:11.990 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.055) 0:08:12.045 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.054) 0:08:12.099 ******* skipping: [managed_node2] => { "false_condition": "storage_test_volume.thin" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.055) 0:08:12.155 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.053) 0:08:12.208 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.054) 0:08:12.263 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.132) 0:08:12.395 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.062) 0:08:12.457 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.thin", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.086) 0:08:12.544 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", 
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 17 August 2024 19:38:11 -0400 (0:00:00.073) 0:08:12.618 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.088) 0:08:12.706 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.090) 0:08:12.797 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.060) 0:08:12.857 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.072) 0:08:12.930 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.110) 0:08:13.040 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.103) 0:08:13.144 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.098) 0:08:13.242 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 
'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.063) 0:08:13.306 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.055) 0:08:13.361 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.057) 0:08:13.419 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.048) 0:08:13.468 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Setup Tang server on localhost for testing] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:232 Saturday 17 August 2024 19:38:12 -0400 (0:00:00.220) 0:08:13.688 ******* included: fedora.linux_system_roles.nbde_server for managed_node2 TASK [fedora.linux_system_roles.nbde_server : Set version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main.yml:6 Saturday 17 August 2024 19:38:13 -0400 (0:00:00.329) 0:08:14.017 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.nbde_server : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/set_vars.yml:2 Saturday 17 August 2024 19:38:13 -0400 (0:00:00.262) 0:08:14.280 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__nbde_server_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.nbde_server : Check if system is ostree] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/set_vars.yml:10 Saturday 17 August 2024 19:38:13 -0400 (0:00:00.298) 0:08:14.579 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.nbde_server : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/set_vars.yml:15 Saturday 17 August 2024 19:38:14 -0400 (0:00:00.541) 0:08:15.120 
******* ok: [managed_node2] => { "ansible_facts": { "__nbde_server_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.nbde_server : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/set_vars.yml:19 Saturday 17 August 2024 19:38:14 -0400 (0:00:00.093) 0:08:15.214 ******* ok: [managed_node2] => { "ansible_facts": { "__nbde_server_cachedir": "/var/cache/tang", "__nbde_server_group": "tang", "__nbde_server_keydir": "/var/db/tang", "__nbde_server_keygen": "/usr/libexec/tangd-keygen", "__nbde_server_packages": [ "tang" ], "__nbde_server_services": [ "tangd.socket" ], "__nbde_server_update": "/usr/libexec/tangd-update", "__nbde_server_user": "tang" }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/vars/default.yml" ], "changed": false } TASK [fedora.linux_system_roles.nbde_server : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main.yml:9 Saturday 17 August 2024 19:38:14 -0400 (0:00:00.123) 0:08:15.337 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml for managed_node2 TASK [fedora.linux_system_roles.nbde_server : Ensure tang is installed] ******** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:2 Saturday 17 August 2024 19:38:14 -0400 (0:00:00.093) 0:08:15.430 ******* changed: [managed_node2] => { "changed": true, "rc": 0, "results": [ "Installed: llhttp-9.2.1-1.fc40.x86_64", "Installed: tang-15-2.fc40.x86_64" ] } lsrpackages: tang TASK [fedora.linux_system_roles.nbde_server : Ensure keys are rotated] ********* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:8 Saturday 17 August 2024 19:38:17 -0400 (0:00:02.955) 0:08:18.385 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "nbde_server_rotate_keys | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.nbde_server : Ensure we have keys] ************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:17 Saturday 17 August 2024 19:38:17 -0400 (0:00:00.090) 0:08:18.476 ******* changed: [managed_node2] => { "arguments": { "cachedir": "/var/cache/tang", "force": false, "keydir": "/var/db/tang", "keygen": "/usr/libexec/tangd-keygen", "keys_to_deploy_dir": null, "state": "keys-created", "update": "/usr/libexec/tangd-update" }, "changed": true, "state": "keys-created" } TASK [fedora.linux_system_roles.nbde_server : Perform key management (fetch/deploy) tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:26 Saturday 17 August 2024 19:38:18 -0400 (0:00:01.032) 0:08:19.508 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "(nbde_server_fetch_keys | bool) or (nbde_server_deploy_keys | bool)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.nbde_server : Manage firewall and SELinux for port] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:30 Saturday 17 August 2024 19:38:19 -0400 (0:00:00.587) 0:08:20.096 ******* included: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml for managed_node2 TASK [Ensure tang port is labeled tangd_port_t for SELinux] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:2 Saturday 17 August 2024 19:38:19 -0400 (0:00:00.207) 0:08:20.303 ******* redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.seboolean to ansible.posix.seboolean included: fedora.linux_system_roles.selinux for managed_node2 TASK [fedora.linux_system_roles.selinux : Set ansible_facts required by role and install packages] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:2 Saturday 17 August 2024 19:38:20 -0400 (0:00:00.423) 0:08:20.727 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml for managed_node2 TASK [fedora.linux_system_roles.selinux : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:2 Saturday 17 August 2024 19:38:20 -0400 (0:00:00.303) 0:08:21.031 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__selinux_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Ensure SELinux packages] ************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:7 Saturday 17 August 2024 19:38:20 -0400 (0:00:00.186) 0:08:21.218 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml for managed_node2 TASK [fedora.linux_system_roles.selinux : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:5 Saturday 17 August 2024 19:38:20 -0400 (0:00:00.216) 0:08:21.434 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.selinux : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:10 Saturday 17 August 2024 19:38:21 -0400 (0:00:00.505) 0:08:21.939 ******* ok: [managed_node2] => { "ansible_facts": { "__selinux_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.selinux : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:17 Saturday 17 August 2024 19:38:21 -0400 (0:00:00.109) 0:08:22.049 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.selinux : Set flag if transactional-update exists] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:22 Saturday 17 August 2024 19:38:22 -0400 (0:00:00.654) 0:08:22.704 ******* ok: [managed_node2] => { "ansible_facts": { "__selinux_is_transactional": false }, "changed": false } TASK 
[fedora.linux_system_roles.selinux : Install SELinux python2 tools] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:26 Saturday 17 August 2024 19:38:22 -0400 (0:00:00.111) 0:08:22.815 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_python_version is version('3', '<')", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux python3 tools] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:35 Saturday 17 August 2024 19:38:22 -0400 (0:00:00.117) 0:08:22.933 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: python3-libselinux python3-policycoreutils TASK [fedora.linux_system_roles.selinux : Install SELinux python3 tools] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:46 Saturday 17 August 2024 19:38:23 -0400 (0:00:01.625) 0:08:24.559 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_os_family == \"Suse\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux tool semanage] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:58 Saturday 17 August 2024 19:38:23 -0400 (0:00:00.096) 0:08:24.656 ******* changed: [managed_node2] => { "changed": true, "rc": 0, "results": [ "Installed: policycoreutils-python-utils-3.6-3.fc40.noarch" ] } lsrpackages: policycoreutils-python-utils TASK [fedora.linux_system_roles.selinux : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:72 Saturday 17 August 2024 19:38:26 -0400 (0:00:02.345) 0:08:27.001 ******* skipping: [managed_node2] => { "false_condition": "__selinux_is_transactional | d(false)" } TASK [fedora.linux_system_roles.selinux : Reboot transactional update systems] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:77 Saturday 17 August 2024 19:38:26 -0400 (0:00:00.136) 0:08:27.138 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__selinux_is_transactional | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Fail if reboot is needed and not set] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:82 Saturday 17 August 2024 19:38:26 -0400 (0:00:00.134) 0:08:27.272 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__selinux_is_transactional | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Refresh facts] *********************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/ensure_selinux_packages.yml:89 Saturday 17 August 2024 19:38:26 -0400 (0:00:00.129) 0:08:27.401 ******* ok: [managed_node2] TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if enabled] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:5 Saturday 17 August 
2024 19:38:29 -0400 (0:00:02.480) 0:08:29.882 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_selinux.status == \"enabled\" and (selinux_state or selinux_policy)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if disabled] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:13 Saturday 17 August 2024 19:38:29 -0400 (0:00:00.117) 0:08:30.000 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_selinux.status == \"disabled\" and selinux_state", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set selinux_reboot_required] ********* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:21 Saturday 17 August 2024 19:38:29 -0400 (0:00:00.137) 0:08:30.138 ******* ok: [managed_node2] => { "ansible_facts": { "selinux_reboot_required": false }, "changed": false } TASK [fedora.linux_system_roles.selinux : Fail if reboot is required] ********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:25 Saturday 17 August 2024 19:38:29 -0400 (0:00:00.172) 0:08:30.311 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_reboot_required", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Warn if SELinux is disabled] ********* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:30 Saturday 17 August 2024 19:38:29 -0400 (0:00:00.094) 0:08:30.405 ******* skipping: [managed_node2] => { "false_condition": "ansible_selinux.status == \"disabled\"" } TASK [fedora.linux_system_roles.selinux : Drop all local modifications] ******** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:35 Saturday 17 August 2024 19:38:29 -0400 (0:00:00.108) 0:08:30.514 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_all_purge | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux boolean local modifications] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:43 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.346) 0:08:30.860 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_booleans_purge | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux file context local modifications] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:48 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.155) 0:08:31.015 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_fcontexts_purge | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux port local modifications] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:53 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.141) 0:08:31.157 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_ports_purge | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge 
all SELinux login local modifications] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:58 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.155) 0:08:31.312 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_logins_purge | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set SELinux booleans] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:63 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.134) 0:08:31.446 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.selinux : Set SELinux file contexts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:74 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.079) 0:08:31.526 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.selinux : Set an SELinux label on a port] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:87 Saturday 17 August 2024 19:38:30 -0400 (0:00:00.087) 0:08:31.614 ******* changed: [managed_node2] => (item={'ports': 7500, 'proto': 'tcp', 'setype': 'tangd_port_t', 'state': 'present', 'local': True}) => { "__selinux_item": { "local": true, "ports": 7500, "proto": "tcp", "setype": "tangd_port_t", "state": "present" }, "ansible_loop_var": "__selinux_item", "changed": true, "ports": [ "7500" ], "proto": "tcp", "setype": "tangd_port_t", "state": "present" } TASK [fedora.linux_system_roles.selinux : Set linux user to SELinux user mapping] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:99 Saturday 17 August 2024 19:38:33 -0400 (0:00:02.431) 0:08:34.045 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.selinux : Get SELinux modules facts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:112 Saturday 17 August 2024 19:38:33 -0400 (0:00:00.085) 0:08:34.131 ******* ok: [managed_node2] => { "ansible_facts": { "selinux_checksums": true, "selinux_installed_modules": { "abrt": { "100": { "checksum": "sha256:4180b8fba242f924ae2816f290fd2f447533bff3ae4b8b6b60f999c7b5a10caa", "enabled": 1 } }, "accountsd": { "100": { "checksum": "sha256:bf97d40f7ec7e4318b7d31bfd664735627b76b6e616a0cff06c15040960620ec", "enabled": 1 } }, "acct": { "100": { "checksum": "sha256:da4a2c785c040f6c34a3e602eeb1ed612a74ac170b7f5089d8ba55e12f750818", "enabled": 1 } }, "afs": { "100": { "checksum": "sha256:fb7e253fdbe3d08b3929bb97c3b6d52f56123bd4759aa1008c667abd2021b2aa", "enabled": 1 } }, "afterburn": { "100": { "checksum": "sha256:86bb7a2815cf9d64e9efd2c917153b238cbc60d7d03abac54f428bf676e4bdb6", "enabled": 1 } }, "aiccu": { "100": { "checksum": "sha256:836a35abbbf400c117eae70a77a4596a73b3d548e6fa82e3a0f55ae96f21d64b", "enabled": 1 } }, "aide": { "100": { "checksum": "sha256:4f387a45821ab709a325b7ccd4af3a4951c231c34440b5fa877b8fc6387d2aff", "enabled": 1 } }, "ajaxterm": { "100": { "checksum": "sha256:6e3edee489b6c939e15c49b4819463a7021d6188eaff9b05e9c8867de1f018fd", "enabled": 1 } }, "alsa": { "100": { "checksum": 
"sha256:9de454604bdef8775f6ebae39b8bc4ba7b5e79a2972d3e32ca7d50abffa7a85a", "enabled": 1 } }, "amanda": { "100": { "checksum": "sha256:133a6a6da4b4ab3b13dd255bd7974bb6e0494fa7c5e0b1034b7926823d6621a2", "enabled": 1 } }, "amtu": { "100": { "checksum": "sha256:80ccb9e85608a483fa07c37c2772933f5e63025bcc667540bec377b02bde2f3c", "enabled": 1 } }, "anaconda": { "100": { "checksum": "sha256:409af50bd5708e662302020ddef0f314e0da72c996c670db45aea57478e8ee9a", "enabled": 1 } }, "antivirus": { "100": { "checksum": "sha256:f1ed2e1ec9355b898716cc1ed8f18d7983b45b5bb88c2624fe31ffc968dcd6ca", "enabled": 1 } }, "apache": { "100": { "checksum": "sha256:be4ca99796e2a67e5756d9242dad3f02e1d63be342cdedbb2c744a79cbb52ccb", "enabled": 1 } }, "apcupsd": { "100": { "checksum": "sha256:47a5360325c4aebd9de805b5113bc715736bb01801cbbf317ac2112d1bae15c8", "enabled": 1 } }, "apm": { "100": { "checksum": "sha256:757508a02ab22f3ad35ce7bc9e596f7c84e62b7744db341f9d4f63ef74ee92ff", "enabled": 1 } }, "application": { "100": { "checksum": "sha256:3a91f2a5b7473a60ba38440265946e298936584c7954ac53b9a63926557d3d0d", "enabled": 1 } }, "arpwatch": { "100": { "checksum": "sha256:a8786979d69981a916a89856cce9aadaa540fe2baf93053bf9274eac36cc38fb", "enabled": 1 } }, "asterisk": { "100": { "checksum": "sha256:f113821fcc3ce3e5e04130937924e8aeb5741935d8b5422af25ef8c4540cedbd", "enabled": 1 } }, "auditadm": { "100": { "checksum": "sha256:c7c679838339ee165e9e71dbbf7c7bf40964333a74256ee98d4b56bc72e310f9", "enabled": 1 } }, "authconfig": { "100": { "checksum": "sha256:42a66af256fa45dd7be3e09166aaa0f2a035f5670f39466a2d4ee84e417fd76d", "enabled": 1 } }, "authlogin": { "100": { "checksum": "sha256:0e0de66c5efe773f5749d1a9d18b46c8e1264efe8dc77f1ab1ee34936429a6dd", "enabled": 1 } }, "automount": { "100": { "checksum": "sha256:3196fce29123f8eeecd6175b098063735d77aecfaffa7f6cb641adbcbd9ee977", "enabled": 1 } }, "avahi": { "100": { "checksum": "sha256:9bfb194e5b283124abae90dc92abb883e1cf7b3e2f51bcb9fcf842672c86ac60", "enabled": 1 } }, "awstats": { "100": { "checksum": "sha256:ccb0373eb1228e3a34112ba0124b37cee80def8c7ddcc0981b0676aa59bba2f8", "enabled": 1 } }, "bacula": { "100": { "checksum": "sha256:4ee194dcd976488e6eb0e0ba84d9b6875befc4eca8389c9d0102cac06c04809f", "enabled": 1 } }, "base": { "100": { "checksum": "sha256:6caa2cbaa3caf74ac1652751ba8cd482e6792685d960cf7f6f6e56edbed280e7", "enabled": 1 } }, "bcfg2": { "100": { "checksum": "sha256:3e147aa1e329d87cdfd30b20b3b80ee0c039e9665e944dc6a28231ebac38d5d4", "enabled": 1 } }, "bind": { "100": { "checksum": "sha256:50cd4920e108bb448da391a6e0136bc2023f3f6b7d825b39505b7244a6a55493", "enabled": 1 } }, "bitlbee": { "100": { "checksum": "sha256:9324bebe4e0cac756830480f53dd8e23a01f3c4c7185b30dd72024394dec2783", "enabled": 1 } }, "blkmapd": { "100": { "checksum": "sha256:fd38032cb8b3e9c24024364c02b07a830e97432cd763f41e82612fd83de06654", "enabled": 1 } }, "blueman": { "100": { "checksum": "sha256:2150b8a1afae75f4d73c76e2982ccc575f599577b8b7042d7bfa87fe2fed3a30", "enabled": 1 } }, "bluetooth": { "100": { "checksum": "sha256:b97526aa2398ec96e9185d11b1429064e6b88b9e47f2d0460d5c8dbfd2289bce", "enabled": 1 } }, "boinc": { "100": { "checksum": "sha256:1038d007468960019e0957df9b9dd0a76308b2dd5d3942dcef18c937a5130fa6", "enabled": 1 } }, "boltd": { "100": { "checksum": "sha256:89dc609017c6f2173d442769cbec67f3039fd77c8bec78ee038ff565856711a8", "enabled": 1 } }, "boothd": { "100": { "checksum": "sha256:d097d6711f0b31c706c8a186a615dcaf2b91d070a69304c05d5d7868c88bab5f", "enabled": 1 } }, "bootloader": { "100": { "checksum": 
"sha256:dabf91939aac32788c03a2c0b658d6e951cf80c3b5337e297f6c53bbe3f4418e", "enabled": 1 } }, "bootupd": { "100": { "checksum": "sha256:b0bda821f499f688d6d1d560a55673989cef71e7cd4e20f9cf3c9ce45e91d369", "enabled": 1 } }, "brctl": { "100": { "checksum": "sha256:78276818c60a76e84a9b3263a906d7109a69028c557a2d6a4a39ded2ef713a44", "enabled": 1 } }, "brltty": { "100": { "checksum": "sha256:0230419d38276d2c823a427d44fa1362f89784c7ac17f76826be95d3c8e6de1b", "enabled": 1 } }, "bugzilla": { "100": { "checksum": "sha256:8e4fd93cbe0357fc894c7898adf9a14f8c845d87a08f673f410b702f798c2383", "enabled": 1 } }, "bumblebee": { "100": { "checksum": "sha256:3d6e2bb58a928e53f4d89d5bdbdb6fea3f9ae98e5be81548df29fdb13b204f9b", "enabled": 1 } }, "cachefilesd": { "100": { "checksum": "sha256:d595466a7aa404ade4ad75d45e10f6a37d30529858cf990407f545f7826b364c", "enabled": 1 } }, "calamaris": { "100": { "checksum": "sha256:4c65dae6d30fea1bebc4e63ff693b2041f18d0adb710b783002c98c529be84d8", "enabled": 1 } }, "callweaver": { "100": { "checksum": "sha256:cede7c726cd48a09e9b2974e9f78d9482cc9d01a4ceb74ff190c4a219900f2b3", "enabled": 1 } }, "canna": { "100": { "checksum": "sha256:d523a4d1579256bb5faf1f761ac6a22cb4e8bb8e5544a5843615d4927abb5e76", "enabled": 1 } }, "ccs": { "100": { "checksum": "sha256:496749576c65d4a03024ee5d3be0814965308561f565ec6cd29587c3a3b7ec57", "enabled": 1 } }, "cdrecord": { "100": { "checksum": "sha256:ba7fd7c8a2a4d0fc658dd4aeb27b1c9c2049e7b8b5b5c3afa9523c74f6aae263", "enabled": 1 } }, "certmaster": { "100": { "checksum": "sha256:f9a4c97defbe94abd597286e65015baa5cdbc494a6404f3e467ba2484fd753d8", "enabled": 1 } }, "certmonger": { "100": { "checksum": "sha256:b156ebbd9f20cfe8d9fbb273f031fe309a0b850147f2b2a4f4316e4492f5fd30", "enabled": 1 } }, "certwatch": { "100": { "checksum": "sha256:c875a519672af0d8291b9097f547dc940b9f0132a174d49cb4a332e08fe92b89", "enabled": 1 } }, "cfengine": { "100": { "checksum": "sha256:8fc798d839b2d5a643625812c65fd6859b386328ef9b685f279082a38baa1b24", "enabled": 1 } }, "cgroup": { "100": { "checksum": "sha256:689e8e4f1b836e9e95320ac85cad26e50af1663439260c4bbb0fa65b0a7640ee", "enabled": 1 } }, "chrome": { "100": { "checksum": "sha256:6ae0aef5714b891c8d27622893515e8de72454ab279b4726975d9c313401797e", "enabled": 1 } }, "chronyd": { "100": { "checksum": "sha256:28d17217c3c515dbb1c0b04d5dc36d50c7f3165144a9ba56b3adce749ceb8467", "enabled": 1 } }, "cifsutils": { "100": { "checksum": "sha256:689032aad1e66934c35a72d87b428787df267d8ba743de741bc7fc4cec47c5ad", "enabled": 1 } }, "cinder": { "100": { "checksum": "sha256:1bca5805677f6de8372027eab3e7b091d28d9098bf5c363090b12f4195051441", "enabled": 1 } }, "cipe": { "100": { "checksum": "sha256:9037f8704235ce4b481935dbe9f81fb7eb065e593e07988b869938948b15f432", "enabled": 1 } }, "clock": { "100": { "checksum": "sha256:62a84539a2764eb9eea9d29b86128f1deea9adb3a3a534c88f5fcdd7637b8575", "enabled": 1 } }, "clogd": { "100": { "checksum": "sha256:f79fddbcdd23396829c104b213648ffd5d5cac6dec8adf2f5f0efa2f7061b615", "enabled": 1 } }, "cloudform": { "100": { "checksum": "sha256:e985f49df5d3f27855f9c727f19138103110d987ed0f9ed8081e39a39aeb813b", "enabled": 1 } }, "cmirrord": { "100": { "checksum": "sha256:2c94298413c5d1fdb88ca748b4c597d2b3e543ed82604378a6ee333276dcaab4", "enabled": 1 } }, "cobbler": { "100": { "checksum": "sha256:7fc7afc3aec313b51b6508fe564813eccb4ce5bc1d650ba9f7eb61a438cee86f", "enabled": 1 } }, "collectd": { "100": { "checksum": "sha256:e574da01bdbc51b1b08385713fa8c9780df9aa657ddab4cd93211e4e9c8b6ff0", "enabled": 1 } }, "colord": { "100": { 
"checksum": "sha256:1887fa9a04336ceeeb73c20f3f7f6d6e6994f2a4b1cff802ca64e142ea1bff05", "enabled": 1 } }, "comsat": { "100": { "checksum": "sha256:38a4f293f3f8cdb18203c347a11304f161264aaf794ba6da418804f090c0e7a3", "enabled": 1 } }, "condor": { "100": { "checksum": "sha256:e8d56cababfbca2569c42055e8fac5782d6ae8b42ba2b236b9563f7eb6c2ef4c", "enabled": 1 } }, "conman": { "100": { "checksum": "sha256:e172ba28d687e96a2ffc463bffbf202ce44e7bff6e0685adc4b89a67cdcdfd40", "enabled": 1 } }, "conntrackd": { "100": { "checksum": "sha256:794a65c72b01138ecd19d9e679f62129d33183f9b1324b064c16301fa79d53d4", "enabled": 1 } }, "consolekit": { "100": { "checksum": "sha256:cbc53cf7577ba92ef76acae3e99df2159a8aa432f25bbc2ee03bc0eee036f5e5", "enabled": 1 } }, "coreos_installer": { "100": { "checksum": "sha256:699c69e48f1ec1dd8f915f094a83a6e2f6b6fdd515233fd8867649f254509270", "enabled": 1 } }, "couchdb": { "100": { "checksum": "sha256:743992902ac8514de4dd54b16317a8edb9cb1e383e29d0450821a9ac304059db", "enabled": 1 } }, "courier": { "100": { "checksum": "sha256:3bb1b730a3a6f236b312c3f71172364911992d35e93c1e5cd1e1fb076eb2d3eb", "enabled": 1 } }, "cpucontrol": { "100": { "checksum": "sha256:969a757fa353d20473f2eab457378e50d6f32b1da19f6744f7f20d5f3f99744e", "enabled": 1 } }, "cpufreqselector": { "100": { "checksum": "sha256:0b0bd5618356e7da7018c86dc262758bb6f04e7618de2942995ad59e897c3244", "enabled": 1 } }, "cpuplug": { "100": { "checksum": "sha256:211dc5348ec5f69d5d76aa3d07c74624bb39cd2aa082262e5bc4c6c165f677e7", "enabled": 1 } }, "cron": { "100": { "checksum": "sha256:d87419acfca2174158477e7e62a1c6d5a6b90f9fc759b1caa7fc2c80b2b66fc2", "enabled": 1 } }, "ctdb": { "100": { "checksum": "sha256:aadb573940c57a51a13c72bfa7a074e440d7b6b3d2440c3a435c1b41742d2b3a", "enabled": 1 } }, "cups": { "100": { "checksum": "sha256:4855d463c5142f1f001a028136bb737a837c6a0d87807c0b97b1eb23b897a05f", "enabled": 1 } }, "cvs": { "100": { "checksum": "sha256:b188990c62c036cf96deb2fd121236499dcb5cff830e38ac24b74d4692c0323d", "enabled": 1 } }, "cyphesis": { "100": { "checksum": "sha256:ffe9e6e4bb673f6822a874b86eb21769ef0f5fdbfd2e2659485abc4c6215ffa2", "enabled": 1 } }, "cyrus": { "100": { "checksum": "sha256:8b077e1368da1daaa96272b0d6f22f08bcb264617e6884c9074423a9b066912d", "enabled": 1 } }, "daemontools": { "100": { "checksum": "sha256:c9631ff4c56b588338bd87c7d8cba95e78285ecfc4f12653ea5c888d2fee8af0", "enabled": 1 } }, "dbadm": { "100": { "checksum": "sha256:cdf7e54b6a3fceceb16ba54647c10c12bc09bd9541550a9a516823d771c66ee6", "enabled": 1 } }, "dbskk": { "100": { "checksum": "sha256:5c3f9427b9dcfa7aa76a67db7c3f9a33ca299916a5608cd985fcb5f30fcc405a", "enabled": 1 } }, "dbus": { "100": { "checksum": "sha256:e149fe7195a124d01d7afab5ba6a34552b50e47890e94153d06763826ee7c00d", "enabled": 1 } }, "dcc": { "100": { "checksum": "sha256:e111315155b31d6b375dcaae05acf5e957dffc94e9efa04d4eac9ee74888b4e6", "enabled": 1 } }, "ddclient": { "100": { "checksum": "sha256:eee9cdfaed63e516667ca9f76565fcc909c173c6440bfd19dca5c12fca61692d", "enabled": 1 } }, "denyhosts": { "100": { "checksum": "sha256:7a239615a4db634ae1ae8564acc54c55c86c68bd2e44d47ad9c24758d492b22c", "enabled": 1 } }, "devicekit": { "100": { "checksum": "sha256:17ec39ae4334df1f47531199b3f12954bc848810d1eccaadc0463174aad953f4", "enabled": 1 } }, "dhcp": { "100": { "checksum": "sha256:15aac55748ee10a93ba0547d3c20670148e01d14b74afae89df0e660e3396dc3", "enabled": 1 } }, "dictd": { "100": { "checksum": "sha256:3821e27717e50b0e999807d79f0eacd2ad6012a7a252b2c6280a0303e438237e", "enabled": 1 } }, "dirsrv": { "100": 
{ "checksum": "sha256:a33b7321ba8fd3e69c610462584e0147b9bfbae8498de44c2eee891f209a03bd", "enabled": 1 } }, "dirsrv-admin": { "100": { "checksum": "sha256:e7ec5090958fbde915f221e49396a45b968b8a2d6429d74000ffefa8b6e87e3f", "enabled": 1 } }, "dmesg": { "100": { "checksum": "sha256:19cf6b1658e2a7ff92db6c537944a989ea72e88173663ba4589f71132a80be61", "enabled": 1 } }, "dmidecode": { "100": { "checksum": "sha256:55423cc64cb3aeccc3e05952327db479669ddd5f5e98bcbc275859d38333e79b", "enabled": 1 } }, "dnsmasq": { "100": { "checksum": "sha256:18f7d99ff73fb7c2256ac3d86d4328098274cf2b8fa92c8b42da37867f6c7b31", "enabled": 1 } }, "dnssec": { "100": { "checksum": "sha256:1584e02889b2969514ddfcdc5fa31152ba8505c66c1b5f88e75b3bf23f96bec3", "enabled": 1 } }, "dovecot": { "100": { "checksum": "sha256:57d7924556ab25fb130b93002b1da35e886ef1ad01c4391266cedcc01848ea8b", "enabled": 1 } }, "drbd": { "100": { "checksum": "sha256:b96580e08581043a5e5fe37b67ab937e074e4218d685b23df6f9421ca49d5248", "enabled": 1 } }, "dspam": { "100": { "checksum": "sha256:90cdf514efbb5bf01f1e2b90d7d58eed2e2b458242905b5cfa32e1de01d71da3", "enabled": 1 } }, "entropyd": { "100": { "checksum": "sha256:ca075b4e9b9aee4bb5946407a418fc229a1c6fe4c7abc9487b9ba9770692c78d", "enabled": 1 } }, "exim": { "100": { "checksum": "sha256:413b889a36fb01d9d3ee8fe149aa2d11702e5e44afceef42a0d7450a421cc89c", "enabled": 1 } }, "fail2ban": { "100": { "checksum": "sha256:bfd591b0e99fc96cdd443d01c045583e45fc0ba1e8d6b7a18f2596473fe11ac9", "enabled": 1 } }, "fcoe": { "100": { "checksum": "sha256:8e2cebfc7249a86f6b40690946f6281f83a343eeef929d7e1588bc77622004fa", "enabled": 1 } }, "fdo": { "100": { "checksum": "sha256:7e7c69d205d75239503dd305a759ac9ba8406c0a1eaa20a7c97a033f38b13006", "enabled": 1 } }, "fedoratp": { "100": { "checksum": "sha256:d6f5f286ed1d5705e62598f143c0150deb59d708d7dd1ad89bdc6d68f59ed853", "enabled": 1 } }, "fetchmail": { "100": { "checksum": "sha256:3acaf8ff26c8d81010cc1ccf88872df39199279fa44d85b06903c97f50def08c", "enabled": 1 } }, "finger": { "100": { "checksum": "sha256:17aedac05f634eb036f29d301d8b0416cdae5f3967f88c0286761e209858f07e", "enabled": 1 } }, "firewalld": { "100": { "checksum": "sha256:935b69863cf4def7a44b43cab96cf43fe663ec8d5ee103d6f412936391125a50", "enabled": 1 } }, "firewallgui": { "100": { "checksum": "sha256:18bfb3bc0946b731f05e969de50189403372c76fab801f83e2687a9d43b86622", "enabled": 1 } }, "firstboot": { "100": { "checksum": "sha256:4c9915a1c1cf1e67ea429fab342c29f2caa911f5d615b86f9eb8da496d7b5d54", "enabled": 1 } }, "fprintd": { "100": { "checksum": "sha256:ff74bb68f3369f30d2ae98cfce6acecbd6b7adbad46a18cc03bba25037a26beb", "enabled": 1 } }, "freeipmi": { "100": { "checksum": "sha256:38cefd0e603535dcb11f9291f4948f059693fa7371f3ee92654cd3176a5195a5", "enabled": 1 } }, "freqset": { "100": { "checksum": "sha256:3f2c6b1336ed28a0a7d90f283e9289a5f83e5ceb71d11b7814fdc65ae529314a", "enabled": 1 } }, "fstools": { "100": { "checksum": "sha256:602b25a03e6a5d5a25d3ed528e4c68f6a495e2dea3b6060ca32845483f9d2d88", "enabled": 1 } }, "ftp": { "100": { "checksum": "sha256:22e96b5040327ea8554d0216ef4e28d6a94abdabfe3a211a6535941bebebd013", "enabled": 1 } }, "fwupd": { "100": { "checksum": "sha256:3da67a17b85cde7a0f181e6599726357fb2eeeb251fdd980a9d6669772415922", "enabled": 1 } }, "games": { "100": { "checksum": "sha256:020d175da904ce5e5f94eaab9e60a4a86b70ed9767f7660157c7fa459c747b34", "enabled": 1 } }, "gdomap": { "100": { "checksum": "sha256:f41fda08cf3bd33ac571a4ca5d6d505a6dc7062c29e45b1d5d160c953c532230", "enabled": 1 } }, "geoclue": { "100": { 
"checksum": "sha256:beec56305b71ccb0174a7c790eb972244d10b8381a84812f01021b9e8eb7ad99", "enabled": 1 } }, "getty": { "100": { "checksum": "sha256:a6fac65e97c9962f1066c0a2fc839e67597cda9061557affb60a01df78184a4a", "enabled": 1 } }, "git": { "100": { "checksum": "sha256:9f2c83dc9cafe17c8d650f8c78c39c61bd3d2c8b4d77a535eb61ce6a04a3e3f4", "enabled": 1 } }, "gitosis": { "100": { "checksum": "sha256:b7decc3fbc86e4b9e3fe1fec22021737ad607cf795224b5d5793069716a7eda8", "enabled": 1 } }, "glance": { "100": { "checksum": "sha256:097520793eaae6b878f14f2b767d40a80007786065abf1b252d898a685957fc5", "enabled": 1 } }, "glusterd": { "100": { "checksum": "sha256:9fa405b48cc837c307845447cc5444caf258bd0bd7f00bb6ba6c9bc623d7b15a", "enabled": 1 } }, "gnome": { "100": { "checksum": "sha256:cc9c1db5bd96467067486759e63ec280a5292165516d258a9a167af19a5d865a", "enabled": 1 } }, "gpg": { "100": { "checksum": "sha256:bf938826905f5d23a97e4adb465766f1dbce18fcc6da669f6e7ff2f287b5fd0f", "enabled": 1 } }, "gpm": { "100": { "checksum": "sha256:f3799d846d2d0bdc1e1d3d075565002b1cda87f798b852691135012b5f545d1b", "enabled": 1 } }, "gpsd": { "100": { "checksum": "sha256:c38a1e2a1cb3c1a8d400f506cc9f3f6dd15d78234f982a04bf4ad8b4f6308937", "enabled": 1 } }, "gssproxy": { "100": { "checksum": "sha256:7d88832619945a1af5fa3b40ad467b816e5a1e24b7f875e10b8c4d75a33d79b3", "enabled": 1 } }, "guest": { "100": { "checksum": "sha256:c0f9f2a0be3fe24b4f74acf4d1924183310b454591b8e8219fccc61c2194f167", "enabled": 1 } }, "hddtemp": { "100": { "checksum": "sha256:82815af35d160bae556a7c6b8862da02b469fbb90ffd5472a6b22c18bf0f1fec", "enabled": 1 } }, "hostapd": { "100": { "checksum": "sha256:ea0ff5fc3de7fc7a93d4b64c0bc54a4040d9427c9eaf8200b97388166cc0dce6", "enabled": 1 } }, "hostname": { "100": { "checksum": "sha256:17a1021557fb77a340af451c33cf80f05ab309a719778d84813d8ab67cd18f6f", "enabled": 1 } }, "hsqldb": { "100": { "checksum": "sha256:201698320ad8f5f35391ea84f0aad40c26ba8db2fe006b38c7b65b8dc71f45b7", "enabled": 1 } }, "hwloc": { "100": { "checksum": "sha256:a5454f4868d22b9ae798ee6f85df6d8b6b35b456617a02084af47d393322ab76", "enabled": 1 } }, "hypervkvp": { "100": { "checksum": "sha256:c89e58ec9c52039bc7ddfd048fa5b287780a48f78e70ce64181607ad33da6c47", "enabled": 1 } }, "ibacm": { "100": { "checksum": "sha256:b7ab0f90999e86d8fff9de3abeca0f093f2bf9f73330d434e947fdb5bfcc63b3", "enabled": 1 } }, "ica": { "100": { "checksum": "sha256:bd8eccb4266130eddfae34318855e6aa3a8624998f0a906a970a04802510ad86", "enabled": 1 } }, "icecast": { "100": { "checksum": "sha256:3199646b83c13466308cbe243d6c75f828875ab0dfca1deb585dcd7bfb28ae60", "enabled": 1 } }, "inetd": { "100": { "checksum": "sha256:0e2a80d0943b3fea8756b2d487038182ed624dd36e9177305fafbc1ca0374af3", "enabled": 1 } }, "init": { "100": { "checksum": "sha256:358dbf02ed7fb53d0f7abafebad35d5bad7a2cf830676a1fcdf93055569b1c21", "enabled": 1 } }, "inn": { "100": { "checksum": "sha256:d60fd47ec62ca3bca272ca38d8fe86fce36d6f04c44d01410d30428c8b914861", "enabled": 1 } }, "insights_client": { "100": { "checksum": "sha256:c4e83821f0150988e86e714cf9e088ef11abd0a30dada90cb7e68a95c406a921", "enabled": 1 } }, "iodine": { "100": { "checksum": "sha256:6185225b7c5910d94396a18d93429704434d564b4478aeea9d1a8c54f6b12834", "enabled": 1 } }, "iotop": { "100": { "checksum": "sha256:7db93e52e29915a08c82b0ebac02856e9a484923d2ce215946c8f54cf5297704", "enabled": 1 } }, "ipmievd": { "100": { "checksum": "sha256:ea599044fe2e2d25c8078ac0b5b0d2c589b4410dd7958eefdc0217f8b8c8c3d0", "enabled": 1 } }, "ipsec": { "100": { "checksum": 
"sha256:c5b3f0780b99bd8773d7723c5f2a3ff188d36888611ffb524151c9943c8c5d5d", "enabled": 1 } }, "iptables": { "100": { "checksum": "sha256:e315834c2a287c9b9f8d31a6fb5f62360241fab584b68ce05984035ff67f01dc", "enabled": 1 } }, "irc": { "100": { "checksum": "sha256:1ed937287398d60e302478f4880a25c1aef605a5d370c1f583426c59067d8ed5", "enabled": 1 } }, "irqbalance": { "100": { "checksum": "sha256:12e969265cff2e8d4ee9be8b99da0bd47138ff8754039dd1387a14b2a4a03b79", "enabled": 1 } }, "iscsi": { "100": { "checksum": "sha256:6666da694f33e444225327d9fc58208b230a1e1244e05335e16217c507ac0627", "enabled": 1 } }, "isns": { "100": { "checksum": "sha256:85a313d121f764081dc92c4c1ed5773b8cb7a7113b84b66339a48f76c57efe54", "enabled": 1 } }, "jabber": { "100": { "checksum": "sha256:08288d791f1b1aa3a8f154ae7b2ff73d574560bf5bb82246020baa3c26f6c988", "enabled": 1 } }, "jetty": { "100": { "checksum": "sha256:6aea2d07d8f040434ff0c0d5e9b81e825b1de56cd01c0e0a069ee1f0d5196885", "enabled": 1 } }, "jockey": { "100": { "checksum": "sha256:f35d7f2d61175ba7f908459f738c7d6b6adf2ae678d88531fa527c4d50b21985", "enabled": 1 } }, "journalctl": { "100": { "checksum": "sha256:6ba4c0331da8031107dba9c447502f4809d0392a56de2c9d4ce8c333c14f1627", "enabled": 1 } }, "kafs": { "100": { "checksum": "sha256:5a35385072b421cdb86fc66190dc00e6c8e7a587a60ffeedfe2a931b6b92a11d", "enabled": 1 } }, "kdump": { "100": { "checksum": "sha256:370090682ef9410c2d12556dfd1cb6f8f7f7ab60287148ead03b3b11d64146e0", "enabled": 1 } }, "kdumpgui": { "100": { "checksum": "sha256:c6d5f08d82062fe0634f0dcc9ee434fcea7760141740b79cbae187d970984a0c", "enabled": 1 } }, "keepalived": { "100": { "checksum": "sha256:6fa6bedd26d8e4b178a2d74158634f26fc2b322782e4685d69ca48f4854e037b", "enabled": 1 } }, "kerberos": { "100": { "checksum": "sha256:e120007ccfb6133d0bd394a6d93bf6c27b39e61feb2abf6bfdaef8fe2ffc2a54", "enabled": 1 } }, "keyboardd": { "100": { "checksum": "sha256:4eae5ab2b926695b3ed881a26e7b6a79f4dd48c1ffa3e3c6809cfa4958b54091", "enabled": 1 } }, "keystone": { "100": { "checksum": "sha256:6d9a8c7420b3aee2c752119ef6d72dc828cd76ea4f535bdcd44461a3cbb43454", "enabled": 1 } }, "keyutils": { "100": { "checksum": "sha256:565423cfb2b311894224451ce1a281beb84b85e347897a4402da63ee9a141d91", "enabled": 1 } }, "kismet": { "100": { "checksum": "sha256:30ba7e91c6fe3cff8ac488e6160a5037fb8f1719275991ec44e5bfcbd9089aca", "enabled": 1 } }, "kmscon": { "100": { "checksum": "sha256:f84aad2ecda576c41ff72438aaa394122fe6f13d274adb17a2cf145718776969", "enabled": 1 } }, "kpatch": { "100": { "checksum": "sha256:16b2c313e69ba9a2726025ee54bbfb638c633f91d53db454e204adef45e5b5e1", "enabled": 1 } }, "ksmtuned": { "100": { "checksum": "sha256:328ce002901ec9be77a5035fe50b01b9e01b016128efb981a7fb0aa586b55a77", "enabled": 1 } }, "ktalk": { "100": { "checksum": "sha256:311f6dd8403d55da5f3173593f5fa6331d4cf7570af2b9d0182f7b260ba7285e", "enabled": 1 } }, "l2tp": { "100": { "checksum": "sha256:cc0faedcac2d86171aeb9a89ec83f2d03cb905f54296f6f085415fcf7ddd1a06", "enabled": 1 } }, "ldap": { "100": { "checksum": "sha256:f1a7908800f08a04aade8083ad79d002aba0c85f1699f35ded5a0eaf46dbc41a", "enabled": 1 } }, "libraries": { "100": { "checksum": "sha256:91a8d1cb1ea2e44b9395cb7a44776e7992983378fbc72dc3725d0e03fa673a40", "enabled": 1 } }, "likewise": { "100": { "checksum": "sha256:c83754418ce993f0d9dff4ad73f0e64362ceb3ed073a093dcdaec0ac6d8c8b5a", "enabled": 1 } }, "linuxptp": { "100": { "checksum": "sha256:6a6b3d623e27e1e8a1ce185ba3ca42d3c7fb5dcad8cc91ab54950a1f69853dd7", "enabled": 1 } }, "lircd": { "100": { "checksum": 
"sha256:564e3801937503b310cd4fef750cea8bd1be7ce323cf41276cfb0eedb0fbdc8c", "enabled": 1 } }, "livecd": { "100": { "checksum": "sha256:9070ba1f03a04a5906bafb1399a0eb181dfdeab868dfb021505c116c61decbe3", "enabled": 1 } }, "lldpad": { "100": { "checksum": "sha256:26feff7ac4722e995c622b4ec3cc242deeda12f88077bff1b419193ed76ffef1", "enabled": 1 } }, "loadkeys": { "100": { "checksum": "sha256:ee0cf6f2b19c24b7bd76829de7de00b60f67ae9d96f2ebcb8eb09b961b6dd751", "enabled": 1 } }, "locallogin": { "100": { "checksum": "sha256:56bf0d727637f1964d5f71a7ccc5edba76cfd50d2e546e9558d0551ffc968858", "enabled": 1 } }, "lockdev": { "100": { "checksum": "sha256:3cd6822d612f355cc8000823f47b7ab0087035b59b57904c145db627cc58d22c", "enabled": 1 } }, "logadm": { "100": { "checksum": "sha256:6b69c0a0528321f39311d34a94e750c5cfffd9bbcc00dd3c06f445f2a7bad660", "enabled": 1 } }, "logging": { "100": { "checksum": "sha256:504f0b31266cd6c9adfe480b750be32de121dafae4acca6aa320d9a50d6c613e", "enabled": 1 } }, "logrotate": { "100": { "checksum": "sha256:5ad554f859ea8313b48b7592cb1d1c3e13884199c9d9f939a99bb0651cadc5bf", "enabled": 1 } }, "logwatch": { "100": { "checksum": "sha256:2ef8bb580d1bb1de08940ad1cd29d82f66bf03fb31c361716e8465b0f05db62f", "enabled": 1 } }, "lpd": { "100": { "checksum": "sha256:fe9e6623bfd94678b32ba9a20c3e04469aa69a5175a1f626368c1fcacf7c5026", "enabled": 1 } }, "lsm": { "100": { "checksum": "sha256:3979d77408c282fc3abfe32908df4889d7ff0ee8096ccb68a6a91d44bebfa56a", "enabled": 1 } }, "lttng-tools": { "100": { "checksum": "sha256:b216fc3f1025723b5bf96002b39dcb9ae8c80c6d6708dc2db348f419af025a45", "enabled": 1 } }, "lvm": { "100": { "checksum": "sha256:e67d95db5bc472d9d8a819db5c947190543bb6eb971955dde57e5249655033e1", "enabled": 1 } }, "mailman": { "100": { "checksum": "sha256:0abc78cfd35d9e43c0c21983a250f874ee7ca7c10d7eb1c6e0d3106f538cf482", "enabled": 1 } }, "mailscanner": { "100": { "checksum": "sha256:b76bb64b5c5dd0db6346ae7ba19faed97ed414c7d03d7b8747e0919867a7b8c9", "enabled": 1 } }, "man2html": { "100": { "checksum": "sha256:3269314b5d8bbfc843900344d65e604432222bebc7e2dd628abb9d186fca1cd1", "enabled": 1 } }, "mandb": { "100": { "checksum": "sha256:c5c6c2cfc4930bb865330ac4bf1e04976290772b0e058cce0d73de071ccf7819", "enabled": 1 } }, "mcelog": { "100": { "checksum": "sha256:7a137918a31fc7684a8ae929a5046517be97399b2534a6b471e3cf3bd7f355b3", "enabled": 1 } }, "mediawiki": { "100": { "checksum": "sha256:4fb79885af1e12d162aaea04067c9dbd19c248b8bc6bb4fa91240acb6986c84c", "enabled": 1 } }, "memcached": { "100": { "checksum": "sha256:10e59adbf0210a83d2506cdcc189bed58fbb49f01ecc07afe00a5fc06a0f025c", "enabled": 1 } }, "milter": { "100": { "checksum": "sha256:686b92d2ed35a14042a3d6d95af167aa0b1bdb0bb1d1894909c3d4b5ad70ca34", "enabled": 1 } }, "minidlna": { "100": { "checksum": "sha256:6b200b3b9aafc62c019eba4121e1743ef9670896b70f8b5201caa19b975d6964", "enabled": 1 } }, "minissdpd": { "100": { "checksum": "sha256:33b06281053cbe0c55752bfcda928a48a50743eb2d061d429b07ec3b0dc28a04", "enabled": 1 } }, "mip6d": { "100": { "checksum": "sha256:1c657a9f8eeacab2a91a9725d7c5980624ab90e0e5c67c2f9270df3b0ebc9fca", "enabled": 1 } }, "mirrormanager": { "100": { "checksum": "sha256:5b9141bfd8861fae68d1054e5c15d503815c2191e68348405d5f8f2e53757b5f", "enabled": 1 } }, "miscfiles": { "100": { "checksum": "sha256:4d0081333a7d5871e89e8955ab640c3374bf2bde197557cb6a83e122aee9137e", "enabled": 1 } }, "mock": { "100": { "checksum": "sha256:a03da1b4a0188f7ed4057cc37b734745624b265fdad71ed6c2d00467b72a4846", "enabled": 1 } }, "modemmanager": { "100": 
{ "checksum": "sha256:b618ed40113167b489f2e3ba560d1b2fe3013f15cffaa4d7263ec04f77bc67e5", "enabled": 1 } }, "modutils": { "100": { "checksum": "sha256:03fa2b1f672c7902f9c88bc3048c996ed5189bf6a965ffbae30b2cc555662aa6", "enabled": 1 } }, "mojomojo": { "100": { "checksum": "sha256:ce4c112a710ad3b571ab733fbf785a6ac7b9eeb9c29f3ced5f994965e9386ab4", "enabled": 1 } }, "mon_statd": { "100": { "checksum": "sha256:db72ccb5d6952aecb9188c455e70a0b8c6feb25946b44dd0884942e74074a65a", "enabled": 1 } }, "mongodb": { "100": { "checksum": "sha256:a5753efd9c79243f87ee59feb5c3914e9c6a435f70a053ff59657baa91f0c8a4", "enabled": 1 } }, "motion": { "100": { "checksum": "sha256:13c108e4971acbc506790783af8287ab766941ef3745594a69e020dfa769e75d", "enabled": 1 } }, "mount": { "100": { "checksum": "sha256:865539a39bf40f78356e8b2cbb08b24926913db3a08fa9796e4c2543882aeb25", "enabled": 1 } }, "mozilla": { "100": { "checksum": "sha256:c8f4b2cd46adef88902943506de8358f988c8e867e681fb9029f7f8e4bcc3a2c", "enabled": 1 } }, "mpd": { "100": { "checksum": "sha256:2b0acf523137d68497ea06f19b5494cb5c7ad6e4102fd7c626b9b74fef062f19", "enabled": 1 } }, "mplayer": { "100": { "checksum": "sha256:b68a6f0cf3b61e296607fd8557480c3706ee0d211722c53708b8605d387bb434", "enabled": 1 } }, "mptcpd": { "100": { "checksum": "sha256:3d4452b8e5b08d56b16c141515f2169809ab160544da9164cd91c3872fa5ca59", "enabled": 1 } }, "mrtg": { "100": { "checksum": "sha256:6757ab3ee1b84340b5a58935d090b926ffc96f43d03c9cb243802b4d01d5a29b", "enabled": 1 } }, "mta": { "100": { "checksum": "sha256:8dd4e90b95311cc5dae5b11733cf9ff8be46637d84c5aec28ae62a0b11dc452d", "enabled": 1 } }, "munin": { "100": { "checksum": "sha256:403d5e09c6c84eee2d3c80a1c50502d8850469e5c9d74347aacc1c8f7496861e", "enabled": 1 } }, "mysql": { "100": { "checksum": "sha256:2d19d6dc11b839551e1a98aa95588c06a3e1dd84dcdfe615bf61f26dd9ef31a1", "enabled": 1 } }, "mythtv": { "100": { "checksum": "sha256:f260b769c9d70fe26cddea75c71e1c3d16348233c40a3b8a844358d138c19d2f", "enabled": 1 } }, "naemon": { "100": { "checksum": "sha256:9c21b6bd9db730c7b79a4852236f92228282eb2dd06a2ff722b36acf70553386", "enabled": 1 } }, "nagios": { "100": { "checksum": "sha256:2681295ba94471abc9d7d49b6da5d757409c6c95745586d1671a4765ed61b130", "enabled": 1 } }, "namespace": { "100": { "checksum": "sha256:cc65424c4edcef752cf3d9223a0a49d84f7250bbc9c42d08d0b5727e0168dff6", "enabled": 1 } }, "ncftool": { "100": { "checksum": "sha256:9795f4fc6fa6e09a6976b87c80aab11801ae7b43d4f0df1b603b8799f4ff0115", "enabled": 1 } }, "netlabel": { "100": { "checksum": "sha256:229788201a8048a7a80661a258cabee570ce65b4c5fef318a86f8ba7f027975d", "enabled": 1 } }, "netutils": { "100": { "checksum": "sha256:78163350e82824e0c5bd14daf11f9a12833f34ac2a9c5c90c7949aaf9c7f9b1a", "enabled": 1 } }, "networkmanager": { "100": { "checksum": "sha256:90f6997a4b239629eded8b4615f91fc81f1bae33af97fcaad09c1cc3012c6e0e", "enabled": 1 } }, "ninfod": { "100": { "checksum": "sha256:13dcb24aa31902b75b5416a1f2d3dbba243445fa8ae1581ea41029ca21cdbc60", "enabled": 1 } }, "nis": { "100": { "checksum": "sha256:51dded020c93ce723bb8cd739359983f6ed82b5d70e93980d2fd095aaadcc168", "enabled": 1 } }, "nova": { "100": { "checksum": "sha256:000c5853f5261b435c8a0362720ce04a1ed300aed5f22dfee9659e8a024466a0", "enabled": 1 } }, "nscd": { "100": { "checksum": "sha256:9fb6cd0d5e35bcfcd051889e81e2834b4b50861455a0d8ee30920df919d3067a", "enabled": 1 } }, "nsd": { "100": { "checksum": "sha256:c940ff7748160326587bf9084d5b7906d501ecf9576707bc19f20ee759023442", "enabled": 1 } }, "nslcd": { "100": { "checksum": 
"sha256:a1af29229de29bfe64bde5357ea9e2a47d5b82f8329f62be4ddbc184fb8ead22", "enabled": 1 } }, "ntop": { "100": { "checksum": "sha256:20d8e86dcf15719ed0b481b0f5cd521b2ada532e712c6e1cb9f89e1736466ea0", "enabled": 1 } }, "ntp": { "100": { "checksum": "sha256:74dfbd9db80b05cc69c090005f02ab089695f21c0e0b70c2a7d8300247624e1e", "enabled": 1 } }, "numad": { "100": { "checksum": "sha256:eca77e7334e62b4b1a7186d51a9817b9259eb72dde54df6a23c83c6a20e26546", "enabled": 1 } }, "nut": { "100": { "checksum": "sha256:028a9e891f008817a604f55f77416af67a4332393386af9624aa99c20742a570", "enabled": 1 } }, "nvme_stas": { "100": { "checksum": "sha256:873f782179ba1a156db5c05a22e3391d65ac4748427d3d19d094ea8906e10fdd", "enabled": 1 } }, "nx": { "100": { "checksum": "sha256:dd19058574329f7c7ce709bc94f6c4be87028cdd184cc365a61d5c0113b78bdf", "enabled": 1 } }, "obex": { "100": { "checksum": "sha256:649c497ab74a203064009d553b42829aac89ac5c4273b7c0ccc0a23530001fcc", "enabled": 1 } }, "oddjob": { "100": { "checksum": "sha256:4c0624c2e3095eb392ee93e8cf98a2242598b7694de12ae75ceffdc0043d1043", "enabled": 1 } }, "opafm": { "100": { "checksum": "sha256:06c9fb3964a855ec2ffd00719ed70b104d40e4b33720e2b109cd22ccc7157b8b", "enabled": 1 } }, "openct": { "100": { "checksum": "sha256:37ac5651f2ce2bfda7d898a024d560c6cdcd54da32409ced32c8b6df059370c3", "enabled": 1 } }, "opendnssec": { "100": { "checksum": "sha256:9890fb3013a4287e6850621c5c6a462a254299b927609fbe6bf67f141862f3c7", "enabled": 1 } }, "openfortivpn": { "100": { "checksum": "sha256:75fa83f68fa800bef996da027f14f6c8f0ded93f93569f3b559495dba39a6176", "enabled": 1 } }, "openhpid": { "100": { "checksum": "sha256:dfbf83e8feccc45bd9de7bd5a3f90edd49c3190b5010f9f98457d995ae974f8c", "enabled": 1 } }, "openshift": { "100": { "checksum": "sha256:1b7ff4ebc979d45777979c0340790dcf10a3e9353272c2e99f6f4f4f93987020", "enabled": 1 } }, "openshift-origin": { "100": { "checksum": "sha256:e8e6bb283142b08b3a1ce7c097f49f402bb6d59eb6d03138ef0b69f3579e466c", "enabled": 1 } }, "opensm": { "100": { "checksum": "sha256:c1bc06d4e5a22837586d3d7ee07922f26b4cd025687cdfd7e3e3789e1c5bdf3d", "enabled": 1 } }, "openvpn": { "100": { "checksum": "sha256:800615bda29f1b49c90de283570ccb018a28ddfff34a36bfe84a4d0485c37138", "enabled": 1 } }, "openvswitch": { "100": { "checksum": "sha256:0bb8334afe603f02ef4f1a46517f970bb30fb191e1d9d944daee33437ae89b46", "enabled": 1 } }, "openwsman": { "100": { "checksum": "sha256:c6d157e74586892234883ad01bb4d2e386637cdd04e45c3587a1cbc8c34e35f9", "enabled": 1 } }, "oracleasm": { "100": { "checksum": "sha256:802090ff2dcc2db35b369ac359e51b1418a5a779cf94e5a14a01a1882b583d64", "enabled": 1 } }, "osad": { "100": { "checksum": "sha256:d3b5dec9cf6640c48875229fcdadc4f9c4bfa1088bff93cb61a18360197a691e", "enabled": 1 } }, "pads": { "100": { "checksum": "sha256:23da314c2255bb1c19dad65a242681330d6bdddc2ba4a0b80ae2ad2249c0a52b", "enabled": 1 } }, "passenger": { "100": { "checksum": "sha256:75e644cc79f5447d17ed5461e52b37f29dc86806320b0ef562b2acd8e2870121", "enabled": 1 } }, "pcmcia": { "100": { "checksum": "sha256:6d1d142817f6c7013fa94819eaae391f98b69357bfb352584c1ef104d93aa08a", "enabled": 1 } }, "pcp": { "100": { "checksum": "sha256:3f0071ca72c2269c6568f2695aea954cbb6a8504efa48a8ed1e94e2c0baac106", "enabled": 1 } }, "pcscd": { "100": { "checksum": "sha256:9bd33415ca0bd34a05461719de21697d486512341069567c26b25cd6a7b06c00", "enabled": 1 } }, "pdns": { "100": { "checksum": "sha256:5dcb629d0e7578776fb9476a8b66ff883fe52da8be37e3b1f7a3bd4810354511", "enabled": 1 } }, "pegasus": { "100": { "checksum": 
"sha256:0312d08cfc7b9ad408ce3d34a27b27c64c62e1b00d280c484616927a859aeb47", "enabled": 1 } }, "permissivedomains": { "100": { "checksum": "sha256:2453bad4ace526f3cf2c60b358e95a5476692ef25da107b10f52f3af27c056d2", "enabled": 1 } }, "pesign": { "100": { "checksum": "sha256:597ab5f0f472ea2f2aca1b3754c2457cc7769f79daaec0856ad2675e53867b80", "enabled": 1 } }, "pingd": { "100": { "checksum": "sha256:877bbc42240491f32726ac44a082942e80724e3bd3985c64845dee49a060a144", "enabled": 1 } }, "piranha": { "100": { "checksum": "sha256:4e9b8ae11387a45c9266a82f79f31728bce6397b8e7f8d39287b78dd58ddfd96", "enabled": 1 } }, "pkcs": { "100": { "checksum": "sha256:81dc1bee85071c65da6d7e4dd12f390fdaeae5d7a285c132c952290152f90c0b", "enabled": 1 } }, "pkcs11proxyd": { "100": { "checksum": "sha256:184ad49c7d752eae216e8e7313fc50952515caf2d4fe77b1315204a436979cbc", "enabled": 1 } }, "pki": { "100": { "checksum": "sha256:611152fa77dc465824018a6b6e2687f2a1e483f92e20d9754c33f8032d7a210d", "enabled": 1 } }, "plymouthd": { "100": { "checksum": "sha256:2ed5421126ee7ac09bddedec68697ddef1d14aeb02432eb766ce64aa364d04f7", "enabled": 1 } }, "podsleuth": { "100": { "checksum": "sha256:ac303c7ac224ff7fec004f7db6bdc6cabc5e9fbf1df3e27ab454f3f87de18b8d", "enabled": 1 } }, "policykit": { "100": { "checksum": "sha256:f651b9a5f05bc1884e58553525034b9fa376084bc3eee5ecd5cb7b70c42a55aa", "enabled": 1 } }, "polipo": { "100": { "checksum": "sha256:e30571ba606507dd4439ea7b455a115170923229f105614f0b044ed07a8ba832", "enabled": 1 } }, "portmap": { "100": { "checksum": "sha256:15e3c27ffaa1d13d785823bd82e53c398e0b5c96c4e0efce502137b326f3710e", "enabled": 1 } }, "portreserve": { "100": { "checksum": "sha256:b1795d2f89c94a43339f1b659f29173491eed1d079a21cc290fb53afb74efd1f", "enabled": 1 } }, "postfix": { "100": { "checksum": "sha256:3a8a7cf3801af09851b2c9c88522c7f7a463b7005189e318088afd59ded88bd2", "enabled": 1 } }, "postgresql": { "100": { "checksum": "sha256:dd827cd694ba2389e77fee7137a741677590dc56dff0015fa6b58b93ab354c7c", "enabled": 1 } }, "postgrey": { "100": { "checksum": "sha256:698a04231cb1ca9a1a657c942f22b26ad57e06af78dbce2006dd9a7991d01b19", "enabled": 1 } }, "ppp": { "100": { "checksum": "sha256:640c55f9dc7e32054fc5dcb34e4080a848526a35cbc4225b32deec213e9906e4", "enabled": 1 } }, "prelink": { "100": { "checksum": "sha256:db21e33d0ffe48bd93a10808ea536532acad2578c499333de859afca9b4bfd5f", "enabled": 1 } }, "prelude": { "100": { "checksum": "sha256:2592c97f035c97e88768678244baa38c031afeb5e94730133a0a59e7b451aa57", "enabled": 1 } }, "privoxy": { "100": { "checksum": "sha256:2f9555f9f047d752841d98608cd372c9f9743fdaa88a59c96a0b2e85f9016b00", "enabled": 1 } }, "procmail": { "100": { "checksum": "sha256:ee1a9203e7dd07d84cfa3c20a2e069bb0552dc430a6d2feed2919e53731b1864", "enabled": 1 } }, "prosody": { "100": { "checksum": "sha256:360e32fc4cb037faf2b5684ce2d376c42cff14785fc95d82fd750fe66dabe1e2", "enabled": 1 } }, "psad": { "100": { "checksum": "sha256:f07697452277aee96c92093916a241aa1e170359e0137e82bed4e84294c71d83", "enabled": 1 } }, "ptchown": { "100": { "checksum": "sha256:861c6b803141841f4756be8531775c7d37e8ec378b2b9e90f37e1932d35e5e36", "enabled": 1 } }, "publicfile": { "100": { "checksum": "sha256:a932f4ee30fe06d891909b328e9ed103a77010e273b19cd7e9debf3ba43204c6", "enabled": 1 } }, "pulseaudio": { "100": { "checksum": "sha256:02a332ab93358096dfeafa4b77d9ca1c8dcf79fe6ab9b18150646f1a2efee4dc", "enabled": 1 } }, "puppet": { "100": { "checksum": "sha256:0ac394773bec589974cc6ecc5f99dc5faf2871e771295cb84f33a86bdbd51a6d", "enabled": 1 } }, "pwauth": { 
"100": { "checksum": "sha256:4cabb0c5c75a395579d7523d3e7616db9fb0e1f40d3b9f581f6e94eebb049810", "enabled": 1 } }, "qatlib": { "100": { "checksum": "sha256:1bb5095ae763a4d56558d3599cdff31fe5ee5f86ebc16049b7c1709bbd6c4665", "enabled": 1 } }, "qmail": { "100": { "checksum": "sha256:7972bb152e68b8fdb1a77c84868b66e420365c9f526254fb272a7263500acbff", "enabled": 1 } }, "qpid": { "100": { "checksum": "sha256:7cd981f1dfd524edc5e5992f10cd1b3e723adcb3cfaf8c7bb42c8f5197e64378", "enabled": 1 } }, "quantum": { "100": { "checksum": "sha256:a25e72eac82204e6200ec843bc06d7b59a1ce7c755666bf1ed9effd12ee466d7", "enabled": 1 } }, "quota": { "100": { "checksum": "sha256:292ac0c56b89fb197298bc0cda18828e0c6c524414d7ccc38133e2552283136f", "enabled": 1 } }, "rabbitmq": { "100": { "checksum": "sha256:6ae09222a0b9aefa3ead90b1f1ee08972570b1f5e39d25c05108228e4d315d50", "enabled": 1 } }, "radius": { "100": { "checksum": "sha256:76e245eff00a30a5db15851804f5047caa072e5f418e44fd75e70fcaba611e73", "enabled": 1 } }, "radvd": { "100": { "checksum": "sha256:b0ab6f58a00a32e8119b2b598d56acdc5919c7d9414eec24beb5eca9d84c90dc", "enabled": 1 } }, "raid": { "100": { "checksum": "sha256:3ab88c070cf4ab980219197d2125cba4272dce884c9d00a673b8aa3b316e1611", "enabled": 1 } }, "rasdaemon": { "100": { "checksum": "sha256:f6d50ea9aa365ffb71b924d04e606e7e7be9fc0ff67899ad1fd348bb209a7ad1", "enabled": 1 } }, "rdisc": { "100": { "checksum": "sha256:d45a4dad3bf9f007f05eca41b5c04d4ae329405b60f307f8d6959fc92dee841e", "enabled": 1 } }, "readahead": { "100": { "checksum": "sha256:338d67a9fa343855efdd6d99bf0d28bdcf03631bdff7591c1a21e975471a510a", "enabled": 1 } }, "realmd": { "100": { "checksum": "sha256:e2efb27c1fed30b949be0f29464b99701353b8eb65bd2da624505c7fc1e586a4", "enabled": 1 } }, "redis": { "100": { "checksum": "sha256:642738508f133e3709992b5542f048755e1ca6bdd1c3eed8fbf6de60135fe754", "enabled": 1 } }, "remotelogin": { "100": { "checksum": "sha256:31f6f5efb0759335de46ac3ab4c8a64955f838afc9743a20f2e81a8cb54cb36a", "enabled": 1 } }, "restraint": { "400": { "checksum": "sha256:b14c8c21fe8ee2121e5817382e83fd2a25699836be8e79269026a2d2494718f4", "enabled": 1 } }, "rhcd": { "100": { "checksum": "sha256:439afb2b8b905d32073d55ee61bb5ebc085ac1dd2c4bbb6b25f90051827dc6d2", "enabled": 1 } }, "rhcs": { "100": { "checksum": "sha256:554186e44a5763b0c4411968a57bf6266fcc4a0eeef5e971449135245df6807f", "enabled": 1 } }, "rhev": { "100": { "checksum": "sha256:4c8dc74bd9e7cebd9f78bae013483cdc6350fb8eb90debbf2888a89a0c30232c", "enabled": 1 } }, "rhgb": { "100": { "checksum": "sha256:4b222d1ae2a6931560eca08e087c56c7835ce79c3f82514c5c6d3e98a7de89b1", "enabled": 1 } }, "rhnsd": { "100": { "checksum": "sha256:12ea9c8c79218c6ef2da68bbe6f321ca6a7247a0f40142f1be8b85cb5b84d54d", "enabled": 1 } }, "rhsmcertd": { "100": { "checksum": "sha256:1f2f53ec9801fa0fff759f87f132d50b84cd6eb10350ac7fb5f0ea3c5fa09121", "enabled": 1 } }, "rhts": { "400": { "checksum": "sha256:e13af8921501fa4e869a20c87963951145bc762c670fc1b996c4858ca430cc71", "enabled": 1 } }, "ricci": { "100": { "checksum": "sha256:c9afadeea7fc4762d923383de22df24de13563a6dbe205b9ab73a6ad0a5cff79", "enabled": 1 } }, "rkhunter": { "100": { "checksum": "sha256:5f31c86e2f2cc425040785cc22a9040c7cdb80bd1145dfd668d2b2597534d6e3", "enabled": 1 } }, "rkt": { "100": { "checksum": "sha256:6154fe3dc060aa2df1f629434a417621e043836c3fc0d1cfe128db80a0c3a5d9", "enabled": 1 } }, "rlogin": { "100": { "checksum": "sha256:2001491066517e9d1718973ef9ce30640101b146abaac53218e9fafc5838eba5", "enabled": 1 } }, "rngd": { "100": { "checksum": 
"sha256:932eb717c50cc3b89b0d39754e8d42285224c7112dc8fb773b979ae335c61b81", "enabled": 1 } }, "rolekit": { "100": { "checksum": "sha256:08975182ac9221887a5cec3f1e51639802794cfa718d6bfe245aa5f132bac74d", "enabled": 1 } }, "roundup": { "100": { "checksum": "sha256:931c928c6bf89625bce5b6cb76084615486f91d6e86a09fcaf007c47c5e68ba7", "enabled": 1 } }, "rpc": { "100": { "checksum": "sha256:2b9fd485fa6fa2df485152c5ecf7b72a27e67db4f16cb86201b360a380a29e1d", "enabled": 1 } }, "rpcbind": { "100": { "checksum": "sha256:3368e81d1e4c4368ed76661f4b636f38651dcc804843c4b1ef0f8ea0eff45d0d", "enabled": 1 } }, "rpm": { "100": { "checksum": "sha256:ac6b3768152d1d94a1ab2c28f23681e16fe3a923e0880d32cc448daacd9cbfcd", "enabled": 1 } }, "rrdcached": { "100": { "checksum": "sha256:340e748f2da66a79889ffe2d77224bce0090a2954743d5c1098ac3cb1eb48866", "enabled": 1 } }, "rshd": { "100": { "checksum": "sha256:58d4cffb8b490c67aa4b1a335b0a3b933af4dfe6cc022fa0bc7400841d1efba7", "enabled": 1 } }, "rshim": { "100": { "checksum": "sha256:0188c6dcedbe5ab251e3958c40024bbfa40ed2cc84d5fa71aceaa2cb25cd964d", "enabled": 1 } }, "rssh": { "100": { "checksum": "sha256:26f6c19589d58fd23c303ac699697517d6883a9531837ad406e2f09b7507278d", "enabled": 1 } }, "rsync": { "100": { "checksum": "sha256:0aa06de248b996ddd0afc67811e82a96bde2dc7a2c328ecbbf6c6a5c9c780784", "enabled": 1 } }, "rtas": { "100": { "checksum": "sha256:3cf7dfe541071b3898cd76674ccce511cb2b25626b0e5f9e43fb903f928508c5", "enabled": 1 } }, "rtkit": { "100": { "checksum": "sha256:d8e666993d2c3c43a5efc6628d04fed230f380cec2feafc9fb1eb305239ad954", "enabled": 1 } }, "rwho": { "100": { "checksum": "sha256:c66a34fe691a84d52c8dd62ed1e9c8525796ccb3114743202a1d11dab4397c1b", "enabled": 1 } }, "samba": { "100": { "checksum": "sha256:98eb081b4cda2ac0cb8efdb83ed66772e7079fb81091c8a64cb3cfdeccd928c6", "enabled": 1 } }, "sambagui": { "100": { "checksum": "sha256:4c91ce35da807c4194c847e713628c8c5a8c2edabbc32800a9f0245175d61f9a", "enabled": 1 } }, "sandboxX": { "100": { "checksum": "sha256:3af5738f6c839b9e1f918287b3c852c9c7ac8fd2e82588952f8d423f8c0e82c8", "enabled": 1 } }, "sanlock": { "100": { "checksum": "sha256:12903fc32000cd4e9785975352f7cfc11ebb15acfbe277be575edb318b15131f", "enabled": 1 } }, "sasl": { "100": { "checksum": "sha256:5a49a242fb57fbde3ec1d0b7c33432f0beb2992005df650c1387cd727cfc31a8", "enabled": 1 } }, "sbd": { "100": { "checksum": "sha256:95253a6493fcb163e729a0937735e74fce0900975863996e7461a0590c36f2da", "enabled": 1 } }, "sblim": { "100": { "checksum": "sha256:cf43ae0bb1072dfbd3f191147458ab654f4bf36c7e6e52b3ddc5e21d565e527b", "enabled": 1 } }, "screen": { "100": { "checksum": "sha256:5c749a6c028d95eabf66a4af046f252d580a96be344cc6a4cbaf6c41bc3d7052", "enabled": 1 } }, "secadm": { "100": { "checksum": "sha256:645ea8eba12cb23c2f2a651d936cbe29c280bff2b7eecfd52ad99c592cc5a6b3", "enabled": 1 } }, "sectoolm": { "100": { "checksum": "sha256:35e4d2a3208b8c0d74c4016309c3447efac46618ce4209c78af9861f95cf36c6", "enabled": 1 } }, "selinuxutil": { "100": { "checksum": "sha256:64baf197e41fe7d76da2de1aaefa3da468cd99982d7f4078b497ae3968511005", "enabled": 1 } }, "sendmail": { "100": { "checksum": "sha256:b74fad36b6b99dce25b9723b5c83278e4cf0ad7a66b7f2d1788fa930737b6b26", "enabled": 1 } }, "sensord": { "100": { "checksum": "sha256:cf1265945526b166650edb71077e9778d22a9a35e59b12e1edae8b233e8e656b", "enabled": 1 } }, "setrans": { "100": { "checksum": "sha256:95193c003b42fd26e548de51bb8652289fc2fc66bd38f571c6e7173befc6e33d", "enabled": 1 } }, "setroubleshoot": { "100": { "checksum": 
"sha256:60686aef35513ad652ac553180e26c9864b890a3a5f442a091375b00f886b443", "enabled": 1 } }, "seunshare": { "100": { "checksum": "sha256:a8c3633cf363e103e840a32cace7793040117dbc627ae5ebdbb4509f79273cef", "enabled": 1 } }, "sge": { "100": { "checksum": "sha256:a78576b227ed6d26f57c3ee3bceb45529e50e6662914cf22ad89cc4dedce2251", "enabled": 1 } }, "shorewall": { "100": { "checksum": "sha256:f7c098e79dd886767fe6d5ae4e564d458a4b1243e99c0c74d8c36432e138599a", "enabled": 1 } }, "slocate": { "100": { "checksum": "sha256:763088f3e0392356a8bad6a06d56305a49c73690573a0c1e3d7cd8e7a7583f2e", "enabled": 1 } }, "slpd": { "100": { "checksum": "sha256:ba397cdd0a7a730fc70c7cad3cbe4ed2475c670b65ee5f9914aeb08b58200845", "enabled": 1 } }, "smartmon": { "100": { "checksum": "sha256:8ef416256baedc5250ebe2cee413dec8082c9dcb2e9d11a4efa164326551eae8", "enabled": 1 } }, "smokeping": { "100": { "checksum": "sha256:81de53921f1a522847961acb56f77dbad54284bcf8841f52afe11ced904a20d7", "enabled": 1 } }, "smoltclient": { "100": { "checksum": "sha256:95c7ee276c5baae8d1a63c94a349ea997f421cebdf73d013ab4b90b34d986fd0", "enabled": 1 } }, "smsd": { "100": { "checksum": "sha256:d871da983b281e8a8daec281efefbc6f6bf1ce5d5cbea4b352b49a08b0d89c77", "enabled": 1 } }, "snapper": { "100": { "checksum": "sha256:4e5c5d73145bbed79c08e82d0a21a965d8cb7a92db76b202b5d134fa42c95fe1", "enabled": 1 } }, "snmp": { "100": { "checksum": "sha256:af32d5989dfb265da12ba1be45f553eb0cca6ad12097540b811f7c30fbb208fe", "enabled": 1 } }, "snort": { "100": { "checksum": "sha256:b1696f79ceb514e5ffed2caea4f90f783c94ca4a73a22f278e48438182d4f07e", "enabled": 1 } }, "sosreport": { "100": { "checksum": "sha256:8059bae88c09a10f717526e476d1a4007db637fea6adc08fad6b17449dbc9e4d", "enabled": 1 } }, "soundserver": { "100": { "checksum": "sha256:3d50318423313efcf8160d08a3bd655e53f34b30374d9dc30ad5bea5cc7e0776", "enabled": 1 } }, "spamassassin": { "100": { "checksum": "sha256:4054ceef428a1df14d5fd4c620fce5d039a236500c1368a7a6e4bb07135d307a", "enabled": 1 } }, "speech-dispatcher": { "100": { "checksum": "sha256:b08147b36ed3fa54428c34c8ac8e2781717ecb453c2372760d41a7738b7757da", "enabled": 1 } }, "squid": { "100": { "checksum": "sha256:d6ec108e7f32a9cfe6f024cd5287a9b020783887f6a71e03b113eab3304342ab", "enabled": 1 } }, "ssh": { "100": { "checksum": "sha256:9bd8be75c6db916fa07e119e325784328632d1755a1ae942d528772aeb2d35c2", "enabled": 1 } }, "sslh": { "100": { "checksum": "sha256:5f14c1526eaa63f5176c2e6401410159cacd7f06f89d3c37c6670f5f3193eaa4", "enabled": 1 } }, "sssd": { "100": { "checksum": "sha256:2a1fce6e52ac5a339cda1617ae65ae121a681ab9b88a9ebabcc7afc03a6324ed", "enabled": 1 } }, "staff": { "100": { "checksum": "sha256:817fd51715b8897c09a36169ce0da31ac4493012bf09cb268faf9ae48cb5c359", "enabled": 1 } }, "stalld": { "100": { "checksum": "sha256:f51e2b820929b6b7e3ff75de0c6b944683c0c9572388fd4b55d125d167145d41", "enabled": 1 } }, "stapserver": { "100": { "checksum": "sha256:0a08f155a5545909cceeb2c2221dcee1980385b52a4afd3f8b8f6704617d14a5", "enabled": 1 } }, "stratisd": { "100": { "checksum": "sha256:46468db34c31c668f4b213b8ed14fcfb53e1e183431f6237364acf686d83b8f0", "enabled": 1 } }, "stunnel": { "100": { "checksum": "sha256:b24d3a5bcd4ecd35cd51836670ce97820a48a3669c0d590e716aee30709dc1f7", "enabled": 1 } }, "su": { "100": { "checksum": "sha256:858e8488feb83d85137a778c3b8c5140d9bc7d9b83a43077a23fbfd27091349c", "enabled": 1 } }, "sudo": { "100": { "checksum": "sha256:c274be08afec52e985bfc508199ef983f1f2eab41bf9b72b0921aa0276e47a51", "enabled": 1 } }, "svnserve": { "100": { "checksum": 
"sha256:2a78595b73c7ea25c5b395ec91f18b3dad58002dd8ef3652d69edd5a8c13f3f5", "enabled": 1 } }, "swift": { "100": { "checksum": "sha256:ccacc18e643d0ca081b36d910abb0ad6fae2acdd1f92a52b4fc9004fc31f4677", "enabled": 1 } }, "sysadm": { "100": { "checksum": "sha256:252d43a3b5e1fc1e91699252a7bcd47f56199e7169dfe3023f12ada6770be35d", "enabled": 1 } }, "sysadm_secadm": { "100": { "checksum": "sha256:8b3d56e43b270a3c8e85c91782ceb793a817dadff0af988ed17ec41251b9b315", "enabled": 1 } }, "sysnetwork": { "100": { "checksum": "sha256:e7d47e23523de1fde711e49fde546e1c8eb632409bf15fcf004ad97d8c55bf59", "enabled": 1 } }, "sysstat": { "100": { "checksum": "sha256:e35d3df921d581298273023c20b12e4d1168c249b06e08458a27732a4ff082ce", "enabled": 1 } }, "systemd": { "100": { "checksum": "sha256:8bb4c5ffbe0c0aecb72767f7c2e277a2b0906698b30ddaee84c2fa9c8af9504b", "enabled": 1 } }, "tangd": { "100": { "checksum": "sha256:b09965ae1db2d4aeab2d1c8775897288e6d55224183205ed48002daa816d2bbd", "enabled": 1 } }, "targetd": { "100": { "checksum": "sha256:8d31e97d05e23cbca57938b58d725c33af5e6aaa3e563971a398936cd3a21b36", "enabled": 1 } }, "tcpd": { "100": { "checksum": "sha256:2ea50e3363cefebf4b9bbcb6bcdd9e923da455c54a14c509f8c5ba9ca4da1c5d", "enabled": 1 } }, "tcsd": { "100": { "checksum": "sha256:a4d57e1e23ddfdbf6977fae56c0797b5791bd7a03bc3e7da5ae87d6f9ca870cc", "enabled": 1 } }, "telepathy": { "100": { "checksum": "sha256:98fac790af3d7a87e75899e112ba5d4cd2455261e44b60f1a0d7387ba0e0ad49", "enabled": 1 } }, "telnet": { "100": { "checksum": "sha256:e531681ec043e98c7aa5feeb5948eb339c71c657dd9a93a7dc35909aee56de2b", "enabled": 1 } }, "tftp": { "100": { "checksum": "sha256:7746597ed5d872b63e1e842421771ebacdcde5288bfd6d006346ad554e4799e7", "enabled": 1 } }, "tgtd": { "100": { "checksum": "sha256:e599de319c72ac9f8ca525508c1d2ce54b46a85f378b56b04ff8fa1781250c60", "enabled": 1 } }, "thin": { "100": { "checksum": "sha256:9825f5ea5ecf0720ae08c5fb7a50d3318b3dfb520801cf5ec8c0663364df5a62", "enabled": 1 } }, "thumb": { "100": { "checksum": "sha256:4f6d98b3bc45e22184ef8d39e7960e9ba575b52c8d8b57f18526591be0db8f4d", "enabled": 1 } }, "timedatex": { "100": { "checksum": "sha256:127a1f551291ce8c39e638b32770fdeadda67312a1c16dc1fad7da9b5a4f5c01", "enabled": 1 } }, "tlp": { "100": { "checksum": "sha256:c3135382d1212256d9ad9ce4c893271cc8093256fdfd34e0a37d9912dc75b16d", "enabled": 1 } }, "tmpreaper": { "100": { "checksum": "sha256:5650adc4d1a2c1db0f3733e73fe7b7b9aa4fda69a1d3308e0b59a8c4cf30c5f5", "enabled": 1 } }, "tomcat": { "100": { "checksum": "sha256:59312967f089bd0efa786367f1587ca124a49b7eb0ba7b5f21c2d66bc66e1b1b", "enabled": 1 } }, "tor": { "100": { "checksum": "sha256:210d53406ae0727c0243c597928358124d2c44c16bf33a47767a3ac88227414b", "enabled": 1 } }, "tuned": { "100": { "checksum": "sha256:758f35f5b9f4195f564792cf864a87a9b6e8de155fc2d348d4eec0ca30d8463d", "enabled": 1 } }, "tvtime": { "100": { "checksum": "sha256:2d5107d1b764ddd411e4abf44d7b3a2770c192bf0d0e19c88856d6593f0f2891", "enabled": 1 } }, "udev": { "100": { "checksum": "sha256:acb78447445a0d74b28f8e7b5db170464c56476be775e8d2577b04cb55ce3f69", "enabled": 1 } }, "ulogd": { "100": { "checksum": "sha256:fbd1de444b4bfe75df73aee4560955dac1378eb7e5654d05d849c67db43546b5", "enabled": 1 } }, "uml": { "100": { "checksum": "sha256:c05987c6a9f49b3370c011d79431dc52d6d435e89a577d7d10a8db02587a3786", "enabled": 1 } }, "unconfined": { "100": { "checksum": "sha256:be7e30d2c6b24f9d403c9b96acbf9258310b570cb8bf11fdf05166b4b4144664", "enabled": 1 } }, "unconfineduser": { "100": { "checksum": 
"sha256:8006817bbdf1a8de02fbe47d42fd5a79fd515cbb9b40a49cd5677cd83dc5b466", "enabled": 1 } }, "unlabelednet": { "100": { "checksum": "sha256:1b37d8de7cf505da7e184ad33a2d04904d9fc51d64707b5d7e449b9d615deaa2", "enabled": 1 } }, "unprivuser": { "100": { "checksum": "sha256:e2a1b07953c01d2f93d5fcc82af0564346b1a1079f7eb8110bef1bcd2d475067", "enabled": 1 } }, "updfstab": { "100": { "checksum": "sha256:e698b5479929ff28896c7e38efc410a47b6a5cd38eafd56c0c8de7fa210d16e3", "enabled": 1 } }, "usbmodules": { "100": { "checksum": "sha256:00518ce2d8f4e15cf7186a7467c8710848f2312adbbfb919cd6bed45d40d10d4", "enabled": 1 } }, "usbmuxd": { "100": { "checksum": "sha256:495bbffef9f008519da7918d89fdb853bab29a6c718ed72b13496574d16a18af", "enabled": 1 } }, "userdomain": { "100": { "checksum": "sha256:655a95a2a52d68a40e5a412ac675c6b8bca97ca2ef26cb292d79f1ada5f3d654", "enabled": 1 } }, "userhelper": { "100": { "checksum": "sha256:ec2502e379bebb8c8b29386cb8324f0fe8b3935bf065702a1b0a3cf4224e9798", "enabled": 1 } }, "usermanage": { "100": { "checksum": "sha256:35178826cea105b5cbec665c4032d007e453cc6897c787d14b00700e37ac5cdc", "enabled": 1 } }, "usernetctl": { "100": { "checksum": "sha256:24d457c012fb774bde2ecfb530699e6b4768604525b33f8d3cc99844f53918bb", "enabled": 1 } }, "uucp": { "100": { "checksum": "sha256:5451b95ffd75ed6e3a933bc0518e3dd2e9dfad315b4a6d3b08653469e8168668", "enabled": 1 } }, "uuidd": { "100": { "checksum": "sha256:7e39bde0db4bb71a014b04fe58282e134c94c1e753bd1b15e3e7f64b016bcb51", "enabled": 1 } }, "varnishd": { "100": { "checksum": "sha256:b6a3a26dac9580a586502165719438a72e9d41f72ff3d21eb05225a6d368f296", "enabled": 1 } }, "vdagent": { "100": { "checksum": "sha256:f513668cef5d03425903904f53f97a3640fa138faa4d1c13192d46b752b78c71", "enabled": 1 } }, "vhostmd": { "100": { "checksum": "sha256:bc8ddc6407a2ff415e265da8ff8609013d8d173979235acf4fcd1cbfc313a571", "enabled": 1 } }, "virt": { "100": { "checksum": "sha256:0f01a0f3bf8b449185addaf4d10360b1de985c4eaf0bde302e37ac54624cd73e", "enabled": 1 } }, "virt_supplementary": { "100": { "checksum": "sha256:d80a998844ea68164a2ac00d46b8c5ce51107066dc44a30ab41057a4e3ac259e", "enabled": 1 } }, "vlock": { "100": { "checksum": "sha256:ec8354aba09fd7ab45ed59a737034196f8b6c4df25e6079c4a00985e36b15f92", "enabled": 1 } }, "vmtools": { "100": { "checksum": "sha256:2b632a2cf0d1be1e6bffaf66a320cfd73cd3cb175711cfcab9ff77147838db40", "enabled": 1 } }, "vmware": { "100": { "checksum": "sha256:adf42ce5e6b76b7cfc85d8e6f83c0597ab9cba435011fd779c2202831dfba8dc", "enabled": 1 } }, "vnstatd": { "100": { "checksum": "sha256:fc8312deb1ef563cfbf8da562e6afba6dd16f4adc97e027c294ff06b1f1ef29f", "enabled": 1 } }, "vpn": { "100": { "checksum": "sha256:e1a9309d48b6463d32450421e867adadb08adbd641eaa072dcb5cd3d629f5dae", "enabled": 1 } }, "w3c": { "100": { "checksum": "sha256:6b19f10f4a51f6f1f343b082d4a96c3335b191f00d41152ae090627f727b8360", "enabled": 1 } }, "watchdog": { "100": { "checksum": "sha256:1171d89af3fa7fc84808f2b6f44fa9a6f8c161198fd3e326fa0f6f63b3820aac", "enabled": 1 } }, "wdmd": { "100": { "checksum": "sha256:9d1a40c730927da34e987d2aae6018f328fd6e9bf2d6134644515f50a359eb8c", "enabled": 1 } }, "webadm": { "100": { "checksum": "sha256:f836d4af3abab47d1c5afe5f6a83ca9303506403e75e65f10fe4c4c719dee46a", "enabled": 1 } }, "webalizer": { "100": { "checksum": "sha256:43f69260eb8c399ceb227825b190ee1758a60b1b78052b1fbfe21d5fce8daab5", "enabled": 1 } }, "wine": { "100": { "checksum": "sha256:204369c49480adaf6c3bbee72b72ec17a5fac47821e371ac9b0e633666af050e", "enabled": 1 } }, "wireguard": { 
"100": { "checksum": "sha256:7ce468b04b03ed26fb5ebadc5020c3fd578c678d97337d1d8afa471bf472c6d5", "enabled": 1 } }, "wireshark": { "100": { "checksum": "sha256:e1b5a6483deac005288672619780282eb889108244815c872e59166ae8df750a", "enabled": 1 } }, "xen": { "100": { "checksum": "sha256:e79e6baf86bd76cda73219fd891bd706c38ecb6e37a92c24ba4291b28e8782de", "enabled": 1 } }, "xguest": { "100": { "checksum": "sha256:aa67ee5990009528b3dd86133a60dddabbad37ac073700d62e58e149ef98ed73", "enabled": 1 } }, "xserver": { "100": { "checksum": "sha256:37683b073599cf9a0ceb9c6621515cd7fb56a36719d23fdbb75c9458b2b73da9", "enabled": 1 } }, "zabbix": { "100": { "checksum": "sha256:a7eab2820c4bf5c9e51dd30942fce48426c0e35c25d5c13efda6e9313602a66f", "enabled": 1 } }, "zarafa": { "100": { "checksum": "sha256:b1738ace3c35a58867613fabb433a761136afae86bab322ea4d192436c5b0ddd", "enabled": 1 } }, "zebra": { "100": { "checksum": "sha256:913b9c2802fc6607d811ffc278dff3fb84d81942603cf21e1b4efcebf6a7529e", "enabled": 1 } }, "zoneminder": { "100": { "checksum": "sha256:19a33723d291446ee9617d0120088d7bae884e5a963c48a8afec20fc6bacc4bc", "enabled": 1 } }, "zosremote": { "100": { "checksum": "sha256:0e85101587e037fa8552703dfe40e3c31d4b86d65981d03ca3bcb9f91cde9e06", "enabled": 1 } } }, "selinux_priorities": true }, "changed": false } TASK [fedora.linux_system_roles.selinux : Load SELinux modules] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:115 Saturday 17 August 2024 19:38:36 -0400 (0:00:03.460) 0:08:37.592 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "selinux_modules is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:128 Saturday 17 August 2024 19:38:37 -0400 (0:00:00.286) 0:08:37.879 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree in check mode] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:136 Saturday 17 August 2024 19:38:37 -0400 (0:00:00.164) 0:08:38.043 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.nbde_server : Stat the tangd custom port systemd directory] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:14 Saturday 17 August 2024 19:38:37 -0400 (0:00:00.362) 0:08:38.405 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.nbde_server : Get a list of files in the tangd custom directory] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:19 Saturday 17 August 2024 19:38:38 -0400 (0:00:00.621) 0:08:39.027 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__nbde_server_tangd_dir_stat.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.nbde_server : Manage tangd custom port systemd directory] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:35 Saturday 17 August 2024 19:38:38 -0400 
(0:00:00.152) 0:08:39.180 ******* changed: [managed_node2] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/systemd/system/tangd.socket.d", "secontext": "unconfined_u:object_r:systemd_unit_file_t:s0", "size": 4096, "state": "directory", "uid": 0 } TASK [fedora.linux_system_roles.nbde_server : Create the file with the port entry that we want tangd to listen on] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:44 Saturday 17 August 2024 19:38:39 -0400 (0:00:00.815) 0:08:39.995 ******* changed: [managed_node2] => { "changed": true, "checksum": "cab519df8c21e60fd06ac780e2c7bd41ad441042", "dest": "/etc/systemd/system/tangd.socket.d/override.conf", "gid": 0, "group": "root", "md5sum": "fc727969e0bd264a9cc7f9c6bc56714c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tangd_unit_file_t:s0", "size": 90, "src": "/root/.ansible/tmp/ansible-tmp-1723937919.4058845-168650-273975859625714/.source.conf", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.nbde_server : Set flag to tell main that the port has changed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:53 Saturday 17 August 2024 19:38:40 -0400 (0:00:01.378) 0:08:41.373 ******* ok: [managed_node2] => { "ansible_facts": { "__nbde_server_port_changed": true }, "changed": false } TASK [Ensure the desired port is added to firewalld] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/tangd-custom-port.yml:57 Saturday 17 August 2024 19:38:40 -0400 (0:00:00.542) 0:08:41.600 ******* included: fedora.linux_system_roles.firewall for managed_node2 TASK [fedora.linux_system_roles.firewall : Setup firewalld] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:2 Saturday 17 August 2024 19:38:41 -0400 (0:00:00.565) 0:08:42.142 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for managed_node2 TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2 Saturday 17 August 2024 19:38:42 -0400 (0:00:00.256) 0:08:42.708 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Check if system is ostree] ********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:10 Saturday 17 August 2024 19:38:42 -0400 (0:00:00.681) 0:08:42.965 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.firewall : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:15 Saturday 17 August 2024 19:38:42 -0400 (0:00:00.223) 0:08:43.646 ******* ok: [managed_node2] => { "ansible_facts": { "__firewall_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.firewall : Check if transactional-update exists in /sbin] *** task path:
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:22 Saturday 17 August 2024 19:38:43 -0400 (0:00:00.223) 0:08:43.870 ******* ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.firewall : Set flag if transactional-update exists] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:27 Saturday 17 August 2024 19:38:43 -0400 (0:00:00.628) 0:08:44.499 ******* ok: [managed_node2] => { "ansible_facts": { "__firewall_is_transactional": false }, "changed": false } TASK [fedora.linux_system_roles.firewall : Install firewalld] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:31 Saturday 17 August 2024 19:38:43 -0400 (0:00:00.175) 0:08:44.675 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: firewalld TASK [fedora.linux_system_roles.firewall : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:43 Saturday 17 August 2024 19:38:45 -0400 (0:00:01.507) 0:08:46.182 ******* skipping: [managed_node2] => { "false_condition": "__firewall_is_transactional | d(false)" } TASK [fedora.linux_system_roles.firewall : Reboot transactional update systems] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:48 Saturday 17 August 2024 19:38:45 -0400 (0:00:00.085) 0:08:46.267 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_is_transactional | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Fail if reboot is needed and not set] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:53 Saturday 17 August 2024 19:38:45 -0400 (0:00:00.187) 0:08:46.455 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_is_transactional | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Collect service facts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:5 Saturday 17 August 2024 19:38:45 -0400 (0:00:00.093) 0:08:46.549 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Attempt to stop and disable conflicting services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9 Saturday 17 August 2024 19:38:45 -0400 (0:00:00.085) 0:08:46.634 ******* skipping: [managed_node2] => (item=nftables) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "nftables", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=iptables) => { "ansible_loop_var": "item", "changed": false, "false_condition": "firewall_disable_conflicting_services | bool", "item": "iptables", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=ufw) => { "ansible_loop_var": "item", "changed": false, "false_condition": 
"firewall_disable_conflicting_services | bool", "item": "ufw", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.firewall : Unmask firewalld service] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:22 Saturday 17 August 2024 19:38:46 -0400 (0:00:00.114) 0:08:46.748 ******* ok: [managed_node2] => { "changed": false, "name": "firewalld", "status": { "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus-broker.service sysinit.target dbus.socket polkit.service system.slice basic.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target network-pre.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target ipset.service nftables.service iptables.service ip6tables.service ebtables.service", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld 
--nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14724", "LimitNPROCSoft": "14724", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14724", "LimitSIGPENDINGSoft": "14724", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3288342528", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": 
"replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "disabled", "UtmpMode": "init", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:28 Saturday 17 August 2024 19:38:46 -0400 (0:00:00.638) 0:08:47.387 ******* changed: [managed_node2] => { "changed": true, "enabled": true, "name": "firewalld", "state": "started", "status": { "AccessSELinuxContext": "system_u:object_r:firewalld_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus-broker.service basic.target system.slice sysinit.target dbus.socket 
polkit.service", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "shutdown.target network-pre.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "ipset.service iptables.service ip6tables.service nftables.service ebtables.service shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": 
"4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14724", "LimitNPROCSoft": "14724", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14724", "LimitSIGPENDINGSoft": "14724", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3305324544", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": 
"100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4417", "TimeoutAbortUSec": "45s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "45s", "TimeoutStopFailureMode": "abort", "TimeoutStopUSec": "45s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "disabled", "UtmpMode": "init", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:34 Saturday 17 August 2024 19:38:47 -0400 (0:00:01.263) 0:08:48.650 ******* ok: [managed_node2] => { "ansible_facts": { "__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/bin/python3.12", "__firewall_report_changed": true }, "changed": false } TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:43 Saturday 17 August 2024 19:38:48 -0400 (0:00:00.101) 0:08:48.752 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Tell firewall module it is able to report changed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:55 Saturday 17 August 2024 19:38:48 -0400 (0:00:00.084) 0:08:48.836 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Configure firewall] ***************** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:71 Saturday 17 August 2024 19:38:48 -0400 (0:00:00.122) 0:08:48.959 ******* changed: [managed_node2] => (item={'port': '7500/tcp', 'zone': 'public', 'state': 'enabled', 'immediate': True, 'permanent': True}) => { "__firewall_changed": true, "ansible_loop_var": "item", "changed": true, "item": { "immediate": true, "permanent": true, "port": "7500/tcp", "state": "enabled", "zone": "public" } } TASK [fedora.linux_system_roles.firewall : Gather firewall config information] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:120 Saturday 17 August 2024 19:38:49 -0400 (0:00:00.993) 0:08:49.953 ******* skipping: [managed_node2] => (item={'port': '7500/tcp', 'zone': 'public', 'state': 'enabled', 'immediate': True, 'permanent': True}) => { "ansible_loop_var": "item", "changed": false, "false_condition": "'detailed' in fw[0]", "item": { "immediate": true, "permanent": true, "port": "7500/tcp", "state": "enabled", "zone": "public" }, "skip_reason": "Conditional result was False" } skipping: [managed_node2] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:130 Saturday 17 August 2024 19:38:49 -0400 (0:00:00.150) 0:08:50.103 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "'detailed' in fw[0]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Gather firewall config if no arguments] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:139 Saturday 17 August 2024 19:38:49 -0400 (0:00:00.116) 0:08:50.219 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "firewall == None or firewall | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Update firewalld_config fact] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:144 Saturday 17 August 2024 19:38:49 -0400 (0:00:00.091) 0:08:50.311 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "firewall == None or firewall | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:153 Saturday 17 August 2024 19:38:49 -0400 (0:00:00.092) 0:08:50.403 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Calculate what has changed] ********* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:163 Saturday 17 August 2024 19:38:49 -0400 (0:00:00.088) 0:08:50.492 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__firewall_previous_replaced | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Show diffs] ************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:169 Saturday 17 August 2024 19:38:49 
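
The single firewall item applied above (port 7500/tcp in the public zone, both immediate and permanent) is equivalent to invoking the firewall role directly with a spec like the one below; this is an illustrative sketch using the role's documented firewall variable, not an excerpt from the test playbook:

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.firewall
          vars:
            firewall:
              - port: 7500/tcp     # tangd custom port opened for the Tang server
                zone: public
                state: enabled
                immediate: true    # apply to the running firewalld
                permanent: true    # persist across firewalld reloads and reboots
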
-0400 (0:00:00.087) 0:08:50.579 ******* skipping: [managed_node2] => { "false_condition": "__firewall_previous_replaced | bool" } TASK [fedora.linux_system_roles.nbde_server : Reload the daemons so the new changes take effect] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:34 Saturday 17 August 2024 19:38:50 -0400 (0:00:00.151) 0:08:50.731 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.nbde_server : Ensure required services are enabled and at the right state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:39 Saturday 17 August 2024 19:38:50 -0400 (0:00:00.887) 0:08:51.618 ******* changed: [managed_node2] => (item=tangd.socket) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tangd.socket", "name": "tangd.socket", "state": "started", "status": { "Accept": "yes", "AccessSELinuxContext": "system_u:object_r:tangd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket sysinit.target system.slice", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Backlog": "2147483647", "Before": "shutdown.target sockets.target", "BindIPv6Only": "default", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "Broadcast": "no", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "no", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "DeferAcceptUSec": "0", "Delegate": "no", "Description": "Tang Server socket", "DevicePolicy": "auto", "DirectoryMode": "0755", "Documentation": "\"man:tang(8)\"", "DropInPaths": "/etc/systemd/system/tangd.socket.d/override.conf", "DynamicUser": "no", "ExecStartPre": "{ path=/usr/bin/chown ; argv[]=/usr/bin/chown -R tang:tang /var/db/tang ; ignore_errors=yes ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorName": "tangd.socket", "FinalKillSignal": "9", "FlushPending": "no", "FragmentPath": "/usr/lib/systemd/system/tangd.socket", "FreeBind": "no", "FreezerState": "running", "GID": "[not set]", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPTOS": "-1", "IPTTL": "-1", "Id": "tangd.socket", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeepAlive": "no", "KeepAliveIntervalUSec": "0", "KeepAliveProbes": "0", "KeepAliveTimeUSec": "0", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14724", "LimitNPROCSoft": "14724", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14724", "LimitSIGPENDINGSoft": "14724", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "Listen": "[::]:7500 (Stream)", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "Mark": "-1", "MaxConnections": "64", "MaxConnectionsPerSource": "0", "MemoryAccounting": "yes", "MemoryAvailable": "3339665408", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MessageQueueMaxMessages": "0", "MessageQueueMessageSize": "0", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NAccepted": "0", "NConnections": "0", "NRefused": "0", "NUMAPolicy": "n/a", "Names": "tangd.socket", "NeedDaemonReload": "no", "Nice": "0", "NoDelay": "no", "NoNewPrivileges": "no", 
"NonBlocking": "no", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "PassCredentials": "no", "PassPacketInfo": "no", "PassSecurity": "no", "Perpetual": "no", "PipeSize": "0", "PollLimitBurst": "150", "PollLimitIntervalUSec": "2s", "Priority": "-1", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "ReceiveBuffer": "0", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemoveIPC": "no", "RemoveOnStop": "no", "Requires": "system.slice sysinit.target", "RestartKillSignal": "15", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "ReusePort": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "SameProcessGroup": "no", "SecureBits": "0", "SendBuffer": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "SocketMode": "0666", "SocketProtocol": "0", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "4417", "TimeoutCleanUSec": "infinity", "TimeoutUSec": "45s", "TimerSlackNSec": "50000", "Timestamping": "off", "Transient": "no", "Transparent": "no", "TriggerLimitBurst": "200", "TriggerLimitIntervalUSec": "2s", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "UtmpMode": "init", "WatchdogSignal": "6", "Writable": "no" } } TASK [Create encrypted Stratis pool with Clevis/Tang] ************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:240 Saturday 17 August 2024 19:38:51 -0400 (0:00:01.078) 0:08:52.697 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:38:52 -0400 (0:00:00.284) 0:08:52.981 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK 
[fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:38:52 -0400 (0:00:00.109) 0:08:53.091 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:38:52 -0400 (0:00:00.132) 0:08:53.224 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:38:52 -0400 (0:00:00.177) 0:08:53.401 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:38:52 -0400 (0:00:00.067) 0:08:53.469 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:38:52 -0400 (0:00:00.164) 0:08:53.633 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:38:53 -0400 
(0:00:00.134) 0:08:53.768 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:38:53 -0400 (0:00:00.153) 0:08:53.921 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:38:53 -0400 (0:00:00.271) 0:08:54.193 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 2024 19:38:53 -0400 (0:00:00.125) 0:08:54.318 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": "sda", "encryption": true, "encryption_clevis_pin": "tang", "encryption_password": "yabbadabbadoo", "encryption_tang_url": "localhost:7500", "name": "foo", "type": "stratis" } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:38:53 -0400 (0:00:00.131) 0:08:54.450 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:38:53 -0400 (0:00:00.125) 0:08:54.575 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:38:53 -0400 (0:00:00.084) 0:08:54.659 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:38:54 -0400 (0:00:00.162) 0:08:54.822 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* 
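
The storage_pools value printed above is the complete specification for an encrypted Stratis pool bound to the Tang server that was just set up. Written as a standalone role invocation it would look roughly like this (the password is the throwaway test value from this log; a real play would supply a vaulted secret):

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: stratis
                disks: sda
                encryption: true
                encryption_clevis_pin: tang
                encryption_tang_url: "localhost:7500"
                encryption_password: yabbadabbadoo  # test-only secret; vault this in practice
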
task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:38:54 -0400 (0:00:00.067) 0:08:54.890 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:38:54 -0400 (0:00:00.078) 0:08:54.969 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:38:54 -0400 (0:00:00.133) 0:08:55.103 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:38:54 -0400 (0:00:00.054) 0:08:55.157 ******* changed: [managed_node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo" ], "mounts": [], "packages": [ "stratis-cli", "e2fsprogs", "stratisd" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": "tang", "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": "localhost:7500", "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:39:11 -0400 (0:00:17.065) 0:09:12.222 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:39:11 -0400 (0:00:00.108) 0:09:12.331 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937793.7743778, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937793.773378, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, 
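
Blivet reports two actions for this pool: formatting /dev/sda as Stratis and creating the pool device /dev/stratis/foo, with stratisd and stratis-cli among the required packages. A quick manual follow-up check could use plain command tasks with the stock stratis CLI, as sketched here (this is not taken from verify-role-results.yml):

    - hosts: managed_node2
      tasks:
        - name: List Stratis pools on the managed node
          command: stratis pool list
          register: stratis_pool_list
          changed_when: false
        - name: Show the pool listing
          debug:
            var: stratis_pool_list.stdout_lines
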
"gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937793.773378, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:39:12 -0400 (0:00:00.509) 0:09:12.841 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:39:12 -0400 (0:00:00.615) 0:09:13.457 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:39:13 -0400 (0:00:00.283) 0:09:13.741 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "stratis" }, { "action": "create device", "device": "/dev/stratis/foo", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0", "/dev/stratis/foo" ], "mounts": [], "packages": [ "stratis-cli", "e2fsprogs", "stratisd" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": "tang", "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": "localhost:7500", "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:39:13 -0400 (0:00:00.156) 0:09:13.897 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": "tang", "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": "localhost:7500", "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:39:13 -0400 (0:00:00.181) 0:09:14.078 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:39:13 -0400 (0:00:00.130) 0:09:14.209 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:39:13 -0400 (0:00:00.218) 0:09:14.427 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:39:13 -0400 (0:00:00.221) 0:09:14.648 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:39:14 -0400 (0:00:00.207) 0:09:14.856 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:39:14 -0400 (0:00:00.244) 0:09:15.100 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:39:14 -0400 (0:00:00.175) 0:09:15.276 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for 
changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:39:15 -0400 (0:00:00.529) 0:09:15.806 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:39:15 -0400 (0:00:00.230) 0:09:16.036 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:253 Saturday 17 August 2024 19:39:17 -0400 (0:00:02.371) 0:09:18.408 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:39:17 -0400 (0:00:00.180) 0:09:18.588 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": "tang", "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_tang_thumbprint": null, "encryption_tang_url": "localhost:7500", "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "stratis", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:39:18 -0400 (0:00:00.165) 0:09:18.754 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:39:18 -0400 (0:00:00.153) 0:09:18.907 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/stratis-1-private-507e432992a14f5887b8bc1e397cd961-crypt": { "fstype": "stratis", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-507e432992a14f5887b8bc1e397cd961-crypt", "size": "10G", "type": "crypt", "uuid": "507e4329-92a1-4f58-87b8-bc1e397cd961" }, "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-flex-mdv": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-flex-mdv", "size": "512M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-flex-thindata": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-flex-thindata", "size": "9.5G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-flex-thinmeta": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-flex-thinmeta", "size": "6M", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-physical-originsub": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-physical-originsub", "size": "10G", "type": "stratis", "uuid": "" }, "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-thinpool-pool": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/mapper/stratis-1-private-f0d03ace871d48879010fa6a3ba03a32-thinpool-pool", "size": "9.5G", "type": "stratis", "uuid": "" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f7f248e0-e119-4ca7-8563-c5b6855de484" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", 
"name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:39:18 -0400 (0:00:00.569) 0:09:19.477 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003186", "end": "2024-08-17 19:39:19.155147", "rc": 0, "start": "2024-08-17 19:39:19.151961" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:39:19 -0400 (0:00:00.636) 0:09:20.114 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.004206", "end": "2024-08-17 19:39:20.787789", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:39:19.783583" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:39:20 -0400 (0:00:01.541) 0:09:21.656 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda'], 'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_clevis_pin': 'tang', 'encryption_tang_url': 'localhost:7500', 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'stratis', 'volumes': []}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 
2024 19:39:21 -0400 (0:00:00.301) 0:09:21.957 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:39:21 -0400 (0:00:00.169) 0:09:22.127 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:39:21 -0400 (0:00:00.141) 0:09:22.268 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:39:21 -0400 (0:00:00.122) 0:09:22.390 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:39:21 -0400 (0:00:00.278) 0:09:22.669 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:39:22 -0400 (0:00:00.291) 0:09:22.960 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:39:22 -0400 (0:00:00.115) 0:09:23.076 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:39:22 -0400 (0:00:00.124) 0:09:23.201 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 
2024 19:39:22 -0400 (0:00:00.167) 0:09:23.368 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:39:22 -0400 (0:00:00.179) 0:09:23.547 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:39:23 -0400 (0:00:00.186) 0:09:23.733 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:39:23 -0400 (0:00:00.171) 0:09:23.904 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:39:23 -0400 (0:00:00.175) 0:09:24.080 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:39:23 -0400 (0:00:00.154) 0:09:24.234 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. 
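The verification tasks that follow walk through the member, RAID, encryption, VDO and Stratis checks for the pool created above. For reference, a minimal sketch of a playbook that requests a pool like the one echoed in this log (name foo, disk sda, Clevis pin tang, Tang URL localhost:7500) might look roughly as follows; the play header and the password value are illustrative placeholders and are not taken from this test run:

    - name: Example - create an encrypted Stratis pool (sketch, not the actual test play)
      hosts: managed_node2
      vars:
        storage_pools:
          - name: foo
            type: stratis
            disks:
              - sda
            state: present
            encryption: true
            encryption_password: "<password>"   # placeholder; the real value is hidden by no_log in the run
            encryption_clevis_pin: tang
            encryption_tang_url: localhost:7500
      roles:
        - fedora.linux_system_roles.storage

The clean-up pass later in this log removes the same pool by re-running the role with the pool's state set to absent.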
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:39:24 -0400 (0:00:00.820) 0:09:25.054 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:39:24 -0400 (0:00:00.141) 0:09:25.196 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:39:24 -0400 (0:00:00.235) 0:09:25.432 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:39:24 -0400 (0:00:00.117) 0:09:25.549 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:39:24 -0400 (0:00:00.118) 0:09:25.668 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:39:25 -0400 (0:00:00.123) 0:09:25.791 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:39:25 -0400 (0:00:00.163) 0:09:25.955 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:39:25 -0400 (0:00:00.173) 0:09:26.129 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:39:25 -0400 
(0:00:00.262) 0:09:26.391 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:39:25 -0400 (0:00:00.138) 0:09:26.529 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:39:25 -0400 (0:00:00.138) 0:09:26.668 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.099) 0:09:26.768 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.085) 0:09:26.853 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.074) 0:09:26.927 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.159) 0:09:27.087 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.084) 0:09:27.172 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.352) 0:09:27.525 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check 
member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:39:26 -0400 (0:00:00.104) 0:09:27.630 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:39:27 -0400 (0:00:00.308) 0:09:27.939 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:39:27 -0400 (0:00:00.213) 0:09:28.152 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:39:27 -0400 (0:00:00.114) 0:09:28.267 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:39:27 -0400 (0:00:00.068) 0:09:28.336 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:39:27 -0400 (0:00:00.158) 0:09:28.494 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:39:27 -0400 (0:00:00.163) 0:09:28.657 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:39:28 -0400 (0:00:00.100) 0:09:28.758 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:39:28 -0400 (0:00:00.263) 0:09:29.021 ******* ok: [managed_node2] => { "changed": false, "cmd": [ 
"stratis", "report" ], "delta": "0:00:00.374287", "end": "2024-08-17 19:39:29.083525", "rc": 0, "start": "2024-08-17 19:39:28.709238" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "clevis_config": { "thp": "u1Kl5d_YI5krcLkFd0UqEOlFSKdSmzbxdegneTLjYQc", "url": "localhost:7500" }, "clevis_pin": "tang", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sda", "size": "20938752 sectors", "uuid": "507e4329-92a1-4f58-87b8-bc1e397cd961" } ] }, "filesystems": [], "fs_limit": 100, "name": "foo", "uuid": "f0d03ace-871d-4887-9010-fa6a3ba03a32" } ], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:39:29 -0400 (0:00:00.965) 0:09:29.987 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [ { "available_actions": "fully_operational", "blockdevs": { "cachedevs": [], "datadevs": [ { "blksizes": "base: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes, crypt: BLKSSSZGET: 512 bytes, BLKPBSZGET: 512 bytes", "clevis_config": { "thp": "u1Kl5d_YI5krcLkFd0UqEOlFSKdSmzbxdegneTLjYQc", "url": "localhost:7500" }, "clevis_pin": "tang", "in_use": true, "key_description": "blivet-foo", "path": "/dev/sda", "size": "20938752 sectors", "uuid": "507e4329-92a1-4f58-87b8-bc1e397cd961" } ] }, "filesystems": [], "fs_limit": 100, "name": "foo", "uuid": "f0d03ace-871d-4887-9010-fa6a3ba03a32" } ], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:39:29 -0400 (0:00:00.290) 0:09:30.278 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:39:29 -0400 (0:00:00.198) 0:09:30.476 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:39:30 -0400 (0:00:00.252) 0:09:30.728 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:39:30 -0400 (0:00:00.144) 0:09:30.873 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 
2024 19:39:30 -0400 (0:00:00.123) 0:09:30.996 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:39:30 -0400 (0:00:00.137) 0:09:31.134 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:39:30 -0400 (0:00:00.107) 0:09:31.241 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:39:30 -0400 (0:00:00.105) 0:09:31.347 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:256 Saturday 17 August 2024 19:39:30 -0400 (0:00:00.160) 0:09:31.508 ******* included: fedora.linux_system_roles.storage for managed_node2 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 17 August 2024 19:39:31 -0400 (0:00:00.469) 0:09:31.977 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 17 August 2024 19:39:31 -0400 (0:00:00.350) 0:09:32.328 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 17 August 2024 19:39:31 -0400 (0:00:00.255) 0:09:32.583 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-fs", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "stratisd", "stratis-cli", "{{ 
'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=Fedora_40.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "Fedora_40.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 17 August 2024 19:39:32 -0400 (0:00:00.338) 0:09:32.922 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 17 August 2024 19:39:32 -0400 (0:00:00.145) 0:09:33.067 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 17 August 2024 19:39:32 -0400 (0:00:00.129) 0:09:33.197 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 17 August 2024 19:39:32 -0400 (0:00:00.174) 0:09:33.372 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 17 August 2024 19:39:32 -0400 (0:00:00.177) 0:09:33.550 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 17 August 2024 19:39:33 -0400 (0:00:00.505) 0:09:34.055 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"blivet_available\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 17 August 
2024 19:39:33 -0400 (0:00:00.145) 0:09:34.200 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": "sda", "name": "foo", "state": "absent", "type": "stratis" } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 17 August 2024 19:39:33 -0400 (0:00:00.143) 0:09:34.344 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 17 August 2024 19:39:33 -0400 (0:00:00.233) 0:09:34.577 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 17 August 2024 19:39:34 -0400 (0:00:00.165) 0:09:34.743 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 17 August 2024 19:39:34 -0400 (0:00:00.123) 0:09:34.866 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"packages_installed\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 17 August 2024 19:39:34 -0400 (0:00:00.120) 0:09:34.986 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_skip_checks is not defined or not \"service_facts\" in storage_skip_checks", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 17 August 2024 19:39:34 -0400 (0:00:00.119) 0:09:35.105 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 17 August 2024 19:39:34 -0400 (0:00:00.388) 0:09:35.494 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 17 August 2024 19:39:34 -0400 (0:00:00.130) 0:09:35.625 ******* changed: [managed_node2] => { "actions": [ { "action": 
"destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 17 August 2024 19:39:38 -0400 (0:00:03.375) 0:09:39.001 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 17 August 2024 19:39:38 -0400 (0:00:00.127) 0:09:39.128 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723937793.7743778, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "040ba4405b5492ce3b98ec92daf6841922885fc7", "ctime": 1723937793.773378, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263853, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1723937793.773378, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1366, "uid": 0, "version": "4063150176", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 17 August 2024 19:39:38 -0400 (0:00:00.453) 0:09:39.582 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 17 August 2024 19:39:39 -0400 (0:00:00.447) 0:09:40.029 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 17 August 2024 19:39:39 -0400 (0:00:00.054) 0:09:40.084 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { 
"action": "destroy device", "device": "/dev/stratis/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "stratis" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/xvda2", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 17 August 2024 19:39:39 -0400 (0:00:00.135) 0:09:40.219 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 17 August 2024 19:39:39 -0400 (0:00:00.127) 0:09:40.347 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 17 August 2024 19:39:39 -0400 (0:00:00.249) 0:09:40.596 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 17 August 2024 19:39:40 -0400 (0:00:00.142) 0:09:40.739 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 17 August 2024 19:39:40 -0400 (0:00:00.124) 0:09:40.864 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: 
/tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 17 August 2024 19:39:40 -0400 (0:00:00.159) 0:09:41.024 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 17 August 2024 19:39:40 -0400 (0:00:00.125) 0:09:41.149 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "blivet_output['mounts']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 17 August 2024 19:39:40 -0400 (0:00:00.097) 0:09:41.247 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1723936476.423309, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1723936470.6092691, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 393219, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1722940756.664, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "711642655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 17 August 2024 19:39:40 -0400 (0:00:00.452) 0:09:41.699 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 17 August 2024 19:39:41 -0400 (0:00:00.055) 0:09:41.755 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:266 Saturday 17 August 2024 19:39:43 -0400 (0:00:02.504) 0:09:44.259 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 17 August 2024 19:39:43 -0400 (0:00:00.167) 0:09:44.427 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": 
false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "stratis", "volumes": [] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 17 August 2024 19:39:43 -0400 (0:00:00.108) 0:09:44.536 ******* skipping: [managed_node2] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 17 August 2024 19:39:43 -0400 (0:00:00.088) 0:09:44.625 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/xvda2": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda2", "size": "250G", "type": "partition", "uuid": "fd1e4ecf-9333-45d5-a66d-c903fb23d106" }, "/dev/zram0": { "fstype": "", "label": "", "mountpoint": "[SWAP]", "name": "/dev/zram0", "size": "3.6G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 17 August 2024 19:39:44 -0400 (0:00:00.437) 0:09:45.062 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003275", "end": "2024-08-17 19:39:44.701854", "rc": 0, "start": "2024-08-17 19:39:44.698579" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Aug 6 10:39:16 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fd1e4ecf-9333-45d5-a66d-c903fb23d106 / ext4 defaults 1 1 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 17 August 2024 19:39:44 -0400 (0:00:00.510) 0:09:45.573 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003245", "end": "2024-08-17 19:39:45.257340", "failed_when_result": false, "rc": 0, "start": "2024-08-17 19:39:45.254095" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 17 August 2024 19:39:45 -0400 (0:00:00.526) 0:09:46.100 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 => (item={'disks': ['sda'], 'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'absent', 'type': 'stratis', 'volumes': []}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 17 August 2024 19:39:45 -0400 (0:00:00.397) 0:09:46.497 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 17 August 2024 19:39:45 -0400 (0:00:00.137) 0:09:46.635 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 17 August 2024 19:39:46 -0400 (0:00:00.131) 0:09:46.767 ******* skipping: [managed_node2] => { 
"changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 17 August 2024 19:39:46 -0400 (0:00:00.119) 0:09:46.886 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 => (item=members) included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 17 August 2024 19:39:46 -0400 (0:00:00.262) 0:09:47.148 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 17 August 2024 19:39:46 -0400 (0:00:00.122) 0:09:47.270 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 17 August 2024 19:39:46 -0400 (0:00:00.104) 0:09:47.375 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 17 August 2024 19:39:46 -0400 (0:00:00.119) 0:09:47.494 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 17 August 2024 19:39:47 -0400 (0:00:00.289) 0:09:47.784 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 17 August 2024 19:39:47 -0400 (0:00:00.126) 0:09:47.910 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 17 August 2024 19:39:47 -0400 (0:00:00.124) 0:09:48.035 ******* skipping: 
[managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 17 August 2024 19:39:47 -0400 (0:00:00.125) 0:09:48.161 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 17 August 2024 19:39:47 -0400 (0:00:00.124) 0:09:48.285 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 17 August 2024 19:39:47 -0400 (0:00:00.105) 0:09:48.390 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: True STDERR: OpenSSH_9.6p1, OpenSSL 3.2.1 30 Jan 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.203 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.203 originally 10.31.44.203 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2d9356a4cd' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.203 closed. 
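(For reference: the bare "True" on stdout in the "Check that blivet supports PV grow to fill" task above is produced by an interpreter probe run through the command module. Below is a minimal sketch of that pattern; the exact blivet symbol being probed and the register name are assumptions for illustration, not taken from test-verify-pool-members.yml.)

    - name: Check that blivet supports PV grow to fill   # sketch of the pattern only
      ansible.builtin.command:
        argv:
          - python3
          - -c
          # assumed probe; the role may inspect a different attribute or module path
          - "from blivet.formats.lvmpv import LVMPhysicalVolume; print(hasattr(LVMPhysicalVolume, 'grow_to_fill'))"
      register: storage_test_grow_support   # hypothetical variable name
      changed_when: false

A check written this way reports ok without marking the host changed, which matches the "changed": false, rc 0 result recorded above.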
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.475) 0:09:48.866 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.077) 0:09:48.944 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.123) 0:09:49.068 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.155) 0:09:49.223 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.065) 0:09:49.289 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.065) 0:09:49.354 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.066) 0:09:49.420 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.064) 0:09:49.485 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 17 August 2024 19:39:48 -0400 
(0:00:00.064) 0:09:49.550 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.065) 0:09:49.616 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 17 August 2024 19:39:48 -0400 (0:00:00.064) 0:09:49.680 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.064) 0:09:49.745 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.230) 0:09:49.976 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.085) 0:09:50.061 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.129) 0:09:50.190 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.054) 0:09:50.245 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.185) 0:09:50.430 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check 
member encryption] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 17 August 2024 19:39:49 -0400 (0:00:00.102) 0:09:50.532 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 17 August 2024 19:39:50 -0400 (0:00:00.419) 0:09:50.952 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 17 August 2024 19:39:50 -0400 (0:00:00.176) 0:09:51.128 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 17 August 2024 19:39:50 -0400 (0:00:00.101) 0:09:51.230 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 17 August 2024 19:39:50 -0400 (0:00:00.103) 0:09:51.333 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 17 August 2024 19:39:50 -0400 (0:00:00.128) 0:09:51.461 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 17 August 2024 19:39:51 -0400 (0:00:00.248) 0:09:51.710 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 17 August 2024 19:39:51 -0400 (0:00:00.107) 0:09:51.818 ******* included: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 17 August 2024 19:39:51 -0400 (0:00:00.216) 0:09:52.034 ******* ok: [managed_node2] => { "changed": false, "cmd": [ 
"stratis", "report" ], "delta": "0:00:00.379799", "end": "2024-08-17 19:39:52.062016", "rc": 0, "start": "2024-08-17 19:39:51.682217" } STDOUT: { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.918) 0:09:52.953 ******* ok: [managed_node2] => { "ansible_facts": { "_stratis_pool_info": { "name_to_pool_uuid_map": {}, "partially_constructed_pools": [], "path_to_ids_map": {}, "pools": [], "stopped_pools": [] } }, "changed": false } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.092) 0:09:53.045 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.096) 0:09:53.142 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.067) 0:09:53.209 ******* skipping: [managed_node2] => { "changed": false, "false_condition": "storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.067) 0:09:53.277 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.069) 0:09:53.346 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.072) 0:09:53.418 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.054) 
0:09:53.473 ******* skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 17 August 2024 19:39:52 -0400 (0:00:00.052) 0:09:53.525 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } PLAY RECAP ********************************************************************* managed_node2 : ok=913 changed=25 unreachable=0 failed=0 skipped=1271 rescued=0 ignored=0 Saturday 17 August 2024 19:39:53 -0400 (0:00:00.274) 0:09:53.800 ******* =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 90.70s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 17.07s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.63s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.44s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.36s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.35s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.18s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.55s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 3.78s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.selinux : Get SELinux modules facts ----------- 3.46s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:112 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 3.38s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Gathering Facts --------------------------------------------------------- 3.37s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/tests/storage/tests_stratis.yml:2 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 3.31s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 3.15s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.nbde_server : Ensure tang is installed -------- 2.96s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/nbde_server/tasks/main-tang.yml:2 fedora.linux_system_roles.storage : Update facts ------------------------ 2.70s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 fedora.linux_system_roles.storage : Update facts ------------------------ 2.61s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 fedora.linux_system_roles.storage : Get service facts ------------------- 2.61s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 fedora.linux_system_roles.storage : Update facts ------------------------ 2.58s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 fedora.linux_system_roles.storage : Update facts ------------------------ 2.58s /tmp/collections-43F/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222
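(For reference: the final verification pass in this run corresponds to removing the encrypted Stratis pool "foo" from disk sda, per the pool parameters printed by "Print out pool information" earlier. Below is a minimal sketch of such an invocation using the role's documented storage_pools interface; the task name and the include_role form are illustrative and not copied from tests_stratis.yml.)

    - name: Remove the encrypted Stratis pool   # illustrative task name
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: stratis
            disks:
              - sda
            encryption: true
            state: absent

With state: absent the role tears the pool down, which is why the later "stratis report" output shows an empty "pools": [] list and the checks that only apply to a present pool are skipped.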