Bug #3143
assertion "0" failed in hammer2_inode_xop_chain_sync
Description
This happened on DragonFly 5.2.2.
chain 00000000441da00a.01 key=0000000000008641 meth=30 CHECK FAIL (flags=00140002, bref/data 830e0b4c1fbfa512/9f5e0aac2b2788b2)
panic: assertion "0" failed in hammer2_inode_xop_chain_sync at /usr/src/sys/vfs/hammer2/hammer2_inode.c:1775
cpuid = 0
Trace beginning at frame 0xfffff9008a525920
panic() at panic+0x236 0xffffffff805f8666
panic() at panic+0x236 0xffffffff805f8666
hammer2_inode_xop_chain_sync() at hammer2_inode_xop_chain_sync+0x249 0xffffffff808bfde9
hammer2_primary_xops_thread() at hammer2_primary_xops_thread+0x26b 0xffffffff808be0fb
Debugger("panic")
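
For reference, that panic string is just DragonFly's KKASSERT() macro firing: the macro stringifies its expression, so a bare KKASSERT(0) on an error path the code cannot recover from prints exactly assertion "0" failed. A minimal userland sketch of the pattern (panic() here is a stand-in for the kernel's, and the error path is an illustration, not the actual code at hammer2_inode.c:1775):

#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>

/* userland stand-in for the kernel's panic() */
static void
panic(const char *fmt, ...)
{
	va_list ap;

	fputs("panic: ", stderr);
	va_start(ap, fmt);
	vfprintf(stderr, fmt, ap);
	va_end(ap);
	fputc('\n', stderr);
	abort();
}

/* mirrors the kernel macro: panic with the stringified expression */
#define KKASSERT(exp)						\
	do {							\
		if (!(exp))					\
			panic("assertion \"%s\" failed in %s at %s:%d", \
			      #exp, __func__, __FILE__, __LINE__); \
	} while (0)

int
main(void)
{
	int error = 1;		/* pretend the chain sync returned an error */

	if (error)
		KKASSERT(0);	/* -> panic: assertion "0" failed in main at ... */
	return 0;
}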
This machine is mostly idle, except that once an hour (at ten minutes to the hour) a dovecot backup is synced to it, and on the hour a snapshot is taken of the PFS holding the dovecot backup.
It came back up with no problems and seems to work okay, but now hundreds of errors like the following are appearing:
Jul 17 15:43:14 archivist kernel: CHILD ERROR DURING FLUSH LOCK 0xfffff8009abb9420->0xfffff8009c118ca0
Jul 17 15:43:14 archivist kernel: chain 00000000441da40a.01 key=0000000000008639 meth=30 CHECK FAIL (flags=0114c002, bref/data 80bb95a42ef6c177/4762378c10af65e7)
Jul 17 15:43:14 archivist kernel: CHILD ERROR DURING FLUSH LOCK 0xfffff8009abb9420->0xfffff8009c11b0a0
Jul 17 15:43:14 archivist kernel: chain 00000000441da80a.01 key=000000000000863f meth=30 CHECK FAIL (flags=0114c002, bref/data 6c75050e70b2327a/1e4bf80cd16e453b)
Jul 17 15:43:14 archivist kernel: CHILD ERROR DURING FLUSH LOCK 0xfffff8009abb9420->0xfffff8009c118ca0
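
As far as I can tell, a CHECK FAIL means the check code stored in the blockref no longer matches a hash recomputed from the data, and the flush then propagates the child's error upward (the CHILD ERROR DURING FLUSH LOCK lines). If I decode the meth field correctly, meth=30 puts check method 3 (xxhash64) in the high nibble and no compression in the low nibble. A rough sketch of that comparison, using the bref/data values from the chain in the original panic (the struct and function names here are stand-ins; the real test lives in hammer2_chain_testcheck() in hammer2_chain.c):

#include <stdint.h>
#include <stdio.h>

/* stand-in for the on-media blockref fields involved in the check */
struct bref_sketch {
	uint8_t  methods;	/* check method (high nibble) | comp (low) */
	uint64_t check;		/* hash of the data, stored at write time */
};

/*
 * Recompute the hash of the live data and compare it against the
 * check code recorded in the blockref; a mismatch is a CHECK FAIL.
 */
static int
testcheck_sketch(const struct bref_sketch *bref, uint64_t data_hash)
{
	int check_algo = (bref->methods >> 4) & 15;	/* meth=30 -> 3 */

	if (check_algo == 3 && data_hash != bref->check) {
		printf("CHECK FAIL (bref/data %016jx/%016jx)\n",
		       (uintmax_t)bref->check, (uintmax_t)data_hash);
		return 0;	/* check failed */
	}
	return 1;		/* check passed (or not applicable) */
}

int
main(void)
{
	struct bref_sketch bref = { 0x30, 0x830e0b4c1fbfa512ULL };

	/* feed it a recomputed hash that no longer matches */
	testcheck_sketch(&bref, 0x9f5e0aac2b2788b2ULL);
	return 0;
}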
This is on a Vultr VM. The disk is small, but there is plenty of space available:
Filesystem              Size   Used  Avail Capacity  Mounted on
/dev/mapper/root@DATA  21.6G  4469M  17.2G     20%   /
/dev                   1024B  1024B     0B    100%   /dev
/dev/vbd0s1a           1022M   467M   473M     50%   /boot
/build/usr.obj         21.6G  4469M  17.2G     20%   /usr/obj
/build/var.crash       21.6G  4469M  17.2G     20%   /var/crash
/build/var.cache       21.6G  4469M  17.2G     20%   /var/cache
/build/var.spool       21.6G  4469M  17.2G     20%   /var/spool
/build/var.log         21.6G  4469M  17.2G     20%   /var/log
/build/var.tmp         21.6G  4469M  17.2G     20%   /var/tmp
tmpfs                   233M  12.0K   233M      0%   /tmp
procfs                 4096B  4096B     0B    100%   /proc
@var.vmail             21.6G  4469M  17.2G     20%   /var/vmail