syzbot

INFO: task hung in nfsd_umount

Status: upstream: reported on 2024/07/07 04:37
Subsystems: nfs
Reported-by: syzbot+b568ba42c85a332a88ee@syzkaller.appspotmail.com
First crash: 653d, last: 5h36m
Discussions (3):
Title                                           Replies (incl. bot)  Last reply
[syzbot] Monthly nfs report (Jul 2025)          0 (1)                2025/07/04 12:38
[syzbot] Monthly nfs report (Jun 2025)          0 (1)                2025/06/03 09:38
[syzbot] [nfs?] INFO: task hung in nfsd_umount  3 (4)                2024/09/21 07:58

Sample crash report:
INFO: task syz-executor:17205 blocked for more than 143 seconds.
      Tainted: G             L      syzkaller #0
"echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
task:syz-executor    state:D stack:24400 pid:17205 tgid:17205 ppid:1      task_flags:0x400140 flags:0x00080002
Call Trace:
 <TASK>
 context_switch kernel/sched/core.c:5295 [inline]
 __schedule+0xfee/0x6120 kernel/sched/core.c:6908
 __schedule_loop kernel/sched/core.c:6990 [inline]
 schedule+0xdd/0x390 kernel/sched/core.c:7005
 schedule_preempt_disabled+0x13/0x30 kernel/sched/core.c:7062
 __mutex_lock_common kernel/locking/mutex.c:692 [inline]
 __mutex_lock+0xc9a/0x1b90 kernel/locking/mutex.c:776
 nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
 nfsd_umount+0x3b/0x60 fs/nfsd/nfsctl.c:1354
 deactivate_locked_super+0xc1/0x1b0 fs/super.c:476
 deactivate_super fs/super.c:509 [inline]
 deactivate_super+0xe7/0x110 fs/super.c:505
 cleanup_mnt+0x21f/0x450 fs/namespace.c:1312
 task_work_run+0x150/0x240 kernel/task_work.c:233
 resume_user_mode_work include/linux/resume_user_mode.h:50 [inline]
 __exit_to_user_mode_loop kernel/entry/common.c:67 [inline]
 exit_to_user_mode_loop+0x100/0x4a0 kernel/entry/common.c:98
 __exit_to_user_mode_prepare include/linux/irq-entry-common.h:226 [inline]
 syscall_exit_to_user_mode_prepare include/linux/irq-entry-common.h:256 [inline]
 syscall_exit_to_user_mode include/linux/entry-common.h:325 [inline]
 do_syscall_64+0x668/0xf80 arch/x86/entry/syscall_64.c:100
 entry_SYSCALL_64_after_hwframe+0x77/0x7f
RIP: 0033:0x7fd3fe39d9d7
RSP: 002b:00007ffc53986768 EFLAGS: 00000246 ORIG_RAX: 00000000000000a6
RAX: 0000000000000000 RBX: 00007fd3fe431f90 RCX: 00007fd3fe39d9d7
RDX: 0000000000000004 RSI: 0000000000000009 RDI: 00007ffc539878b0
RBP: 00007ffc5398789c R08: 0000000000000000 R09: 0000000000000000
R10: 0000000000000000 R11: 0000000000000246 R12: 00007ffc539878b0
R13: 00007fd3fe431f90 R14: 00000000000f9434 R15: 00007ffc539878f0
 </TASK>
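The trace above shows the unmount path parked in __mutex_lock while nfsd_umount -> nfsd_shutdown_threads waits for nfsd_mutex. When working with many reports like this one, it can help to reduce a trace to its function names and source locations. A minimal, illustrative Python sketch (not part of syzbot's tooling; the fragment and regex are assumptions based on the kernel's frame format `func+0xOFF/0xSIZE path:line`):

```python
import re

# A fragment of the hung task's call trace, as printed by the kernel.
TRACE = """\
 __mutex_lock+0xc9a/0x1b90 kernel/locking/mutex.c:776
 nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
 nfsd_umount+0x3b/0x60 fs/nfsd/nfsctl.c:1354
 deactivate_locked_super+0xc1/0x1b0 fs/super.c:476
"""

# Each frame looks like "func+0xOFF/0xSIZE path:line"; capture func and path:line.
FRAME_RE = re.compile(r"^\s*(?P<func>[\w.]+)\+0x[0-9a-f]+/0x[0-9a-f]+\s+(?P<loc>\S+)")

def parse_frames(trace: str):
    """Return (function, file:line) pairs for every frame line that matches."""
    frames = []
    for line in trace.splitlines():
        m = FRAME_RE.match(line)
        if m:
            frames.append((m.group("func"), m.group("loc")))
    return frames

for func, loc in parse_frames(TRACE):
    print(f"{func:30s} {loc}")
```

Lines marked `[inline]` and register dumps deliberately fall through the regex, which keeps the output to the call chain itself.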

Showing all locks held in the system:
1 lock held by khungtaskd/31:
 #0: ffffffff8e7e9220 (rcu_read_lock){....}-{1:3}, at: rcu_lock_acquire include/linux/rcupdate.h:312 [inline]
 #0: ffffffff8e7e9220 (rcu_read_lock){....}-{1:3}, at: rcu_read_lock include/linux/rcupdate.h:850 [inline]
 #0: ffffffff8e7e9220 (rcu_read_lock){....}-{1:3}, at: debug_show_all_locks+0x3d/0x184 kernel/locking/lockdep.c:6775
3 locks held by kworker/0:5/5919:
 #0: ffff88813fe61148 ((wq_completion)rcu_gp){+.+.}-{0:0}, at: process_one_work+0x1287/0x1920 kernel/workqueue.c:3250
 #1: ffffc90004ac7d08 ((work_completion)(&(&ssp->srcu_sup->work)->work)){+.+.}-{0:0}, at: process_one_work+0x93c/0x1920 kernel/workqueue.c:3251
 #2: ffffffff8e874e78 (&ssp->srcu_sup->srcu_gp_mutex){+.+.}-{4:4}, at: srcu_advance_state kernel/rcu/srcutree.c:1837 [inline]
 #2: ffffffff8e874e78 (&ssp->srcu_sup->srcu_gp_mutex){+.+.}-{4:4}, at: process_srcu+0x77/0x1ab0 kernel/rcu/srcutree.c:1997
3 locks held by kworker/0:1/10634:
 #0: ffff88813fe63548 ((wq_completion)events){+.+.}-{0:0}, at: process_one_work+0x1287/0x1920 kernel/workqueue.c:3250
 #1: ffffc90005d6fd08 (free_ipc_work){+.+.}-{0:0}, at: process_one_work+0x93c/0x1920 kernel/workqueue.c:3251
 #2: ffffffff8e7f4e38 (rcu_state.exp_mutex){+.+.}-{4:4}, at: exp_funnel_lock+0x19e/0x3c0 kernel/rcu/tree_exp.h:343
4 locks held by kworker/u10:3/15226:
 #0: ffff88801c6ae948 ((wq_completion)netns){+.+.}-{0:0}, at: process_one_work+0x1287/0x1920 kernel/workqueue.c:3250
 #1: ffffc90003b97d08 (net_cleanup_work){+.+.}-{0:0}, at: process_one_work+0x93c/0x1920 kernel/workqueue.c:3251
 #2: ffffffff905faff0 (pernet_ops_rwsem){++++}-{4:4}, at: cleanup_net+0xb8/0x920 net/core/net_namespace.c:675
 #3: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: ops_exit_rtnl_list net/core/net_namespace.c:173 [inline]
 #3: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: ops_undo_list+0x7ec/0xab0 net/core/net_namespace.c:248
2 locks held by getty/16204:
 #0: ffff888037d500a0 (&tty->ldisc_sem){++++}-{0:0}, at: tty_ldisc_ref_wait+0x24/0x80 drivers/tty/tty_ldisc.c:243
 #1: ffffc9000394b2f0 (&ldata->atomic_read_lock){+.+.}-{4:4}, at: n_tty_read+0x419/0x1500 drivers/tty/n_tty.c:2211
2 locks held by syz.2.2294/17016:
 #0: ffffffff906bfab0 (cb_lock){++++}-{4:4}, at: genl_rcv+0x19/0x40 net/netlink/genetlink.c:1217
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_nl_threads_set_doit+0x6c1/0xc00 fs/nfsd/nfsctl.c:1597
2 locks held by syz-executor/17205:
 #0: ffff8880565a60e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock fs/super.c:58 [inline]
 #0: ffff8880565a60e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock_excl fs/super.c:73 [inline]
 #0: ffff8880565a60e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super fs/super.c:508 [inline]
 #0: ffff8880565a60e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super+0xdf/0x110 fs/super.c:505
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
2 locks held by syz-executor/17357:
 #0: ffff888020b1e0e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock fs/super.c:58 [inline]
 #0: ffff888020b1e0e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock_excl fs/super.c:73 [inline]
 #0: ffff888020b1e0e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super fs/super.c:508 [inline]
 #0: ffff888020b1e0e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super+0xdf/0x110 fs/super.c:505
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
2 locks held by syz-executor/18932:
 #0: ffff888036c620e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock fs/super.c:58 [inline]
 #0: ffff888036c620e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock_excl fs/super.c:73 [inline]
 #0: ffff888036c620e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super fs/super.c:508 [inline]
 #0: ffff888036c620e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super+0xdf/0x110 fs/super.c:505
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
2 locks held by syz-executor/20073:
 #0: ffff888079f040e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock fs/super.c:58 [inline]
 #0: ffff888079f040e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock_excl fs/super.c:73 [inline]
 #0: ffff888079f040e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super fs/super.c:508 [inline]
 #0: ffff888079f040e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super+0xdf/0x110 fs/super.c:505
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
2 locks held by syz.5.3152/20640:
 #0: ffffffff906bfab0 (cb_lock){++++}-{4:4}, at: genl_rcv+0x19/0x40 net/netlink/genetlink.c:1217
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_nl_version_set_doit+0xc4/0x7a0 fs/nfsd/nfsctl.c:1743
2 locks held by syz-executor/20703:
 #0: ffff88802d1b00e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock fs/super.c:58 [inline]
 #0: ffff88802d1b00e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: __super_lock_excl fs/super.c:73 [inline]
 #0: ffff88802d1b00e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super fs/super.c:508 [inline]
 #0: ffff88802d1b00e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super+0xdf/0x110 fs/super.c:505
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
1 lock held by syz.9.3197/20891:
 #0: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: tun_detach drivers/net/tun.c:634 [inline]
 #0: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: tun_chr_close+0x38/0x220 drivers/net/tun.c:3436
1 lock held by syz-executor/20944:
 #0: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: tun_detach drivers/net/tun.c:634 [inline]
 #0: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: tun_chr_close+0x38/0x220 drivers/net/tun.c:3436
2 locks held by syz.2.3257/21316:
 #0: ffffffff905faff0 (pernet_ops_rwsem){++++}-{4:4}, at: copy_net_ns+0x451/0x7c0 net/core/net_namespace.c:577
 #1: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: ip_tunnel_init_net+0x21e/0x780 net/ipv4/ip_tunnel.c:1146
2 locks held by syz.0.3258/21327:
 #0: ffffffff905faff0 (pernet_ops_rwsem){++++}-{4:4}, at: copy_net_ns+0x451/0x7c0 net/core/net_namespace.c:577
 #1: ffffffff90613928 (rtnl_mutex){+.+.}-{4:4}, at: register_nexthop_notifier+0x1b/0x70 net/ipv4/nexthop.c:3968

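The lockdep dump above shows nfsd_mutex (ffffffff8ec591e8) referenced by several tasks at once: one netlink handler holds it while four unmounting tasks queue behind it. A small, hypothetical helper for tallying which tasks appear against each named lock in such a dump (the two-entry sample and regexes below are assumptions modeled on the `#N: addr (name){...}` line format):

```python
import re
from collections import defaultdict

# Two representative entries from a "locks held" dump like the one above.
DUMP = """\
2 locks held by syz.2.2294/17016:
 #0: ffffffff906bfab0 (cb_lock){++++}-{4:4}, at: genl_rcv+0x19/0x40 net/netlink/genetlink.c:1217
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_nl_threads_set_doit+0x6c1/0xc00 fs/nfsd/nfsctl.c:1597
2 locks held by syz-executor/17205:
 #0: ffff8880565a60e0 (&type->s_umount_key#55){+.+.}-{4:4}, at: deactivate_super+0xdf/0x110 fs/super.c:505
 #1: ffffffff8ec591e8 (nfsd_mutex){+.+.}-{4:4}, at: nfsd_shutdown_threads+0x5b/0xf0 fs/nfsd/nfssvc.c:576
"""

TASK_RE = re.compile(r"locks held by (\S+):")
LOCK_RE = re.compile(r"#\d+: [0-9a-f]+ \(([^)]+)\)")

def locks_by_name(dump: str):
    """Map each lock name to the set of comm/pid strings that list it."""
    holders = defaultdict(set)
    task = None
    for line in dump.splitlines():
        t = TASK_RE.search(line)
        if t:
            task = t.group(1)
            continue
        m = LOCK_RE.search(line)
        if m and task:
            holders[m.group(1)].add(task)
    return holders

print(sorted(locks_by_name(DUMP)["nfsd_mutex"]))
```

Run over the full dump, this kind of tally makes the contention pattern obvious: every task stuck in nfsd_shutdown_threads lists nfsd_mutex, while the holder is in nfsd_nl_threads_set_doit.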
=============================================

NMI backtrace for cpu 0
CPU: 0 UID: 0 PID: 31 Comm: khungtaskd Tainted: G             L      syzkaller #0 PREEMPT(full) 
Tainted: [L]=SOFTLOCKUP
Hardware name: Google Google Compute Engine/Google Compute Engine, BIOS Google 02/12/2026
Call Trace:
 <TASK>
 __dump_stack lib/dump_stack.c:94 [inline]
 dump_stack_lvl+0x100/0x190 lib/dump_stack.c:120
 nmi_cpu_backtrace.cold+0x12d/0x151 lib/nmi_backtrace.c:113
 nmi_trigger_cpumask_backtrace+0x1d7/0x230 lib/nmi_backtrace.c:62
 trigger_all_cpu_backtrace include/linux/nmi.h:161 [inline]
 __sys_info lib/sys_info.c:157 [inline]
 sys_info+0x141/0x190 lib/sys_info.c:165
 check_hung_uninterruptible_tasks kernel/hung_task.c:346 [inline]
 watchdog+0xd25/0x1050 kernel/hung_task.c:515
 kthread+0x370/0x450 kernel/kthread.c:467
 ret_from_fork+0x754/0xd80 arch/x86/kernel/process.c:158
 ret_from_fork_asm+0x1a/0x30 arch/x86/entry/entry_64.S:245
 </TASK>

Crashes (3737):
Time Kernel Commit Syzkaller Config Log Report Syz repro C repro VM info Assets Manager Title
2026/03/03 19:43 upstream af4e9ef3d784 4180d919 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/03 11:51 upstream af4e9ef3d784 28b83e23 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/02 19:03 upstream 11439c4635ed b9dd6534 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/02 13:22 upstream 11439c4635ed b9dd6534 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-smack-root INFO: task hung in nfsd_umount
2026/03/02 09:46 upstream 39c633261414 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/02 01:22 upstream 11439c4635ed 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-smack-root INFO: task hung in nfsd_umount
2026/03/01 19:17 upstream eb71ab2bf722 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/01 13:29 upstream eb71ab2bf722 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/01 06:08 upstream 2f9339c052bd 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/01 03:19 upstream 2f9339c052bd 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/01 02:15 upstream 2f9339c052bd 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/03/01 02:05 upstream 2f9339c052bd 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/28 17:49 upstream 4d349ee5c778 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/28 11:46 upstream 4d349ee5c778 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/28 09:56 upstream 4d349ee5c778 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/28 06:41 upstream 4d349ee5c778 43249bac .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/28 01:43 upstream a75cb869a8cc 2cf092b8 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/28 00:01 upstream a75cb869a8cc 2cf092b8 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/27 13:54 upstream a75cb869a8cc a2f13f71 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/27 09:05 upstream 3f4a08e64442 a2f13f71 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/27 06:36 upstream 3f4a08e64442 a2f13f71 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/27 04:53 upstream 3f4a08e64442 a2f13f71 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/26 23:28 upstream 3f4a08e64442 a2f13f71 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/26 17:13 upstream f4d0ec0aa20d ffa54287 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/26 04:22 upstream d9d32e5bd5a4 e0f78d93 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/25 16:52 upstream 7dff99b35460 c162cde9 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/25 14:24 upstream 7dff99b35460 c162cde9 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/25 08:21 upstream 7dff99b35460 787dfb7c .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/25 07:16 upstream 7dff99b35460 787dfb7c .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/24 19:11 upstream 7dff99b35460 96b1aa46 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/24 14:58 upstream 7dff99b35460 96b1aa46 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/24 13:38 upstream 7dff99b35460 96b1aa46 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 16:09 upstream 6de23f81a5e0 7c9658af .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 14:07 upstream 6de23f81a5e0 7c9658af .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 07:53 upstream 189f164e573e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 06:30 upstream 189f164e573e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 04:35 upstream 189f164e573e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 02:32 upstream 189f164e573e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/23 00:30 upstream 189f164e573e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 14:04 upstream 32a92f8c8932 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 12:09 upstream 32a92f8c8932 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 10:40 upstream 32a92f8c8932 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 06:31 upstream 32a92f8c8932 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 05:02 upstream d79526b89571 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 03:28 upstream d79526b89571 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/22 00:30 upstream d79526b89571 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/21 21:38 upstream d79526b89571 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/21 15:21 upstream a95f71ad3e2e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/21 13:29 upstream a95f71ad3e2e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/21 11:59 upstream a95f71ad3e2e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/21 09:54 upstream a95f71ad3e2e 6e7b5511 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/21 02:16 upstream a95f71ad3e2e 741f5161 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-smack-root INFO: task hung in nfsd_umount
2026/02/21 00:49 upstream a95f71ad3e2e 741f5161 .config console log report info [disk image] [vmlinux] [kernel image] ci-qemu-gce-upstream-auto INFO: task hung in nfsd_umount
2026/02/17 04:37 upstream 0f2acd3148e0 5d52cba5 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-selinux-root INFO: task hung in nfsd_umount
2026/01/06 15:39 upstream 7f98ab9da046 d6526ea3 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-root INFO: task hung in nfsd_umount
2024/07/06 12:12 upstream 1dd28064d416 bc4ebbb5 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-smack-root INFO: task hung in nfsd_umount
2024/07/03 04:33 upstream e9d22f7a6655 1ecfa2d8 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-smack-root INFO: task hung in nfsd_umount
2024/06/29 05:25 upstream 6c0483dbfe72 757f06b1 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-kasan-gce-smack-root INFO: task hung in nfsd_umount
2026/02/16 00:24 linux-next 635c467cc14e 1e62d198 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-rust-kasan-gce INFO: task hung in nfsd_umount
2026/02/13 17:07 linux-next af98e93c5c39 1e62d198 .config console log report info [disk image] [vmlinux] [kernel image] ci-upstream-linux-next-kasan-gce-root INFO: task hung in nfsd_umount