Hi Mark,
sorry for the long delay (SW issues again ;)
The process is still running/idle, and within the last week it has grown to ~10 GB RAM (RSS, 15 GB virtual), so it's leaking memory like hell:
$ top -p 2850319
  PID USER     PR NI   VIRT   RES   SHR S %CPU %MEM    TIME+ COMMAND
2850319 koh8rt   20  0  15.6g  9.8g 73808 S 18.8 31.4  1536:47 Logic
$ ps up 2850319
USER        PID %CPU %MEM   VSZ  RSS TTY     STAT START  TIME COMMAND
koh8rt  2850319 13.0 31.4 16309704 10246352 pts/4 Sl May18 1537:06 /tmp/.mount_Logic2BkMyWd/Logic --type=renderer --field-trial-handle=2499542314422784525,94
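In case it helps to track the growth rate over time, here is a small sketch that reads VmRSS straight from /proc (the helper name and one-shot form are mine, not something I actually ran):

```shell
# Hypothetical helper: print a process's resident-set size in kB from /proc.
# Calling it periodically (cron job or loop) would log how fast the leak grows.
rss_kb() { awk '/^VmRSS:/{print $2}' "/proc/$1/status"; }
rss_kb "$$"   # example: this shell itself; use 2850319 for the Logic renderer
```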
This is the top output in thread view (press "H"):
   PID USER     PR NI   VIRT   RES   SHR S %CPU %MEM    TIME+ COMMAND
2850327 koh8rt   20  0  15.6g  9.8g 73956 R  6.7 31.4 600:43.47 Compositor
2850319 koh8rt   20  0  15.6g  9.8g 73956 R  5.4 31.4 499:19.02 Logic
2850324 koh8rt   20  0  15.6g  9.8g 73956 S  0.8 31.4 70:32.17 Chrome_ChildIOT
2850371 koh8rt   20  0  15.6g  9.8g 73956 S  0.8 31.4 73:06.84 DedicatedWorker
2850356 koh8rt   20  0  15.6g  9.8g 73956 S  0.7 31.4 26:28.88 TaskWorker
2850359 koh8rt   20  0  15.6g  9.8g 73956 S  0.7 31.4 26:28.54 TaskWorker
2850360 koh8rt   20  0  15.6g  9.8g 73956 S  0.7 31.4 26:29.35 TaskWorker
2850362 koh8rt   20  0  15.6g  9.8g 73956 S  0.7 31.4 26:29.82 TaskWorker
2850355 koh8rt   20  0  15.6g  9.8g 73956 S  0.6 31.4 26:26.13 TaskWorker
2850357 koh8rt   20  0  15.6g  9.8g 73956 S  0.6 31.4 26:20.72 TaskWorker
2850358 koh8rt   20  0  15.6g  9.8g 73956 S  0.6 31.4 26:29.91 TaskWorker
2850361 koh8rt   20  0  15.6g  9.8g 73956 S  0.6 31.4 26:32.17 TaskWorker
2850330 koh8rt   20  0  15.6g  9.8g 73956 S  0.2 31.4 11:47.39 CompositorTileW
2850331 koh8rt   20  0  15.6g  9.8g 73956 S  0.2 31.4 11:48.97 CompositorTileW
2853316 koh8rt   20  0  15.6g  9.8g 73956 S  0.2 31.4 13:55.50 SubscriptionMan
3326243 koh8rt   20  0  15.6g  9.8g 73956 S  0.2 31.4  0:04.40 SubscriptionMan
2850329 koh8rt   20  0  15.6g  9.8g 73956 S  0.1 31.4 11:47.02 CompositorTileW
2853363 koh8rt   20  0  15.6g  9.8g 73956 S  0.1 31.4  8:02.98 StreamTerminal:
So I tried to get more info from PIDs 2850327 and 2850319 (gdb backtraces):
Thread 5 (Thread 0x7f1a0ff14700 (LWP 2850327)):
#0  futex_abstimed_wait_cancelable (private=, abstime=0x7f1a0ff135c0, clockid=, expected=0, futex_word=0x7f1a0ff136b8) at …/sysdeps/nptl/futex-internal.h:320
#1  __pthread_cond_wait_common (abstime=0x7f1a0ff135c0, clockid=, mutex=0x7f1a0ff13668, cond=0x7f1a0ff13690) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0x7f1a0ff13690, mutex=0x7f1a0ff13668, abstime=0x7f1a0ff135c0) at pthread_cond_wait.c:656
#3 0x000055c38ea34ce0 in ()
#4 0x00000000000d9e42 in ()
#5 0x00000000390d1526 in ()
#6 0x00000000000d9e42 in ()
#7 0x0000000039fd58f6 in ()
#8 0x0000000000000000 in ()
Thread 1 (Thread 0x7f1a140dd200 (LWP 2850319)):
#0  futex_abstimed_wait_cancelable (private=, abstime=0x7ffee8946e50, clockid=, expected=0, futex_word=0x7ffee8946f48) at …/sysdeps/nptl/futex-internal.h:320
#1  __pthread_cond_wait_common (abstime=0x7ffee8946e50, clockid=, mutex=0x7ffee8946ef8, cond=0x7ffee8946f20) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0x7ffee8946f20, mutex=0x7ffee8946ef8, abstime=0x7ffee8946e50) at pthread_cond_wait.c:656
#3 0x000055c38ea34ce0 in ()
#4 0x00000000000d9e42 in ()
#5 0x00000000390cd601 in ()
#6 0x00000000000d9e44 in ()
#7 0x000000001ec23309 in ()
#8 0x0000000000000000 in ()
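For reference, backtraces like the two above can be grabbed non-interactively; this is only a sketch of one way to do it (it assumes gdb is installed and ptrace is permitted on the target PID), not the exact command I used:

```shell
# Attach briefly, dump every thread's backtrace, detach again.
# 2850319 is the leaking Logic renderer PID from the top output above.
gdb -p 2850319 -batch -ex 'set pagination off' -ex 'thread apply all bt' > Logic2-bt.txt 2>&1
```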
Generating core files for the two main threads with gdb worked, but they are huge due to the leaked memory:
-rw-rw-r-- 1 koh8rt koh8rt 16G May 26 16:10 Logic2.core.2850319.Logic
-rw-rw-r-- 1 koh8rt koh8rt 1.7G May 26 16:14 Logic2.core.2850319.Logic.gz
-rw-rw-r-- 1 koh8rt koh8rt 1.6G May 26 16:25 Logic2.core.2850319.Logic.xz
-rw-rw-r-- 1 koh8rt koh8rt 16557415936 May 26 16:17 Logic2.core.2850327.Compositor
-rw-rw-r-- 1 koh8rt koh8rt 1764252181 May 26 16:24 Logic2.core.2850327.Compositor.gz
-rw-rw-r-- 1 koh8rt koh8rt 1676216586 May 26 16:21 Logic2.core.2850327.Compositor.xz
And even though they compress nicely, I'm not sure how to transfer them to you…
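If it helps, one option (just a sketch; the chunk size and part names are arbitrary choices of mine) would be to split the compressed core into upload-sized pieces with checksums:

```shell
# Split the ~1.6 GB .xz into 500 MB chunks and record checksums, so the
# pieces can be verified and reassembled on the receiving end with plain cat.
split -b 500M Logic2.core.2850319.Logic.xz Logic2.core.xz.part.
sha256sum Logic2.core.2850319.Logic.xz Logic2.core.xz.part.* > Logic2.core.sha256
# receiving end: cat Logic2.core.xz.part.* > Logic2.core.2850319.Logic.xz
#                sha256sum -c Logic2.core.sha256
```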
Finally, I tried Ctrl+Alt+Shift+C and got the secret crash menu. But when I clicked "saleaeAssert", the window and process just vanished; no output file, and no gdb process trying to dump data…
I've attached strace logs (10 seconds per thread); maybe these can also give some hints about what it's looping on, and why.
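Roughly how such a trace can be captured, in case you want me to redo it with different options (a sketch, not necessarily the exact invocation I used): -f follows all threads, -tt and -T add wall-clock timestamps and per-syscall durations.

```shell
# Trace the renderer and all of its threads for 10 seconds, then stop.
timeout 10 strace -f -tt -T -p 2850319 -o strace-10s.log
```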
Now I'm restarting with 2.3.37 for some not-so-short/small analog measurements.
I'll keep Logic 2 running and keep an eye on both CPU and memory over the next few days ;)
Harald
(Attachment strace-10s.7z is missing)