Linux Multi User Support

Hello everyone,

Due to the Corona situation, we’re moving our FPGA lab course for students to a remote teaching model.

Previously, we had a setup with one FPGA and one Logic 8 at each student’s Linux PC. Now we have moved everything to our server room and connected 8 FPGA boards, 8 Logic 8s, and a few other devices to a Linux server with remote desktop access. We use udev rules to assign the devices to different groups on the Linux machine. This mostly works fine for the FPGAs and logic analyzers: when opening the Logic software, each user sees exactly the one device assigned to them and can use it without problems (bandwidth is a little limited when you attach that many devices to one USB 2.0 controller, but that’s expected).
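For reference, the rules we set up are roughly along these lines. The vendor ID, serial number, group name, and file name below are only placeholders for illustration; the real values need to be checked with `lsusb -v` for each device:

```
# Write an example per-device rule and reload udev (all IDs and names are placeholders)
cat <<'EOF' | sudo tee /etc/udev/rules.d/99-fpga-lab-logic.rules
# Hand the Logic 8 with this serial number to the group for lab seat 1
SUBSYSTEM=="usb", ATTR{idVendor}=="21a9", ATTR{serial}=="0123456789ABCDEF", GROUP="labseat01", MODE="0660"
EOF
sudo udevadm control --reload-rules
sudo udevadm trigger
```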

But we’re running into this issue: https://support.saleae.com/troubleshooting/linux-shared-memory-crash

As a workaround, we tried deleting the files in /tmp after starting the software for one user and before starting it for another user. This sometimes works, but it seems to depend on which user starts the software first, and sometimes the second instance still crashes. I don’t know whether we’ll ever manage to run 8 instances of the software in parallel or whether we’d have to time-slice usage of the logic analyzers. Tested on both 1.2.18 and 1.2.29.
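In case the exact sequence matters, what we tried looks roughly like this; the install path and the /tmp file pattern are placeholders, not the actual file names the Logic software creates:

```
# First user starts their instance and lets it settle
sudo -u student01 /path/to/Logic &
sleep 15
# Remove the shared-memory leftovers in /tmp (placeholder pattern; adjust to
# the files the Logic software actually creates)
sudo rm -f /tmp/*Logic*
# Then the next user starts theirs -- this is the instance that sometimes still crashes
sudo -u student02 /path/to/Logic &
```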

We wanted to use the 2.0 alpha, which supposedly has this bug fixed, but it does not run on the CentOS 7 image we use, as it’s compiled against a slightly newer glibc (CentOS 7 has glibc 2.17). Upgrading the Linux system for this semester isn’t possible, as other software we need only supports CentOS 7. Any idea how to work around this, or is there any chance of getting a 2.0 alpha build compiled against glibc 2.17?
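For completeness, this is roughly how we checked the glibc mismatch; the binary path is just an example:

```
# glibc shipped with the system (CentOS 7: 2.17)
ldd --version | head -n1
# glibc symbol versions required by the binary; anything above GLIBC_2.17
# will not resolve on CentOS 7
objdump -T ./Logic | grep -o 'GLIBC_[0-9.]*' | sort -Vu
```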

On a related note, we’d like to install the software on a read-only NFS share, as we do for all other software. Previously the software simply crashed when run from read-only storage, so we always had to copy it to /tmp before starting it. Has this been fixed in the meantime?

Best regards
Johannes

Hi @johannes, we’re still looking into this issue. This is fairly complex. Let’s continue our discussion via email as we may want to grab some information from you.