Logic 2.3.16

Thanks for letting us know!
Does it happen every time?

Yes. Every time I open a capture it starts using the HDD at 100%.

I tried 2.3.15 and it happens again. Once I load a capture, it halts my computer: the mouse does not move, and I cannot even kill Logic 2 … it is at the point where it will damage my computer …

And which version should I use to avoid this problem? I want to load a capture and compare it with an old one. It is currently impossible.

Actually the problem is when loading a specific file (created with 2.3.16):

or when a recording is completed with these channels.

OK … your software is unusable … I had to hard-reset my computer 5 times … UNUSABLE!!!

After further analysis, it seems the problem might be in:

the “Simple Parallel” analyzer, which tries to build the bytes out of all rising edges; what happens is that it overloads the HDD in just 5 seconds.
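For context, the core idea of a parallel-bus decoder can be sketched in a few lines of Python. This is purely illustrative and is not Saleae's implementation: on every rising edge of the clock channel, sample the data channels and pack the sampled bits into one value.

```python
# Hypothetical sketch of what a "Simple Parallel" decoder does:
# on each rising clock edge, sample the data lines and pack the
# bits into a word. Not Saleae's code; just the general idea.

def decode_parallel(clock, data_lines):
    """clock: list of 0/1 samples; data_lines: list of per-channel
    sample lists (index 0 = least significant bit)."""
    values = []
    for i in range(1, len(clock)):
        if clock[i - 1] == 0 and clock[i] == 1:  # rising edge
            word = 0
            for bit, line in enumerate(data_lines):
                word |= (line[i] & 1) << bit
            values.append(word)
    return values

# Two rising edges -> two decoded values.
clock = [0, 1, 0, 1]
d0    = [0, 1, 0, 0]
d1    = [0, 1, 0, 1]
print(decode_parallel(clock, [d0, d1]))  # [3, 2]
```

One result is produced per rising edge, which is why a long capture with a fast clock can yield millions of results.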

Tested with 2.3.15. Memory usage goes to 5 GB? This is not normal.

I’m really sorry about that. We’ll look into this ASAP.
It might be a bug in the analyzer or in the indexing service.

From a first impression, it looks like you have 19 million results, so 5 GB might not be that far-fetched; however, we’re still looking into this.
Have you noticed by any chance if the progress bar was updating? What OS are you using?

Also, if this is a .NET application, you might want to limit the .NET parallel processing in the Simple Parallel analyzer: https://stackoverflow.com/questions/9290498/how-can-i-limit-parallel-foreach

Parallel processing is good for utilizing the CPU at 100%, but in my case (a somewhat older computer) it freezes the machine.
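The linked StackOverflow thread is about .NET's `MaxDegreeOfParallelism`; the same idea can be sketched in Python (illustrative only, since Logic 2 is not a Python application): cap the worker count below the core count so at least one core stays free for the OS and UI.

```python
# Sketch of capping parallelism so the machine stays responsive.
# Hypothetical example; not Logic 2's actual code.
import os
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    return sum(chunk)  # stand-in for per-chunk analyzer work

chunks = [list(range(i, i + 100)) for i in range(0, 1000, 100)]

# Leave at least one core unused instead of saturating all of them.
workers = max(1, (os.cpu_count() or 2) - 1)
with ThreadPoolExecutor(max_workers=workers) as pool:
    totals = list(pool.map(crunch, chunks))

print(sum(totals))  # 499500 == sum(range(1000))
```

The result is identical to a fully parallel run; only the peak CPU load changes.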

Yes, there are too many values to build in Simple Parallel. Actually, I do not need these millions of values; in reality I need only the first 10 to verify that my card is working.

So you could slow down the processing priority or set a limit on the number of values processed. The default could be the first 500 values, for example.
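The "only process the first N results" suggestion can be sketched with a lazy generator in Python (hypothetical illustration, not Logic 2's API): decoding stops as soon as N values have been produced, so the millions of later edges are never touched at all.

```python
# Sketch of a user-configurable result limit via lazy evaluation.
# Illustrative only; the names and LIMIT default are made up.
from itertools import islice

def decoded_values():
    # Stand-in for an analyzer yielding one result per clock edge;
    # imagine this stream being millions of items long.
    n = 0
    while True:
        yield n
        n += 1

LIMIT = 500  # e.g. a configurable default, as suggested above
first = list(islice(decoded_values(), LIMIT))
print(len(first), first[:3])  # 500 [0, 1, 2]
```

Because `islice` pulls from the generator lazily, memory and disk cost scale with `LIMIT`, not with the capture length.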

Still I am not sure this is the only problem here.

I’m glad that it’s working for you (partially, at least :slight_smile: ). One suggestion that might help is using the timer option (or the trim after capture option). This way, you can capture only 0.1 seconds of data for example.

I need recordings longer than 0.1s.

The other problem is when the values are printed in the terminal view in Simple Parallel. This also hogs the memory and the HDD.

The only way to use your software is to use “Affinity” in Windows and set Logic 2 not to use more than 4 cores out of the 8 I have, so that the computer is not overloaded.

REM /affinity takes a hexadecimal core bitmask: F = 0b1111 = the first 4 cores
start /affinity F C:\Progra~1\Logic\Logic.exe

Just wondering if there has been any progress on the USB LS/FS analyzer?

Either the ability to filter better (like only export information from frames that have data associated with them…)

Or the hooks to allow the HLA to work?


Another quick question, which I will experiment and try to answer myself:

In the thread:

There were two user-added analyzers that might help out. I tried one of them, which is nice; now I thought I would try the second one (QSPI).

The question is: can I specify multiple paths in the preferences for the Custom Low Level Analyzers?
For example, separated by ;?

Again, I can and will try it and see, but I thought I would ask, as maybe the hints or the like should say one way or the other. I am guessing not, as the (i) says: the path…
But I thought I would ask anyway…

Update: multiple paths separated by semicolons do not work.

We only support a single location at the moment (sorry…); however, you can add multiple analyzers to that folder and it will load all of them. I hope that helps.
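The "one folder, many analyzers" behavior can be pictured with a short Python sketch (illustrative only; this is not Saleae's loader, and the file extensions are assumptions): scan the single configured directory and pick up every analyzer library found directly inside it.

```python
# Hypothetical sketch of loading every analyzer library from the
# one configured folder. Extensions are assumed, not Saleae's spec.
import os
import tempfile

def find_analyzers(folder, exts=(".dll", ".so", ".dylib")):
    """Return the analyzer library filenames found directly in `folder`."""
    return sorted(
        entry.name
        for entry in os.scandir(folder)
        if entry.is_file() and entry.name.lower().endswith(exts)
    )

# Demo: a folder with two analyzer libraries and one stray file.
with tempfile.TemporaryDirectory() as d:
    for name in ("qspi_analyzer.dll", "uart_analyzer.dll", "readme.txt"):
        open(os.path.join(d, name), "w").close()
    print(find_analyzers(d))  # ['qspi_analyzer.dll', 'uart_analyzer.dll']
```

So rather than listing several paths in the preference field, copying both custom analyzers into the same directory achieves the same end.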

Curious. Did you investigate my problem where Logic 2 consumed all 8 of my cores, so that I was forced to restart 10 times?

I want to personally apologize for not replying earlier. I missed your last reply.
@timreyes who runs our support will take it from here. He’s more reliable than me :slight_smile:

@toncho11 Sorry you’ve had to force 4 cores under the Affinity settings all this time…

Very recently, we made several changes under the hood of the software, including some performance improvements when handling large amounts of data.

Would you mind giving it a try on your PC and capture file? It is v2.3.17 below:

HDD usage should not have changed since the previous version, so you might still see your PC lock up, assuming this was indicative of an HDD or CPU bottleneck on your end.

In case 2.3.17 doesn’t help at all, we’ll need to go back to the drawing board, or at least do a much better job of setting proper expectations on minimum hardware requirements and HDD usage when using analyzers on large datasets. As Rani mentioned previously, 5GB usage for 19 million results wasn’t out of the ordinary based on our testing. Can you let us know the following?

  • What version of Windows are you running, and is it 32-bit or 64-bit?
  • Can you give some details on your PC specs?
  • For the tests below, you can use the same capture file you sent to us previously for repeatability.
  • Have you noticed by any chance if the progress bar was updating while your HDD/CPU usage was high? (From Rani’s question previously)
  • How long does it take for the progress bar to complete on your PC?
  • Once the progress bar is complete (i.e. Simple Parallel has finished processing), are you still seeing excessive HDD/CPU usage?
  • Can you give some details on the amount of free RAM and HDD space while Logic 2 is processing the Simple Parallel data? Does it look like you’re topping out anywhere?

Finally, we were keeping note of your download link to easily access your file across our team without having to save local copies. It seems your download link expired. Would you mind re-activating your link or sharing a new one? We’ll make sure to save it locally this time to avoid this issue again.

Looking forward to getting to the bottom of this asap.

I am reluctant to lock up my PC again :). I think I might have lost the file.

What I suggest is:

  • Have the performance enhancements somehow make sure 1 CPU core is always spared by Logic 2. Maybe this can be done programmatically, or similarly to my solution.

  • Or create a configuration option in a file, such as “Parallel IO Processing = false”, to be used by everyone with this problem.

  • If many values are detected in a file, it is clear that a human won’t be able to inspect them all. So there could be an option to process only the first 1 000, instead of 1 000 000.

Hope this helps.

@toncho11 Thanks for the suggestions. I’ll review these with the software team here. For your point #3 below:

If many values are detected in a file, it is clear that a human won’t be able to inspect them all. So there could be an option to process only the first 1 000, instead of 1 000 000.

Does the “Delete Data” feature help? See below:

With this, you can save a copy of your original capture, then create a copy of that capture with its captured data length trimmed down for more efficient decoding.