Comment by jtrueb
21 hours ago
Obviously there is a lot of work here, but I am a bit confused. If you already have lab code in Julia, Matlab, R, Python, Excel, etc., what is the motivation to use this tool? Is this hot in a specific community?
I suppose this is a FOSS solution for roughly the same space occupied by commercial tools like Origin, which are very popular in some scientific communities.
They can be useful if you have other tools (e.g. measurement software) that already produce the data you want, and you just want a GUI tool to create plots, and maybe do some simple things like least squares curve fitting etc.
If you already do a lot of data wrangling in something with a programming language and plotting libraries accessible from that language, like the ones you mention, then yeah, this is not the tool for you.
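For what it's worth, the scripted equivalent of that kind of menu-driven fit is only a few lines in Python; a minimal sketch, where the file name, column names and model are all made-up stand-ins:

    # Minimal least-squares fit, roughly what a GUI tool wraps behind a menu.
    # The file name, column names and model are hypothetical placeholders.
    import numpy as np
    import pandas as pd
    from scipy.optimize import curve_fit

    def model(x, a, b, c):
        # Example model: exponential decay with an offset.
        return a * np.exp(-b * x) + c

    data = pd.read_csv("measurement.csv")
    x, y = data["time"].to_numpy(), data["signal"].to_numpy()

    popt, pcov = curve_fit(model, x, y, p0=(1.0, 0.1, 0.0))
    perr = np.sqrt(np.diag(pcov))  # 1-sigma parameter uncertainties
    print(popt, perr)

Which is kind of the point: trivial if you already code, a barrier if you don't.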
It is! I remember using this (or SciDAVis, a related project) a couple of years back in college. It was not as powerful as Origin 10 years ago, but it ran on Linux.
This is great for people who don't know nor want to learn to program.
Same experience here! We used Origin and/or QtiPlot in a physics lab for the graphs and quick regressions.
I'm in potentially the target demographic for this. I regularly bounce between R, Python, Maxima, and occasionally MATLAB/Octave. Passing data between these is usually done using the lowest common denominator: CSV. Having four completely different interfaces to these tools is a hassle. I'm also not a big fan of Jupyter and if this feels better for me it might be a decent Jupyter replacement even without the cross-language stuff.
I'm someone who enjoys figuring out the details of making a nice looking plot (in base R; I can't stand ggplot), but even as someone who enjoys it, LLMs are pretty much good enough that if I explain to them how I want the plot to look and how my data is structured, they can generate code that works on the first shot. It seems to me that, at this point, if you are already doing some coding in one of the above languages but either don't like or aren't comfortable making plots with them, LLMs can solve that for you. Unless they are significantly worse in the non-R options, which could be the case; it wouldn't surprise me if R has more plotting examples in the training set than the other languages.
Sorry for the off-topic question, but would you mind elaborating on why you can't stand ggplot? I personally haven't spent much time with the base R functions but have come to absolutely adore ggplot for graphing, so I'm very interested in hearing potential reasons to use base R plotting functions instead!
In my experience, there are people out there who don't program, or who don't feel that it's a productive way of doing things. I'm firmly in the Python camp, but recognize that my workplace has several JMP licenses, and the majority of engineers are satisfied with Excel. And I never let anybody see how long it takes me to do things. ;-)
However, those people also belong to the majority of the world that is still leery of "open source" or anything that doesn't come from a known brand.
This thing could be an option for someone who wants to mess around with data but isn't comfortable mentioning it to the boss until they see for themselves if it's worthwhile.
It's the use case. Here is one concrete example. I worked as a project engineer during the development of a launch vehicle. The telemetry data frames from every test and every flight were processed into numerous CSV or TSV files, each labeled with its parameter name. Those files could be very large depending on their sampling rates, especially for tests that lasted hours on end. You would conduct exploratory manual analysis on that data, which involves:
* Quickly cycle visually through time series graphs (often several hundred parameters). You'd have seen most of those parameters before and would quickly catch any anomalies. You can clear so much data rapidly like this.
* Quickly analyze a graph at various zoom and pan settings. Maybe save some as images for inclusion in documents. Like above, the zoom and pan operations often follow each other within a matter of seconds.
* Zoom into fine details, down to single bit levels or single sample intervals. There's a surprising amount of information you can glean even at these levels. I have run into freak but useful single events at these levels. And since they're freak events, it's hard to predict in advance where they'd show up. So operation speed becomes a key factor again.
* Plot multiple parameters (sometimes with different units) together to assess their correlation or unusual events. We used to even have team analysis sessions where such visualizations were prepared on demand.
* Do statistical or spectral analysis (like periodograms, log or semi-log graphs, PDFs, etc.)
* Add markers or notes within the graph (usually to describe events). Change the axes or plot labels. Change grid value formatting (e.g. do you want time in seconds or HMS?).
All the operations above are possible with Julia, Matlab, R or Python. And we did use almost all of them (depending on personal preference). But none of them suit the workflow described above for one simple reason: speed. You don't have enough time to select each parameter by text or GUI. There must be a way to either quickly launch a visualization or cycle through the parameters as the investigator closes each graph. You also don't have time to set zoom, pan and labels by text. It must be done with the mouse (zoom and pan) and directly on the graph (labels and markers) in a WYSIWYG manner. And you don't want to run an FFT or a filter function, save the new series and then plot it - you want it done with a single menu selection. The difference is like using a C++ compiler vs Python in JupyterLab. The application we used was very similar to LabPlot.
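To make the speed point concrete, here is a rough Python/matplotlib sketch of just the "cycle through parameters" step. The file layout (time in the first column, one parameter per remaining column) is an assumption, and even this covers none of the zoom, pan, markers or one-click FFT that the dedicated tools give you:

    # Rough sketch: step through telemetry parameters with the arrow keys.
    # Assumes a hypothetical CSV with time in the first column and one
    # parameter per remaining column.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("telemetry.csv")
    time, params = df.iloc[:, 0], list(df.columns[1:])
    idx = 0

    fig, ax = plt.subplots()

    def draw():
        ax.clear()
        ax.plot(time, df[params[idx]])
        ax.set_title(params[idx])
        fig.canvas.draw_idle()

    def on_key(event):
        # Right/left arrows advance or rewind through the parameter list.
        global idx
        if event.key == "right":
            idx = (idx + 1) % len(params)
        elif event.key == "left":
            idx = (idx - 1) % len(params)
        draw()

    fig.canvas.mpl_connect("key_press_event", on_key)
    draw()
    plt.show()

And even this hack takes more fiddling to set up than a menu item, which is the whole argument.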
Now, Excel might seem like a better choice. In fact, both LabPlot and our application have a spreadsheet-like interface with the ability to directly import CSV, TSV, etc. But Excel just doesn't cross the finish line for our requirement. For example, to plot a time series in Excel, you have to select the values (column or cells), designate the axes, optionally define the axis and graph labels, start a plot, expand it to the required size and format the plot. At that rate, you wouldn't finish the analysis in a month. Those applications would do all of that on their own (the labels and other metadata were embedded in the data files by means of formatted comments). But an even bigger problem was the size of the data. Some of those files would, on import, slow Excel down to the speed of molasses. Our application had disk- and memory-level buffering that kept the interaction almost instant even on those files.
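For illustration, the metadata-in-comments trick might look something like this; the actual convention we used isn't reproduced here, so this sketch assumes a hypothetical "# key: value" header at the top of each TSV:

    # Sketch: pull plot metadata out of comment lines before loading the data.
    # The "# key: value" header convention is a made-up stand-in for the
    # actual format, which embedded labels and units in the same general way.
    import pandas as pd

    def read_with_metadata(path):
        meta = {}
        with open(path) as f:
            for line in f:
                if not line.startswith("#"):
                    break
                key, _, value = line.lstrip("# ").partition(":")
                meta[key.strip()] = value.strip()
        data = pd.read_csv(path, sep="\t", comment="#", header=None,
                           names=["time", meta.get("label", "value")])
        return meta, data

    # A file might start with:
    #   # label: Chamber pressure
    #   # unit: bar
    # after which the tool can title the graph and label axes automatically.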
I hope this gives you an idea of where the tools you mentioned are not good enough replacements for LabPlot and similar tools.
Sounds like PlotJuggler (https://github.com/facontidavide/PlotJuggler) could be worth checking out for you as well.
I'm also space and launch-vehicle adjacent. I use vnlog for data storage (like what you described, but with better tooling support) and feedgnuplot/gnuplotlib for visualization. Works great. The learning curve is gentle; you can get going and start analyzing stuff FAST. Making complex plots is fiddly, but it usually is with any tool.
Thank you for this fantastic elaboration. I am in a very similar boat (unmanned aerospace) and have very similar needs. I’ve been chewing on making my own application to do this but LabPlot looks like it has potential to be exactly what I’ve been dreaming about for a few years.
Haven't tried this tool yet, but if it lets me drag and drop my data and visuals, that sounds like a great addition to those tools.