Comment by touisteur

10 days ago

I think they preserve timestamped I/Q data. I know some people looking at down-sampling and preselecting those signals for longer-term storage and deeper reprocessing, and they seem to have a 24h window to 'analyze and keep what you need'.
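
For anyone curious, here's a minimal sketch of what that preselect-and-down-sample step could look like (hypothetical sample rate, center frequency, and decimation factor; plain NumPy/SciPy, not the actual pipeline): mix the band of interest to baseband, anti-alias filter, decimate, archive.

```python
import numpy as np
from scipy.signal import decimate

# Hypothetical parameters -- not any real instrument's values.
FS = 200e6          # original complex sample rate, 200 MS/s
DECIM = 16          # keep a 12.5 MHz slice around the band of interest
F_CENTER = 42e6     # offset of the signal we want to preserve

def preselect_and_downsample(iq: np.ndarray) -> np.ndarray:
    """Mix the band of interest to baseband, then low-pass + decimate.

    iq: 1-D complex64 array of raw I/Q samples at FS.
    Returns a complex64 array at FS / DECIM, ready for archiving.
    """
    t = np.arange(iq.size) / FS
    # Frequency-shift the target band down to 0 Hz.
    baseband = iq * np.exp(-2j * np.pi * F_CENTER * t)
    # decimate() applies an anti-aliasing FIR filter before keeping 1/DECIM samples.
    return decimate(baseband, DECIM, ftype="fir").astype(np.complex64)

# One second of raw complex64 data at 8 bytes/sample is 1.6 GB; ~100 MB after decimation.
raw = (np.random.randn(2**20) + 1j * np.random.randn(2**20)).astype(np.complex64)
archived = preselect_and_downsample(raw)
print(raw.nbytes / archived.nbytes)  # ~16x reduction
```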

We're still in a technological phase where ADCs are far more advanced than the storage and online processing systems behind them, which means throwing away a lot of data. But I have high hopes for a system where you upgrade the computing, network, storage (and maybe the ADCs...) and you get an improved sensor. Throw man-hours at some GPU kernel developers and you get new science. The limit now seems less technological than having enough people and compute to fully exploit the data...
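
Some back-of-the-envelope arithmetic on that imbalance (illustrative figures I made up for the example, not any real instrument's spec):

```python
# Raw data rate for one hypothetical digitized I/Q stream.
sample_rate = 3e9        # 3 GS/s -- plausible for a modern RF ADC
bits_per_sample = 12
components = 2           # I and Q

bits_per_sec = sample_rate * bits_per_sample * components
gbytes_per_sec = bits_per_sec / 8 / 1e9
print(f"{gbytes_per_sec:.1f} GB/s per stream")           # 9.0 GB/s
print(f"{gbytes_per_sec * 86400 / 1e3:.0f} TB per day")  # ~778 TB/day
```

Multiply that by the number of antennas and it's obvious why nobody keeps the raw samples for long.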

Too late to edit: any idea of the resolution the I/Q data is sampled at (bandwidth, bit depth)? I visited one of these installations a while ago and the tour guide really had no clue about any of the details (I think he was the son of one of the scientists).

Fascinating stuff, thank you for the details and the view of a possible path forward.

  • BTW, if you're interested in the concept of upgrading a sensor without retooling the RF part, and the impact of 'just' adding new COTS racked server hardware and engineering man-hours to get a 'new' sensor with new capabilities, have a look at Julien Plante's work on NenuFAR (which isn't like the SKA at all :-): https://cnrs.hal.science/USN/obspm-04273804v1 . Damien Gratadour, his PhD supervisor, is an amazing technologist dedicated to improving astronomy instruments, and I was very lucky to work with him and his team... the things the French can string together with small teams and thin budgets...