Sky-scanning complete for Gaia

3 days ago (esa.int)

Direct link to some very, very nice images and animations: https://www.esa.int/Science_Exploration/Space_Science/Gaia

Two of my favorites: https://www.esa.int/ESA_Multimedia/Images/2025/The_best_Milk...

https://www.esa.int/ESA_Multimedia/Images/2025/01/The_best_M...

  • I get how Gaia could make the best edge-on image, but how could Gaia (or anything man-made) get the "best" face-on image?

    • The whole purpose of Gaia is to precisely measure the positions of stars (and other objects). Once positions are known, a 3D model can be built. But how are the distances measured? The answer is parallax, essentially triangulation: you look for very small changes in position against the background sky, using the width of the Earth's orbit as the baseline and measuring at different times of the year.
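
      The parallax-to-distance conversion itself is simple: a parsec is defined as the distance at which the annual parallax is one arcsecond, so distance in parsecs is just the reciprocal of the parallax in arcseconds. A minimal sketch (using Proxima Centauri's well-known ~0.768″ parallax):

```python
def parallax_to_distance_pc(parallax_arcsec: float) -> float:
    """Distance in parsecs is the reciprocal of the parallax in arcseconds."""
    return 1.0 / parallax_arcsec

# Proxima Centauri's parallax is about 0.768 arcseconds:
d_pc = parallax_to_distance_pc(0.768)
d_ly = d_pc * 3.2616  # 1 parsec is roughly 3.2616 light years
print(f"{d_pc:.2f} pc ≈ {d_ly:.1f} ly")  # 1.30 pc ≈ 4.2 ly
```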

    • All of these are "Artist's Impressions". My best guess is they run a simulation based on the data from the spacecraft and can then pan the camera around as they see fit.

    • It can't. The galaxy is assumed to be roughly symmetrical, and they fill in the missing data with what we can see on our side of the galaxy. It's "best" in the sense that it's the most accurate fiction, I suppose.

      Gaia is good to about 13,000 light years: https://en.wikipedia.org/wiki/File:Galaxymap.com,_map_12000_...

      The Milky Way is maybe 100,000 light years in diameter. So we're only getting good distance readings on a small fraction of it, and nothing behind the central bulge of our galaxy. The first won't improve until we send an astrometry telescope way outside the orbit of the Earth, for better baselines, and the second is going to need a telescope sent 10,000 light years out of the galactic plane.
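
      That ~13,000 light-year range is roughly consistent with Gaia's quoted 24 microarcsecond accuracy (mentioned elsewhere in the thread), under the rule of thumb that a usable distance needs a parallax around ten times the measurement error:

```python
SIGMA_UAS = 24        # Gaia's quoted positional accuracy, microarcseconds
LY_PER_PC = 3.2616    # light years per parsec

# Rule of thumb: for ~10% distance error, the parallax should be
# roughly ten times the measurement error, i.e. 240 microarcseconds.
parallax_arcsec = 10 * SIGMA_UAS * 1e-6
d_pc = 1.0 / parallax_arcsec
print(f"{d_pc * LY_PER_PC:,.0f} ly")  # 13,590 ly
```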

Gaia has a 1.0 × 0.5 m focal plane array on which light from both telescopes is projected. This in turn consists of 106 CCDs of 4500 × 1966 pixels each, for a total of 937.8 megapixels.
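
The quoted total is straightforward to check:

```python
ccds = 106
width_px, height_px = 4500, 1966  # pixels per CCD

total_px = ccds * width_px * height_px
print(total_px, f"= {total_px / 1e6:.1f} megapixels")  # 937782000 = 937.8 megapixels
```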

Neat.

  • The really neat part is the instrument precision. It's terrifyingly good and I have no idea how it (really) works.

    - "Gaia measures their positions to an accuracy of 24 microarcseconds, comparable to measuring the diameter of a human hair at a distance of 1000 km"

    https://www.esa.int/Science_Exploration/Space_Science/Gaia/C...

    • To nitpick the quote: it's capable of measuring to an accuracy of ~120 μm at 1000 km. So it cannot accurately measure the diameter of a human hair (which ranges from around 20 to 200 μm) at that distance, but only measure to an accuracy of a human hair's width.
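
      The nitpick checks out under the small-angle approximation (a sketch; 206,265 is the number of arcseconds per radian):

```python
ARCSEC_PER_RAD = 206_265  # arcseconds in one radian (approximately)

def linear_size_m(angle_arcsec: float, distance_m: float) -> float:
    """Small-angle approximation: linear size ≈ angle (in radians) × distance."""
    return (angle_arcsec / ARCSEC_PER_RAD) * distance_m

# 24 microarcseconds subtended at 1000 km:
size_m = linear_size_m(24e-6, 1_000e3)
print(f"{size_m * 1e6:.0f} µm")  # 116 µm, roughly a hair's width
```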

    • It takes about 14 pictures of each star per orbit (Gaia's orbit around the Sun is quite close to Earth's), so roughly one per month, and then compares those to calculate the star's distance from the parallax.

  • And Gaia also has a downlink speed of only about 3 Mbps. So it processes as much as possible on board and sends down fewer than 20 pixels for each star imaged. That is why you cannot get a direct image out of it.

Has anyone created a 3D map, available via the web, with the ability to fly through, jump to stellar objects by name, look around, etc.?

I wonder if it could keep giving us useful data without the precision rotation? Intuitively it seems like we should be able to figure out where it's pointing by star-matching plus dead reckoning based on the last frame.

  • It's possible... but the point of this instrument is to measure star positions very precisely. It probably already has a star tracker doing what you're suggesting for coarse pointing. Relying on that type of positioning info could eventually introduce inaccuracies into the measured data.

    Also, every mission comes to an end eventually - better to do it the right way and have the right amount of propellant saved for either a graveyard orbit or de-orbiting. It met the mission timeline and goals.

    • Yeah, OK. Still, it seems like it could produce a lot of very useful data if switched to a blind spinning mode.

  • When you're trying to take relative measurements of the motion of objects within the field of view, which is how you do these fine position measurements, you don't have many choices for reference objects whose measured location accuracy is similar to the measurements you're taking. You're measuring the precise position of something compared to other things whose precise positions you don't know. That requires a very stable platform so you can do the comparisons internally.

    There are lots of options for precisely determining where you're pointing, but pretty much all of them involve a loss of precision whenever the platform moves: if you have to slew to a guide star and back, your accuracy is limited by the measurement of the guide star plus the error induced when you slewed. You also end up using different instruments for alignment measurements vs. the actual observation (for example, because the observation requires a long time at a static position), and there's an imprecision involved in comparing the different instruments because they move slightly relative to each other with thermal effects and so on.

    When you really get into it, using some instruments will reduce the precision of other instruments because they vibrate the platform or create heat. You have to account for all of this with a complex model.

    In practice these highly precise measurements, at least in the domain I'm familiar with, become sensor fusion problems where you take a lot of sources of position info, weight them based on their accuracy, and integrate them over time. The less stable the platform, the more error is induced by the integrating over time. Nothing in that realm is really all-or-nothing, as we're seeing with Hubble as it racks up more and more failures, but the loss of the rotation will mean more error in combining position references which will mean less accurate final observations. They may no longer be that much more accurate than measurements obtained by other means.

    I'm not sure I explained that very coherently; it's a complex field that I used to write software in, but, well, I was the person writing the software, not the person figuring out the theory. The general idea is that space-based instruments tend to have a bunch of different factors that go into their final accuracy, and that accuracy normally gets worse over time as you run out of fuel and things degrade and ultimately stop functioning. Fortunately, since space-based systems cost so much to build and launch, the teams behind them have usually put a lot of thought into how they'll continue to get the best use out of them as they age. That often means having plans for follow-on missions that just don't require as much accuracy, which is the case with Gaia: it's ending this "phase" of the mission plan.
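
    The weighting idea can be sketched as inverse-variance fusion, a toy illustration rather than Gaia's actual pipeline: each position estimate is weighted by 1/σ², so more accurate sources dominate the combined result.

```python
def fuse(estimates):
    """Inverse-variance weighted mean of (value, sigma) pairs.

    More accurate sources (smaller sigma) get proportionally more weight;
    returns the fused value and its (smaller) combined sigma.
    """
    weights = [1.0 / sigma**2 for _, sigma in estimates]
    total = sum(weights)
    value = sum(w * v for (v, _), w in zip(estimates, weights)) / total
    return value, (1.0 / total) ** 0.5

# A precise and a noisy measurement of the same angle:
fused, err = fuse([(10.00, 0.1), (10.40, 0.5)])
print(f"{fused:.3f} ± {err:.3f}")  # 10.015 ± 0.098
```

    Note how the fused uncertainty ends up slightly better than the best individual source; the noisy measurement still helps, just not much.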

  • The current coordinate system is based on extremely distant radio sources. Ground-based scopes found some bright sources which Gaia aligned to, and it measures everything relative to those. And now Gaia is the defining source of the ICRS for optical observations.

Uh-huh, just in time it seems.

"Gaia’s fuel tank is now approaching empty"

Well, congrats to all involved in such a supremely successful and important mission. When I went to school, it was said that astronomers were happy if they got the order of magnitude of a measurement right. No such excuses anymore (at least for some 2 billion "nearby" objects)!

Hope they have captured an image of Planet Nine somewhere in there, and are eventually able to pinpoint it.

  • I'm really curious whether it exists and whether they will ever be able to find it. It's wild to consider that we can map the galaxy but can't spot something relatively close to us. But detection relies on reflected sunlight, or on it blotting out stars behind it, and given the supposed distance, both would be minimal.

    That said, if it does exist I'm sure they'll find it eventually.

  • I don't think it would be detectable.

    Gaia is a satellite for mapping star positions, and any possible planet past Neptune would be very faint.

    It's also not a telescope in the traditional sense; it's more like a barcode scanner in a supermarket, but spinning around.

And now to use the data to make the most realistic scifi game. With correct stellar motion during relativistic travel.

I really wish they had identified Gaia as some kind of satellite. Gaia is also a name for Earth itself.

You may want to know: the high-res images offered for download contain the same image shown on the page, i.e. the infographic.

Not worth the download; I had expected a huge panorama of the sky.

  • For real data you can use Gaia ESA archive: https://gea.esac.esa.int/archive/

    I went to study for an MSc in Space Science and Technology as a hobby a few years ago. In one course (2022) we had an assignment to find supernovae in recent Gaia data (Python code). Then we made sure a candidate was observable by the university's robotic telescope (and compatible with the local weather forecast). Next we requested the observation from the telescope and, if successful, received the pictures the next day. We had to analyse the results as well. It surprised me how much data is actually available in quite open formats from ESA missions.

    Controlling a remote telescope a few thousand kilometres away was also a nice experience.

  • The downlinked data is claimed to be 142TB compressed. I suspect that the huge panorama might be a little big for your computer.
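
    Taking the 142 TB figure together with the ~3 Mbps downlink rate quoted upthread, a back-of-the-envelope calculation shows why on-board processing is essential:

```python
data_bits = 142e12 * 8   # 142 TB compressed, as quoted above
rate_bps = 3e6           # ~3 Mbps downlink, as quoted upthread

years = data_bits / rate_bps / (365.25 * 24 * 3600)
print(f"{years:.1f} years of continuous downlink")  # 12.0 years
```

    That is close to the mission's full operational lifetime spent doing nothing but transmitting, which is consistent with the claim above that Gaia could never send raw images home.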