Comment by dundarious

4 days ago

I believe 972mag.com has reported on Palantir tech involved in the "AI target selection" programs the Israeli military has used in Gaza. My recollection is that the logic resembles the subprime ratings agency scandal: collate info on individuals (cell tower proximity, movement patterns, social media leanings), take the top 5% of target candidates, call those "high quality" regardless of any absolute metric of quality, and have the human lawyers "in the loop" rubber-stamp air strikes on their homes -- then repeat with the next top 5% and call those "high quality" again. The implication was that Palantir worked on the ranking system itself. (The 5% is arbitrary here, a stand-in for whatever top slice they actually use.)
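To make the point concrete, here is a minimal, purely illustrative sketch of that "relative top slice" labeling pattern. Everything here is hypothetical (the scoring, the 5% fraction, the function name); it only shows how each pass labels the best remaining candidates "high quality" even as the absolute scores drop:

```python
# Illustrative sketch only -- hypothetical scores, not any real system.
def top_slice_rounds(scores, fraction=0.05, rounds=3):
    """Repeatedly label the top `fraction` of remaining candidates."""
    remaining = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    batches = []
    for _ in range(rounds):
        k = max(1, int(len(remaining) * fraction))
        batch, remaining = remaining[:k], remaining[k:]
        # Every batch gets the same "high quality" label, regardless of
        # how low the absolute scores in it have fallen.
        batches.append([name for name, _ in batch])
    return batches

# 100 fake candidates with steadily decreasing scores.
scores = {f"cand{i}": 1.0 - i / 100 for i in range(100)}
print(top_slice_rounds(scores))
```

The analogy to the ratings-agency scandal is that the label is defined relative to the remaining pool, not against any fixed quality bar, so the label's meaning silently degrades with each round.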

There are a couple of such systems, and I don't have time right now to find those articles to confirm or correct my recollections, so consider this a prompt for a proper review -- ironic.

This comment may be a good stepping stone: https://news.ycombinator.com/item?id=46222724

The human in the loop reportedly gets only a few seconds to decide whether it's a target; I don't know the exact number.