The US Department of Defense has deployed machine learning algorithms to help identify targets in more than 85 air strikes in Iraq and Syria this year.
The Pentagon has done this sort of thing since at least 2017, when it launched Project Maven, which sought suppliers capable of developing object recognition software for footage captured by drones. Google pulled out of the project when its own employees revolted against using AI for warfare, but other tech firms have been happy to help out.
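For readers wondering what "object recognition software for drone footage" actually involves, here is a rough, purely illustrative sketch: running individual video frames through an off-the-shelf pretrained detector. This is not the Pentagon's or Project Maven's actual system; the torchvision model choice, the file path, and the confidence threshold are all arbitrary examples.

```python
# Illustrative only: generic object detection on a single video frame using a
# COCO-pretrained torchvision model (classes like "person", "truck", "boat").
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Load a pretrained Faster R-CNN detector and switch it to inference mode.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical path to one frame extracted from a video feed.
frame = Image.open("frame_0001.jpg")

with torch.no_grad():
    # The model takes a list of image tensors and returns one prediction dict per image.
    predictions = model([to_tensor(frame)])[0]

# Keep only confident detections; 0.8 is an arbitrary example threshold.
for box, label, score in zip(predictions["boxes"], predictions["labels"], predictions["scores"]):
    if score > 0.8:
        print(f"class {label.item()} at {box.tolist()} (confidence {score:.2f})")
```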
I hope they taught those things the difference between a military base and a hospital or a wedding this time.
I always describe the birth and development of AI as being like a trailer-park-trash couple having a new baby: they never finished grade school, they're highly religious, and they believe in ghosts and fairies.
We’re terrible parents who probably shouldn’t have children, yet we have one that is growing fast, and by the time it is fully mature it will be way more powerful and capable than we are … but it will have the morals and ethics its parents taught it.
AI vision systems are already better than humans at distinguishing between a gun and a camera or other gun-like-but-not-a-gun object, so I for one am cautiously optimistic about this sort of thing. People need to bear in mind that humans aren’t the greatest things to be putting in charge of targeting decisions either.