On June 7, during a training exercise in the Baltics, four U.S. Army Stryker vehicles driving along a road between Kaunas and Prienai, Lithuania, collided when the lead vehicle braked too hard for an obstacle on the roadway. Not long after the incident, a blog post made to look like a popular Lithuanian news outlet claimed the Americans had killed a local child in the collision.
The post included a doctored image showing unconcerned soldiers standing near a crushed bicycle and a child’s corpse.
“This is a very typical example of the hostile information, and proves we are already being watched,” Lithuanian Defense Minister Raimundas Karoblis said of the fabricated event during a June 8 meeting with NATO officials. “We have no doubt that this was a deliberate and coordinated attempt aiming to raise general society’s condemnation to our allies, as well as discredit the exercises and our joint efforts on defense strengthening.”
In this case, the phony image and news article were quickly refuted, but what happens when it’s not so easy to tell truth from fiction?
The ability to distort reality is expected to reach new heights with the development of so-called “deep fake” technology: manufactured audio recordings and video footage that could fool even digital forensic experts.