For as little as $0.12 per record, data brokers in the US are selling sensitive personal data about both active-duty military members and veterans, including their names, addresses, geolocation, net worth, and religion, as well as information about their children and health conditions.
In an unsettling study published today, researchers from Duke University approached 12 data brokers and bought thousands of records about American service members with minimal vetting.
The study highlights the serious privacy and national security risks created by data brokers. These companies are part of a shadowy multibillion-dollar industry that collects, aggregates, buys, and sells data, practices that are currently legal in the US and that exacerbate the erosion of personal and consumer privacy. Read the full story.
—Tate Ryan-Mosley
The inside scoop on watermarking and content authentication
Last week, President Biden released his executive order on AI, a sweeping set of rules and guidelines designed to improve AI safety and security. The order placed great emphasis on watermarking and content authentication tools, which aim to label content so people can determine whether it was made by a machine or a human. The White House is making a big bet on these methods as a way to fight AI-generated misinformation.
The White House is encouraging tech companies to create new tools to help users discern whether audio and visual content is AI-generated, and plans to work with the group behind the open-source internet protocol known as the Coalition for Content Provenance and Authenticity, or C2PA. Tate Ryan-Mosley, our senior tech policy reporter, has written a helpful guide to C2PA, what it can achieve, and, crucially, what it can't. Read the full story.
This story is from The Technocrat, our weekly newsletter covering tech and politics. Sign up to receive it in your inbox every Friday.
