Apple has announced that it has withdrawn its plans to scan photos in users' iCloud for child sexual abuse material (CSAM). Following criticism from civil society and expert communities, Apple paused the rollout of the feature in September 2021. The company will now focus on its Communication Safety feature, announced in August 2021, which allows parents and caregivers to opt into protections on iCloud. Apple is also developing a new feature to detect nudity in videos sent through Messages and plans to expand it to its other communication applications.