bigdickcheney
2021-08-03T02:13:18+00:00
On August 5, Apple announced a new child-safety initiative. While preserving user privacy, images that users upload to iCloud Photos or send through iMessage are checked by a hashing system and compared against the hash database provided by the US National Center for Missing & Exploited Children (NCMEC below) to identify and report collected child sexual abuse images. A violation is reported only after a certain threshold is exceeded, so potential crimes can be investigated without weakening users' privacy. The system will go live with the iOS 15 update.
PDFs with more technical details are on Apple's official site: [url]https://www.apple.com/child-safety/[/url]
[url]https://cn.wsj.com/articles/%E8%8B%B9%E6%9E%9C%E5%85%AC%E5%8F%B8%E8%AE%A1%E5%88%92%E8%AE%A9iphone%E6%A3%80%E6%B5%8B%E5%84%BF%E7%AB%A5%E8%89%B2%E6%83%85%E5%9B%BE%E7%89%87%EF%BC%8C%E5%BC%95%E5%8F%91%E9%9A%90%E7%A7%81%E4%BA%89%E8%AE%BA-11628208908[/url]
Known information:
0. It does not apply to photos or other images on a device that are not uploaded to Apple's servers
(turning off iCloud Photos avoids it)
1. For now, only devices in the US running the new iOS 15 / iPadOS 15 systems will get this technology
(China-region iCloud is hosted on Guizhou-Cloud Big Data (云上贵州), so it is unaffected)
2. Images are not scanned in the cloud; matching happens on the device against a database of known CSAM image hashes (see the sketch after this list)
[img]https://img.nga.178.com/attachments/mon_202108/06/-7Q174-2gm1K1vT1kSfn-46.jpg[/img]
3. Apple itself cannot decrypt the safety vouchers; only once an account exceeds the threshold can the matched vouchers be interpreted, after which a human reviewer checks whether the flagged content really is child pornography.
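As a rough illustration of point 2 (the sketch promised above): Apple's real algorithm is NeuralHash, a neural-network perceptual hash, which I can't reproduce here. The toy "average hash" below and the database values are stand-ins, just to show what "match a perceptual digest against a known-hash database on the device" looks like.
[code]
# Toy on-device matching sketch. NOT NeuralHash: this "average hash" (aHash)
# is a simple stand-in perceptual digest; KNOWN_HASHES values are placeholders.
from PIL import Image  # pip install Pillow

def average_hash(path: str) -> int:
    # Downscale to 8x8 grayscale; each bit says whether a pixel is brighter
    # than the mean, giving a 64-bit digest that survives resizing and
    # mild recompression.
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

KNOWN_HASHES = {0x8F3C00001234ABCD}  # hypothetical database entries

def flag_before_upload(path: str) -> bool:
    # In Apple's design the comparison result is hidden from the device by
    # private set intersection (see the quote below); here it's in the clear.
    return average_hash(path) in KNOWN_HASHES
[/code]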
Excerpted from the official site ...
[quote]Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.[/quote]
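The "private set intersection" in the first paragraph of the quote can be pictured with the classic Diffie-Hellman-style PSI: each side blinds its hashed items with a secret exponent, and only doubly-blinded values are compared. To be clear, this is my own simplified toy, not Apple's protocol (in particular, here the device learns the result, while Apple's construction hides it from the device and reveals it to the server only past the threshold), and it uses a plain multiplicative group where real PSI would hash into an elliptic-curve group:
[code]
# Toy Diffie-Hellman-style PSI (semi-honest sketch, not production crypto,
# and not Apple's exact protocol).
import hashlib, math, secrets

P = 2**127 - 1  # a Mersenne prime; illustrative modulus only

def rand_exponent() -> int:
    # Secret exponent coprime to P-1, so x -> x^e is a bijection mod P.
    while True:
        e = secrets.randbelow(P - 2) + 2
        if math.gcd(e, P - 1) == 1:
            return e

def h2g(item: bytes) -> int:
    # Hash an item to a nonzero element mod P.
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P or 1

device_items = [b"photo-1", b"photo-2", b"photo-3"]   # hypothetical
server_items = [b"photo-2", b"known-bad-1"]           # hypothetical database

a, b = rand_exponent(), rand_exponent()

# Device blinds its hashes with a and sends them to the server.
blinded = [pow(h2g(x), a, P) for x in device_items]
# Server re-blinds them with b, and blinds its own set with b.
double_blinded = [pow(v, b, P) for v in blinded]
server_blinded = [pow(h2g(y), b, P) for y in server_items]
# Device finishes the server's blinding with a; equal items now collide.
server_final = {pow(v, a, P) for v in server_blinded}
print([x for x, v in zip(device_items, double_blinded) if v in server_final])
# -> [b'photo-2']
[/code]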
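The "threshold secret sharing" in the second paragraph is, in spirit, Shamir secret sharing over a prime field: each positively-matching voucher carries one share of a key, and the key (and hence the voucher contents) only becomes recoverable once the share count reaches the threshold. A minimal sketch; the field, threshold, and share counts below are made up:
[code]
# Toy Shamir threshold secret sharing: below t shares the secret is
# information-theoretically hidden; with t shares it is exactly recoverable.
import secrets

P = 2**127 - 1  # prime field modulus (illustrative choice)

def split(secret: int, n: int, t: int):
    # Random degree-(t-1) polynomial with constant term = secret.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        y = 0
        for c in reversed(coeffs):  # Horner evaluation mod P
            y = (y * x + c) % P
        return y
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = secrets.randbelow(P)        # stand-in for the key protecting vouchers
shares = split(key, n=30, t=10)   # hypothetical: one share per matching image
print(reconstruct(shares[:10]) == key)  # True: threshold reached
print(reconstruct(shares[:9]) == key)   # False (w.h.p.): below threshold
[/code]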
Potential problems:
0. It opens a door: there is now a path by which humans can look at your private photos, and that path is a black box to the user; you have no way of knowing whether your photos will be seen by someone else
1. If it can scan for child pornography, that goes some way toward showing it could also scan for photos of protests, movies, apps, landmark buildings, or targeted individuals; once the precedent is set, there's no telling what comes next
2.
Suggestions for users worried about their privacy:
0. Turn off iCloud Photo Library
1. Stop using Apple devices
2. Back up to a personal NAS instead