

        Apple's Plan to Search for Child Sexual Images Concerns Privacy Activists

Author: Mario Ritter    Published: 8-12-2021

        Apple recently announced plans to use a tool designed to identify known child sexual images on iPhones.

        The decision was praised by child protection groups. But some privacy activists and security researchers have raised concerns. They warn that the system could be misused to search for other kinds of information or be used by governments to watch citizens.

        How does it work?

        Apple says the tool, called "NeuralHash," will scan all images kept on the device that are sent to iCloud, the company's online storage system. iPhone users can choose in their settings whether to send photos to iCloud or have them remain on the device. If the images are not sent to iCloud, Apple says they will not be scanned by the new tool.

        The system searches for photos included in a database of known child sexual abuse images collected by law enforcement. Apple's scanning system will change the images into a "hash." This is a numerical piece of data that can identify the images but cannot be used to recreate them. This hash will be uploaded and compared against the law enforcement image database.

        If the system matches an image with one in the database, it will be examined by a human. If the person confirms the image as a match, the device user's account will be locked and the National Center for Missing and Exploited Children (NCMEC) will be contacted.
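The matching process described above can be sketched in a few lines of code. This is a minimal illustration of the logic only, not Apple's actual system: the real NeuralHash is a perceptual hash produced by a neural network so that resized or recompressed copies of an image still match, while this sketch uses an ordinary SHA-256 hash as a stand-in, and the function names and sample data are invented for illustration.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Turn image data into a fixed-size numerical fingerprint.
    The hash can identify the image but cannot be used to recreate it.
    (SHA-256 here is an illustrative stand-in for NeuralHash.)"""
    return hashlib.sha256(image_bytes).hexdigest()

# Database of hashes of known abuse images (illustrative placeholder data).
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the uploaded image's hash matches the database.
    In the system Apple describes, a match is then examined by a human
    reviewer before the account is locked and NCMEC is contacted."""
    return image_hash(image_bytes) in known_hashes

print(check_upload(b"known-image-1"))  # in the database -> True
print(check_upload(b"holiday-photo"))  # not in the database -> False
```

The key point of the design is that only hashes are compared, so the database holds no viewable images, and images never sent to iCloud are never hashed at all.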

        The system is designed to only identify images already included in the existing database. Apple says parents taking innocent photos of unclothed children need not worry about such images being identified.

        Concerns about possible abuse

        Some security researchers have criticized the way NeuralHash "sees" the images and say the system could be used for dangerous purposes.

Matthew Green is a top cryptography researcher at Johns Hopkins University. He told the Associated Press that he fears the system could be used to accuse innocent people. Someone could send users images that seem harmless but that the system would report as child sexual material. Green said researchers have been able to easily fool similar systems in the past.

        Another possible abuse could be a government seeking to watch dissidents or protesters. "What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."

        In an online explanation of its system, Apple said it "will refuse any such (government) demands."

        Apple has been under pressure from governments and law enforcement to permit increased observation of data that it encrypts on its devices. The company said its new tool was designed to operate "with user privacy in mind." It also claimed the system was built to reduce the chance of misidentification to one in one trillion each year.

        However, some privacy researchers said the system represents a clear change for a company that has been praised for its leadership on privacy and security.

        In a joint statement, India McKinney and Erica Portnoy of the Electronic Frontier Foundation warned that Apple's new tool "opens a backdoor to your private life." The two noted that it may be impossible for outside researchers to confirm whether Apple is operating the system as promised.

        Apple's system was also criticized by former U.S. National Security Agency contractor Edward Snowden. Snowden lives in exile because he is wanted in the U.S. on spying charges linked to his release of information on secret government programs for gathering intelligence.

        He tweeted that with the new tool, Apple was offering "mass surveillance to the entire world." Snowden added: "Make no mistake, if they can scan for kiddie porn today, they can scan for anything tomorrow."

        Separately, Apple announced it was adding new tools to warn children and parents when sexually explicit images are received or sent. This system is designed to identify and blur such images and warn children and parents about the content. Apple says the tool will only work for messages in child accounts registered in the company's Family Sharing system.

        Apple said the changes will come out later this year with new releases of its device operating systems.

        I'm Bryan Lynn.

        The Associated Press, Reuters and Apple reported on this story. Gregory Stachel and Bryan Lynn adapted the reports for VOA Learning English. Mario Ritter, Jr. was the editor.

        We want to hear from you. Write to us in the Comments section, and visit our Facebook page.

        Words in This Story

        scan v. to look at (something) carefully usually in order to find someone or something

        match n. a person or thing that is equal to another

cryptography n. the use of special codes to keep information safe in computer networks

        encrypt v. to change (information) from one form to another especially to hide its meaning

        surveillance n. the act of carefully watching activities of people especially in order to control crime or the spread of disease

porn (pornography) n. movies, pictures, magazines, etc., that show or describe naked people or sex in an open and direct way in order to cause sexual excitement

        explicit adj. showing or talking about sex or violence in a very detailed way

        blur v. to make (something) unclear or difficult to see or remember
