The Coming of the Facial-Recognition Era
    The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face’s ability to send emotional signals, whether through an involuntary blush or the artifice1 of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate2.
    Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. In 2017, Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing3 drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone is expected to use it to unlock the homescreen.
    Set against human skills, such applications might seem incremental4. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.
    The final frontier
    Start with privacy. One big difference between faces and other biometric5 data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte6, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped7 by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. Photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.
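At their core, apps like FindFace work by converting each face into a numeric "faceprint" (an embedding) and comparing it against a gallery of enrolled faces. The sketch below is a toy illustration of that matching step, not any real system's code: the four-dimensional vectors, the names and the 0.7 threshold are all invented for the example (real systems use embeddings of hundreds of dimensions produced by a neural network).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, threshold=0.7):
    """Return the name of the best-matching enrolled face, or None.

    `gallery` maps names to embeddings; `threshold` sets how confident
    the match must be before a name is returned at all.
    """
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy four-dimensional embeddings; real faceprints are far larger.
gallery = {"alice": [0.9, 0.1, 0.0, 0.4], "bob": [0.0, 0.8, 0.6, 0.1]}
print(identify([0.88, 0.15, 0.05, 0.38], gallery))  # → alice
```

The privacy point follows directly from the design: once a gallery exists, identifying a stranger costs nothing more than one photograph and a loop over the database.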
    The face is not just a name-tag. It displays a lot of other information―and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions, such as Hajdu-Cheney syndrome8, far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.9 But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm10 could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.
    Keys, wallet, balaclava11
    Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence―even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities. Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities. Such biases have cropped up in automated assessments used to inform courts’ decisions about bail and sentencing.12
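The bias problem described above follows from the probabilistic nature of these systems: an operator must choose a decision threshold, and the same threshold can yield very different error rates for different groups if the system scores them differently. The toy figures below are invented purely to illustrate the mechanism; they come from no real dataset.

```python
def false_match_rate(impostor_scores, threshold):
    """Fraction of non-matching pairs wrongly accepted at a given threshold."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# Hypothetical similarity scores for pairs of *different* people,
# from two demographic groups the system was unevenly trained on.
# Group B's impostor scores run higher because the model separates
# its faces less well.
group_a = [0.31, 0.42, 0.55, 0.28, 0.60, 0.47]
group_b = [0.52, 0.66, 0.73, 0.58, 0.49, 0.70]

threshold = 0.65
print(false_match_rate(group_a, threshold))  # → 0.0
print(false_match_rate(group_b, threshold))  # → 0.5
```

One fixed threshold, two very different outcomes: members of the worse-modelled group are far more likely to be wrongly flagged as someone else, which is exactly how bias in training data becomes bias in arrests and bail decisions.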
    Eventually, continuous facial recording and gadgets13 that paint computerised data onto the real world might change the texture of social interactions. Dissembling helps grease the wheels of daily life.14 If your partner can spot every suppressed yawn, and your boss every grimace15 of irritation, marriages and working relationships will be more truthful, but less harmonious. The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone’s face. Relationships might become more rational, but also more transactional.
    In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent16―so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates’ images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally.17 Firms that use such technologies should be held accountable.
         Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle18 facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes19. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans. Governments will not want to forgo its benefits. Change is coming. Face up to it.
    1. artifice: trickery, cunning.
    2. dissimulate: to conceal (one’s true feelings or intentions).
    3. ride-hailing: summoning a car and driver on demand via an app.
    4. incremental: increasing gradually, step by step.
    5. biometric: relating to the measurement of biological features for identification.
    6. VKontakte: Russia’s largest social network; the name means “in contact”.
    7. scrape: originally “to obtain with difficulty”; here, to harvest information automatically with a crawler, a type of data-collection program.
    8. Hajdu-Cheney syndrome: a hereditary disorder of bone development with acro-osteolysis, first reported by the radiologists Hajdu in 1948 and Cheney in 1965.
    9. autistic: affected by autism; elusive: hard to grasp or understand.
    10. algorithm: a set of rules to be followed in calculation or problem-solving.
    11. balaclava: a woollen hood that leaves only the eyes and nose exposed; originally worn for warmth, it is often worn by special forces, terrorists and robbers because it hides the face and conceals identity.
    12. crop up: to occur, appear; bail: release of an accused person pending trial; sentence: to pass judgment, impose a punishment.
    13. gadget: a small electronic or mechanical device.
    14. dissemble: to disguise (one’s real feelings or thoughts); grease: to lubricate.
    15. grimace: a twisted facial expression of pain, disgust or the like.
    16. embed sth. in: to fix firmly in, make an integral part of; decree: to order officially; consent: permission, agreement.
    17. audit: a rigorous inspection or review; propagate: to spread, promote.
    18. bamboozle: to fool, deceive.
    19. regime: a government, especially an authoritarian one.