Automated facial recognition

As processing power becomes cheaper and smarter software is produced, it seems inevitable that more and more people and organizations will begin to identify people automatically by recognizing their faces with surveillance cameras.

London’s Heathrow airport is planning to install such a system, and Facebook may be the ultimate database to let freelancers do it themselves.

To me, it is all rather worrisome. At a basic level, life becomes more paranoid and less creative and interesting when you are being watched at all times and all of your actions are being archived forever. It’s only a matter of time before photos from every fun party ever are being combed through by investigative journalists hoping to catch someone who has become famous in an embarrassing-looking situation. Facial recognition allows for the creation of databases that can be used for truly evil purposes, from suppression of political dissent to stalking and blackmail.

Like nerve gas, facial recognition is probably one of those technologies we would be better off un-inventing, if only we could.


Milan July 25, 2011 at 6:31 pm

William Gibson’s novel Zero History raises some of the issues associated with ubiquitous surveillance.

The police are already using facial recognition technology in Canada.

. July 25, 2011 at 6:41 pm

Smile: you’re on camera!
Face recognition is only the beginning

DOES that new feature increasingly found in pocket-sized digital cameras—face-recognition technology—really work? It’s actually a lot cleverer than you think. A few years ago, it would have needed a shoe-box of electronics to drive it, and it would still have been hit-or-miss. But in the brutally competitive world of digital photography, Canon, Pentax and Fuji have honed the technology so their popular digicams can take more striking pictures by finding, and then focusing on, the faces in the viewfinder.

With 70% of all photographs being taken primarily of people, much is to be gained from using a face-recognition algorithm stored in the camera’s chip. This scans the image in the viewfinder for a shape resembling a human face—ie, eyes, ears, nose and chin. Once located, the camera can then adjust the focus exclusively for that part of the picture. Some cameras can recognise up to ten faces in a scene and set an average focus, or select just one face from a group and focus on that.

To prevent the camera from locking on to faces in the background, the algorithms used in today’s digicams tend to ignore features smaller than 10% of the viewfinder’s height. The result is a pin-sharp image of the subject’s facial features—the part we’re interested in—amid a slightly blurrier foreground and background. Niftier still, the algorithm can also capture the face’s actual location within the scene. That lets the user zoom in automatically on the face immediately after the picture has been taken, to check everything is okay before saving it.
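The size-filtering and focusing logic described above can be sketched in a few lines. This is a toy illustration, not real camera firmware; the box format, the 10% threshold, and the "average focus" rule are taken from the article, everything else is an assumption:

```python
# Toy sketch of in-camera face filtering and focus selection.
# Boxes are (x, y, width, height) tuples in pixels.

def filter_faces(boxes, frame_height, min_fraction=0.10):
    """Keep only face boxes at least `min_fraction` of the frame height tall,
    dropping small background faces, as the article describes."""
    return [b for b in boxes if b[3] >= min_fraction * frame_height]

def focus_point(boxes):
    """Average the centres of the remaining faces (the 'average focus' mode)."""
    if not boxes:
        return None
    cx = sum(x + w / 2 for x, y, w, h in boxes) / len(boxes)
    cy = sum(y + h / 2 for x, y, w, h in boxes) / len(boxes)
    return (cx, cy)

# Frame is 1000 px tall; the second box (60 px tall) is background clutter.
candidates = [(100, 200, 180, 220), (700, 50, 50, 60)]
faces = filter_faces(candidates, frame_height=1000)
print(faces)              # [(100, 200, 180, 220)]
print(focus_point(faces))  # (190.0, 310.0)
```

Real cameras do this with dedicated hardware and far more robust detectors, but the filtering principle is the same.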

oleh July 26, 2011 at 1:52 am

ICBC is the British Columbia provincial crown corporation with a monopoly on the issuance of basic auto insurance in BC and the dominant insurer in the optional-coverage market. This week the Globe and Mail reported that ICBC offered the Vancouver police the use of its facial-recognition technology to assist in the investigation of the Vancouver Stanley Cup riots. The Vancouver police have not taken up this offer.

I am simply reporting on this development, not really knowing what to think, with feelings ranging from support, so that the hooligan rioters can be identified, to concern about setting a precedent that could be misused.

I am sure that there are benign and positive reasons to have this technology, including its original intended use of catching and verifying speeding drivers.

But I am uncertain, considering the potential for abuse. It’s difficult to un-invent something. Nuclear weapons come to mind.

. July 27, 2011 at 7:20 pm

SOCIAL NETWORKING Facebook face-recognition not available to Canadians

ANITA ELASH

Facebook’s controversial facial-recognition technology for photographs will not be available to Canadian users, the social-networking site says.

A spokesperson for Facebook did not say why Canadians would not get access to the feature but wrote in an e-mail: “Not all of our launches roll out globally and each of our different products and features have varying launch schedules, and we don’t have plans to roll this feature out in Canada at this time.” The feature, referred to by Facebook as a “tag suggestion,” has raised concerns about privacy since it was launched in the United States in December. It recognizes faces in new photos and automatically suggests names of friends in photos as they are uploaded. Facebook users who do not want their photos tagged are not asked their permission but must opt out of using the feature.

A coalition of digital rights groups in the U.S. has filed a complaint with the Federal Trade Commission alleging that users are not fully informed of the biometric data that is “secretly” being collected about them. A group of privacy watchdogs from European Union countries is also investigating possible privacy violations.

TIME TO LEAD / BIOMETRIC IDENTIFICATION It’s a reality: facial recognition and privacy

ANN CAVOUKIAN, Information and Privacy Commissioner of Ontario

One of the most common forms of biometric identification is when our face is compared with a stored facial image, such as a driver’s licence or passport photo. Facial-recognition technology automates this process.

First, a biometric “template,” or representation of you, is generated from measurements of your physiological traits (in this case, the image of your face), and retained in a database. Further samples from captured facial images may then be compared against this template – if there’s a match, then you’re identified.
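The enrol-then-match flow Cavoukian describes can be sketched like this. It is a deliberately crude illustration: the `embed` function below (a couple of aggregate pixel statistics) and the distance threshold are stand-ins for the learned feature extractors real systems use; only the template-database structure mirrors her description:

```python
# Hedged sketch of biometric template enrolment and matching.
import math

def embed(face_pixels):
    # Stand-in "template": mean and variance of the pixel values.
    # Real systems use learned, high-dimensional face embeddings.
    n = len(face_pixels)
    mean = sum(face_pixels) / n
    var = sum((p - mean) ** 2 for p in face_pixels) / n
    return (mean, var)

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

database = {}  # name -> stored template

def enrol(name, face_pixels):
    """Generate a template from a captured image and retain it."""
    database[name] = embed(face_pixels)

def identify(face_pixels, threshold=5.0):
    """Compare a new sample against all templates; report a match or None."""
    probe = embed(face_pixels)
    best_name, best_template = min(
        database.items(), key=lambda kv: distance(kv[1], probe)
    )
    return best_name if distance(best_template, probe) <= threshold else None

enrol("alice", [10, 12, 11, 13])
enrol("bob", [90, 95, 88, 92])
print(identify([11, 12, 12, 13]))  # alice (close to her template)
print(identify([50, 50, 50, 50]))  # None (no template within threshold)
```

The privacy concern follows directly from the structure: once the database exists, any captured image can be run through `identify` without the subject's knowledge.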

Imagine a scenario where you’re walking down the street or attending a sports event or shopping at a mall, and your photo is taken, identified, tagged and matched against a database of facial templates, without your knowledge or consent. This would be an affront to privacy that should not be tolerated.

Two key developments are making this scenario possible. First, sophisticated, high-resolution cameras in surveillance systems – and now conveniently embedded in our mobile devices – are allowing for the frequent capture of high-quality facial images “on the move.” Second, software is now available that is capable of indexing vast numbers of photos, allowing for the creation of biometric databases.

. August 2, 2011 at 12:28 am

As Internet giants Facebook Inc. and Google Inc. race to expand their facial-recognition abilities, new research shows how powerful, and potentially detrimental to privacy, these tools have become.

Armed with nothing but a snapshot, researchers at Carnegie Mellon University in Pittsburgh successfully identified about one-third of the people they tested, using a powerful facial-recognition technology recently acquired by Google.

. August 3, 2011 at 12:55 am

Mug-Shot Industry Will Dig Up Your Past, Charge You to Bury It Again

Philip Cabibi, a 31-year-old applications administrator in Utah, sat at his computer one recent Sunday evening and performed one of the compulsive rituals of the Internet Age: the ego search. He typed his name into Google to take a quick survey of how the internet sees him, like a glance in the mirror.

There were two LinkedIn hits, three White Pages listings, a post he made last year to a Meetup forum for Italian-Americans in the Salt Lake City area. Then, coming in 10th place — barely crawling onto the first page of search results — was a disturbing item.

“Philip Cabibi Mugshot,” read the title. The description was “Mug shot for Philip Cabibi booked into the Pinellas County jail.”

When he clicked through, Cabibi was greeted with his mug shot and booking information from his 2007 drunk-driving arrest in Florida. It’s an incident in Cabibi’s life that he isn’t proud of, and one that he didn’t expect to find prominently listed in his search results four years later, for all the world to see.

. August 3, 2011 at 6:53 pm

Developments in Facial Recognition

Eventually, it will work. You’ll be able to wear a camera that will automatically recognize someone walking towards you, and an earpiece that will relay who that person is and maybe something about them. None of the technologies required to make this work are hard; it’s just a matter of getting the error rate down low enough for it to be a useful system. And there have been a number of recent research results and news stories that illustrate what this new world might look like.

The police want this sort of system. I already blogged about MORIS, an iris-scanning technology that several police forces in the U.S. are using. The next step is the face-scanning glasses that the Brazilian police claim they will be wearing at the 2014 World Cup.

. August 10, 2011 at 5:22 pm

A bunch of vigilantes are organizing a Google Group dedicated to using recently revealed facial recognition tools to identify looters in the London riots. While Vancouver discussed doing something similar after the Stanley Cup riots, the city never actually moved forward on it. Ring of Steel London, though, is far more likely to incorporate FRT into its investigative work.

. August 11, 2011 at 10:33 am

“Although we think it’s generally a pretty nifty feature, valid concerns over the misuse of Facebook’s auto-recognition tagging have led Germany to ban it entirely. That’s right — Facebook in its current state is now illegal. The German government, which possesses perhaps the world’s most adamant privacy laws as a result of postwar abuse, considers Facebook’s facial recognition a violation of ‘the right to anonymity.'”

. September 11, 2011 at 6:19 pm

Face recognition
Anonymous no more

You can’t hide—from anybody

IF YOUR face and name are anywhere on the web, you may be recognised whenever you walk the streets—not just by cops but by any geek with a computer. That seems to be the conclusion from some new research on the limits of privacy.

For suspected miscreants, and people chasing them, face-recognition technology is old hat. Brazil, preparing for the soccer World Cup in 2014, is already trying out pairs of glasses with mini-cameras attached; policemen wearing them could snap images of faces, easy to compare with databases of criminals. More authoritarian states love such methods: photos are taken at checkpoints, and images checked against recent participants in protests.

But could such technology soon be used by anyone at all, to identify random passers-by and unearth personal details about them? A study which is to be unveiled on August 4th at Black Hat, a security conference in Las Vegas, suggests that day is close. Its authors, Alessandro Acquisti, Ralph Gross and Fred Stutzman, all at America’s Carnegie Mellon University, ran several experiments that show how three converging technologies are undermining privacy. One is face-recognition software itself, which has improved a lot. The researchers also used “cloud computing” services, which provide lots of cheap processing power. And they went to social networks like Facebook and LinkedIn, where most users post real names and photos of themselves.

. October 23, 2011 at 10:23 pm

Facial monitoring
The all-telling eye
Webcams can now spot which ads catch your gaze, read your mood and check your vital signs

IMAGINE browsing a website when a saucy ad for lingerie catches your eye. You don’t click on it, merely smile and go to another page. Yet it follows you, putting up more racy pictures, perhaps even the offer of a discount. Finally, irked by its persistence, you frown. “Sorry for taking up your time,” says the ad, and promptly desists from further pestering. Creepy. But making online ads that not only know you are looking at them but also respond to your emotions will soon be possible, thanks to the power of image-processing software and the ubiquity of tiny cameras in computers and mobile devices.

Uses for this technology would not, of course, be confined to advertising. There is ample scope to deploy it in areas like security, computer gaming, education and health care. But admen are among the first to embrace the idea in earnest. That is because it helps answer, at least online, clients’ perennial carp: that they know half the money they spend on advertising is wasted, but they don’t know which half.

dp October 23, 2011 at 11:25 pm

That’s it! I am sticking electrical tape over the lenses of all my webcams.

. November 12, 2011 at 6:33 pm

Face Recognition Makes the Leap From Sci-Fi

FACIAL recognition technology is a staple of sci-fi thrillers like “Minority Report.”

SceneTap, a new app for smart phones, uses cameras with facial detection software to scout bar scenes. Without identifying specific bar patrons, it posts information like the average age of a crowd and the ratio of men to women, helping bar-hoppers decide where to go. More than 50 bars in Chicago participate.

As SceneTap suggests, techniques like facial detection, which perceives human faces but does not identify specific individuals, and facial recognition, which does identify individuals, are poised to become the next big thing for personalized marketing and smart phones. That is great news for companies that want to tailor services to customers, and not so great news for people who cherish their privacy. The spread of such technology — essentially, the democratization of surveillance — may herald the end of anonymity.

And this technology is spreading. Immersive Labs, a company in Manhattan, has developed software for digital billboards using cameras to gauge the age range, sex and attention level of a passer-by. The smart signs, scheduled to roll out this month in Los Angeles, San Francisco and New York, deliver ads based on consumers’ demographics. In other words, the system is smart enough to display, say, a Gillette ad to a male passer-by rather than an ad for Tampax.

. November 22, 2011 at 9:29 pm

“I just noticed that not only are all Afghans going to have their biometric data (fingerprints and iris scans) recorded but the government plans to share it with the U.S. From the article: ‘Gathering the data does not stop at Afghanistan’s borders, however, since the military shares all of the biometrics it collects with the United States Department of Justice and the Department of Homeland Security through interconnected databases.’ Talk about ‘know thine enemy’ (or I guess, for now, friend). Does this foretell the near future when the U.S. govt. (and by extension, Chinese hackers) have the biometrics of almost everyone alive?”

. February 5, 2012 at 7:46 pm

A small magazine in Victoria, BC just uncovered a massive public traffic surveillance system deployed in Canada. Here’s a quote from the article: ‘Normally, area police manually key in plate numbers to check suspicious cars in the databases of the Canadian Police Information Center and ICBC. With [Automatic License Plate Recognition], for $27,000, a police cruiser is mounted with two cameras and software that can read license plates on both passing and stationary cars. According to the vendors, thousands of plates can be read hourly with 95-98 percent accuracy. … In August 2011, VicPD Information and Privacy Manager Debra Taylor called me to explain that, even though VicPD had the ALPR system in one of their cruisers, the [Royal Canadian Mounted Police] ran the system, and I should contact them for any information. “We actually don’t have a program,” Taylor said. “We don’t have any documents per se.” … A month later, Taylor handed over 600 pages. … [The claim they kept no documents] was apparently only in reference to digital information. VicPD had kept 500 pages of written, hard-copy logs of every ALPR hit they’d ever seen.’

. August 24, 2012 at 3:13 pm

WELCOME to China, the land of video surveillance. Guangdong province boasts over 1m cameras. In 2010 the city of Chongqing, governed by the now-disgraced Bo Xilai, ordered 500,000. Other provinces have hundreds of thousands, according to Human Rights in China, an NGO. Video surveillance constitutes over half the country’s huge security industry, and is expected to reach 500 billion yuan ($79 billion) in 2015. China will soon overtake Britain, with around 3m cameras, as the capital of video surveillance.

Yet China is far from alone. In many democracies surveillance cameras are multiplying, too. And face-recognition technology is proving a wonder tool for both governments and marketers.

A jail in Alabama uses it to check those leaving against prisoner records. Mexican prisons use it to identify visitors. Heathrow airport is installing systems to track passengers through lounges and onto the plane. Brazil has plans to equip police with camera-spectacles that can identify troublemakers at the 2014 World Cup.

Privacy-loving European countries are less easy-going, and usually require cameras to be matched with signs to tell people they are being watched. Facebook’s face recognition has already fallen foul of tough German privacy laws. And America’s Supreme Court is uneasy with technology that enables the persistent tracking of individuals in public.

. August 26, 2012 at 4:11 pm

FBI To Give Facial Recognition Software to Law-Enforcement Agencies

The speedy onward march of biometric technology continues. After recently announcing plans for a nationwide iris-scan database, the FBI has revealed it will soon hand out free facial-recognition software to law enforcement agencies across the United States.

The software, which was piloted in February in Michigan, is expected to be rolled out within weeks. It will give police analysts access to a so-called “Universal Face Workstation” that can be used to compare a database of almost 13 million images. The UFW will permit police to submit and enhance image files so they can be cross-referenced with others on the database for matches.

. September 7, 2012 at 5:09 pm

FBI Launches $1 Billion Nationwide Face Recognition System

The U.S. Federal Bureau of Investigation has begun rolling out its new $1 billion biometric Next Generation Identification (NGI) system. In essence, NGI is a nationwide database of mugshots, iris scans, DNA records, voice samples, and other biometrics that will help the FBI identify and catch criminals — but it is how this biometric data is captured, through a nationwide network of cameras and photo databases, that is raising the eyebrows of privacy advocates.

Until now, the FBI relied on IAFIS, a national fingerprint database that has long been due an overhaul. Over the last few months, the FBI has been pilot testing a face recognition system, which will soon be scaled up (PDF) until it’s nationwide. In theory, this should result in much faster positive identifications of criminals and fewer unsolved cases.

The problem is, the FBI hasn’t guaranteed that the NGI will only use photos of known criminals. There may come a time when the NGI is filled with as many photos as possible, from as many sources as possible, of as many people as possible — criminal or otherwise. Imagine if the NGI had full access to every driving license and passport photo in the country — and DNA records kept by doctors, and iris scans kept by businesses. The FBI’s NGI, if the right checks and balances aren’t in place, could very easily become a tool that decimates civilian privacy and freedom.

. September 15, 2012 at 10:17 pm

Congress report warns: drones will track faces from the sky

With the FAA working on rules to integrate drones into airspace safety by 2015, the US government’s Congressional Research Service has warned of gaps in how American courts might treat the use of drones.

The snappily-headlined report, Drones in Domestic Surveillance Operations: Fourth Amendment Implications and Legislative Responses (PDF here), notes drones now in use can carry thermal imaging, high-powered cameras, license plate readers and LIDAR (light detection and ranging). “Soft” biometrics and facial recognition won’t be far behind, the report suggests, allowing drones to “recognize and track individuals based on attributes such as height, age, gender, and skin color.”

“The relative sophistication of drones contrasted with traditional surveillance technology may influence a court’s decision whether domestic drone use is lawful under the Fourth Amendment,” the report compiled by legislative attorney Richard Thompson II states.

. September 22, 2012 at 8:17 pm

Facebook Disables Face Recognition In EU

“Facebook has disabled face recognition features on its site for all new European users. The move follows privacy recommendations made by the Irish Data Protection Commissioner. Tag Suggest information has been turned off for new users, and Facebook plans to delete the information for existing EU users by October 15th. ‘The DPC says today’s report (PDF) is the result of evaluations it made through the first half of 2012 and on-site at Facebook’s HQ in Dublin over the course of two days in May and four in July. The DPC says FB has made just about all of the improvements it requested in five key areas: better transparency for the user in how their data is handled; user control over settings; more clarity on the retention periods for the deletion of personal data, and users getting more control over deleting things; an improvement in how users can access their personal data; and the ability of Facebook to be able to better track how they are complying with data protection requirements.'”

. December 8, 2013 at 8:22 pm

The bigger worry is for those in front of the cameras, not behind them. School bullies already use illicit snaps from mobile phones to embarrass their victims. The web throngs with furtive photos of women, snapped in public places. Wearable cameras will make such surreptitious photography easier. And the huge, looming issue is the growing sophistication of face-recognition technologies, which are starting to enable businesses and governments to extract information about individuals by scouring the billions of images online. The combination of cameras everywhere—in bars, on streets, in offices, on people’s heads—with the algorithms run by social networks and other service providers that process stored and published images is a powerful and alarming one. We may not be far from a world in which your movements could be tracked all the time, where a stranger walking down the street can immediately identify exactly who you are.

For the moment, companies are treading carefully. Google has banned the use of face-recognition in apps on Glass and its camera is designed to film only in short bursts. Japanese digital camera-makers ensure their products emit a shutter sound every time a picture is taken. Existing laws to control stalking or harassment can be extended to deal with peeping drones.

. March 30, 2014 at 4:38 pm

Facebook Creates Software That Matches Faces Almost as Well as You Do

Facebook’s new AI research group reports a major improvement in face-processing software

. June 1, 2014 at 12:08 pm

A newly released set of slides from the Snowden leaks reveals that the NSA is harvesting millions of facial images from the Web for use in facial recognition algorithms through a program called “Identity Intelligence.” James Risen and Laura Poitras’s NYT piece shows that the NSA is linking these facial images with other biometrics, identity data, and “behavioral” data including “travel, financial, behaviors, social network.”

The NSA’s goal — in which it has been moderately successful — is to match images from disparate databases, including databases of intercepted videoconferences (in February 2014, another Snowden publication revealed that NSA partner GCHQ had intercepted millions of Yahoo video chat stills), images captured by airports of fliers, and hacked national identity card databases from other countries. According to the article, the NSA is trying to hack the national ID card databases of “Pakistan, Saudi Arabia and Iran.”

. September 13, 2014 at 4:50 pm

Clocking people’s clocks
Facial-recognition systems are getting better

The two main techniques used to recognise faces electronically are principal-component analysis (PCA) and linear-discriminant analysis (LDA). Both compare a picture of someone’s phizog with a reference image taken in a controlled environment. Passport photos and mugshots, then, are about as ideal as it gets.

Governments are not the only ones interested. Earlier this year, Facebook’s DeepFace system was asked whether thousands of pairs of photos were of the same person. It answered correctly 97.25% of the time, a shade behind humans at 97.53%. Although DeepFace is only a research project, and is aided by the fact that many Facebook photos are tagged with the names of people in the images, which lets the system learn those faces in different poses and lighting conditions, it is still an impressive feat.
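The principal-component approach mentioned above (the "eigenfaces" family of methods) can be sketched with numpy alone. Everything here is a toy assumption: the "faces" are random 16-element vectors rather than images, and the dimensions and component count are arbitrary; only the project-then-nearest-neighbour structure reflects how PCA-based recognition actually works:

```python
# Toy eigenfaces-style sketch of PCA-based face matching.
import numpy as np

rng = np.random.default_rng(0)
# Ten enrolled "faces", each a 16-pixel vector (stand-ins for real images).
gallery = rng.normal(size=(10, 16))

mean = gallery.mean(axis=0)
centered = gallery - mean
# Principal components ("eigenfaces") via SVD of the centred gallery.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:4]  # keep the top 4 components

def project(face):
    """Represent a face by its coordinates in eigenface space."""
    return components @ (face - mean)

gallery_proj = centered @ components.T  # projections of all enrolled faces

def match(probe):
    """Index of the enrolled face nearest the probe in eigenface space."""
    distances = np.linalg.norm(gallery_proj - project(probe), axis=1)
    return int(np.argmin(distances))

# A slightly noisy version of face 3 should still match face 3.
probe = gallery[3] + rng.normal(scale=0.05, size=16)
print(match(probe))
```

The article's point about reference images follows from this structure: the cleaner and more controlled the enrolled photos (passport photos, mugshots), the tighter the clusters in the projected space and the more reliable the nearest-neighbour match.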

. November 9, 2014 at 10:15 am

Judge Says Public Has a Right To Know About FBI’s Facial Recognition Database

“There can be little dispute that the general public has a genuine, tangible interest in a system designed to store and manipulate significant quantities of its own biometric data, particularly given the great numbers of people from whom such data will be gathered,” Chutkan wrote in an opinion.

. June 16, 2016 at 2:38 pm

FBI Can Access Hundreds of Millions of Face Recognition Photos

The federal Government Accountability Office published a report on the FBI’s face recognition capabilities that says the FBI has access to hundreds of millions of photos. According to the GAO report, the FBI’s Facial Analysis, Comparison, and Evaluation (FACE) Services unit not only has access to the FBI’s Next Generation Identification (NGI) face recognition database of nearly 30 million civil and criminal mug shot photos, but it also has access to the State Department’s Visa and Passport databases, the Defense Department’s biometric database, and the drivers license databases of at least 16 states. This totals 411.9 million images, most of which are Americans and foreigners who have committed no crimes.

. July 29, 2016 at 8:29 pm

In Russia, meanwhile, there has been a recent outcry over an app called FindFace, which lets users take photos of strangers and then determines their identity from profile pictures on social networks. The app’s creators say it is merely a way to make contact with people glimpsed on the street or in a bar. Russian police have started using it to identify suspects and witnesses. The risk is clear: the end of public anonymity. Gigapixel images of a large crowd, taken from hundreds of metres away, can be analysed to find out who went on a march or protest, even years later. In effect, deep learning has made it impossible to attend a public gathering without leaving a record, unless you are prepared to wear a mask. (A Japanese firm has just started selling Privacy Visor, a funny-looking set of goggles designed to thwart facial-recognition systems.)

. October 19, 2016 at 11:08 pm

Half of all U.S. adults are in face-recognition databases, and Black people more likely to be targeted

The 150-page report [PDF Link] released by Georgetown University’s Center for Privacy and Technology on Tuesday shows that the faces of about half of all adults in the United States are stored in face-recognition databases that federal, state, and local authorities can search.

. November 2, 2016 at 4:01 pm

In Accessorize to a Crime: Real and Stealthy Attacks on State-of-the-Art Face Recognition, researchers from Carnegie-Mellon and UNC showed how they could fool industrial-strength facial recognition systems (including Alibaba’s “smile to pay” transaction system) by printing wide, flat glasses frames with elements of other peoples’ faces with “up to 100% success.”

As the software learns what a face looks like, it leans heavily on certain details—like the shape of the nose and eyebrows. The Carnegie Mellon glasses don’t just cover those facial features, but instead are printed with a pattern that is perceived by the computer as facial details of another person.

In a test where researchers built a state-of-the-art facial recognition system, a white male test subject wearing the glasses appeared as actress Milla Jovovich with 87.87% accuracy. An Asian female wearing the glasses tricked the algorithm into seeing a Middle Eastern man with the same accuracy. Other notable figures whose faces were stolen include Carson Daly, Colin Powell, and John Malkovich. Researchers used about 40 images of each person to generate the glasses used to identify as them.

. September 16, 2017 at 4:51 pm

What machines can tell from your face

Life in the age of facial recognition

Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. This year Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone is expected to use it to unlock the homescreen.

Set against human skills, such applications might seem incremental. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.

Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. China’s government keeps a record of its citizens’ faces; photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.

Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans.

Advances in AI are used to spot signs of sexuality

Machines that read faces are coming

When shown one photo each of a gay and straight man, both chosen at random, the model distinguished between them correctly 81% of the time. When shown five photos of each man, it attributed sexuality correctly 91% of the time. The model performed worse with women, telling gay and straight apart with 71% accuracy after looking at one photo, and 83% accuracy after five. In both cases the level of performance far outstrips human ability to make this distinction. Using the same images, people could tell gay from straight 61% of the time for men, and 54% of the time for women. This aligns with research which suggests humans can determine sexuality from faces at only just better than chance.

Researchers produce images of people’s faces from their genomes

Facial technology makes another advance

Human Longevity has assembled 45,000 genomes, mostly from patients who have been in clinical trials, and data on their associated physical attributes. The company uses machine-learning tools to analyse these data and then make predictions about how genetic sequences are tied to physical features. These efforts have improved to the point where the company is able to generate photo-like pictures of people without ever clapping eyes on them.

. September 16, 2017 at 5:00 pm

What’s more, the smartphone will do for face recognition what smart speakers, such as the Amazon Echo, have done for speech recognition: make it acceptable to consumers. Millions of Chinese already “swipe” their faces on smartphones to authorise payments. On September 12th Apple is expected to unveil a new version of its iPhone, with technology that can reliably identify the owner’s face and then unlock the device, even in the dark. That will come only a few weeks after Samsung presented a new Galaxy Note with a similar but less sophisticated feature.

It makes sense to separate facial-recognition technology into two categories: the underlying capability and the applications that make use of it. Megvii’s Face++ belongs in the first category, as do similar offerings from SenseTime, another Chinese startup, NTechLab, a Russian firm, as well as Amazon, IBM and Microsoft. All provide face recognition as a cloud-computing service. Megvii’s customers can upload a batch of photos and names, and use them to train algorithms, which then can recognise those particular people. Firms can also integrate the recognition service into their own offerings, for instance to control access to online accounts.
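The enrol-then-recognise workflow these services expose can be sketched in a few lines. This is a hypothetical in-memory stand-in, not any vendor's actual API: real services compute face embeddings with a deep network, which are simulated here with random vectors, and the class and parameter names are invented for illustration.

```python
import numpy as np

class FaceIndex:
    """Toy face-recognition index: enrol named embeddings, then
    identify new ones by cosine similarity against the enrolled set."""

    def __init__(self, threshold=0.8):
        self.names, self.vecs = [], []
        self.threshold = threshold  # minimum similarity to declare a match

    def enrol(self, name, embedding):
        # Store a unit-normalised copy so dot products are cosine similarities.
        self.names.append(name)
        self.vecs.append(embedding / np.linalg.norm(embedding))

    def identify(self, embedding):
        v = embedding / np.linalg.norm(embedding)
        sims = [float(v @ u) for u in self.vecs]
        best = int(np.argmax(sims))
        # Below the threshold, report "unknown" rather than the nearest name.
        return self.names[best] if sims[best] >= self.threshold else None

rng = np.random.default_rng(1)
alice, bob = rng.normal(size=128), rng.normal(size=128)

index = FaceIndex()
index.enrol("alice", alice)
index.enrol("bob", bob)

# A slightly perturbed "photo" of alice still matches her.
print(index.identify(alice + 0.1 * rng.normal(size=128)))
```

The threshold is what separates "this is Alice" from "unknown person"; in deployed systems its setting is exactly where the trade-off between false matches and missed matches (and hence wrongful identifications) is made.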

Megvii’s and SenseTime’s services are largely founded on good data. They have access to the Chinese government’s image database of 700m citizens, who are each given a photo ID by the age of 16. Chinese government agencies are also valuable customers—more and more of the country’s hundreds of millions of surveillance cameras will soon recognise faces. In Shenzhen facial recognition is used to identify jaywalkers; names and pictures go up on a screen. In Beijing the municipality has started using the technology to catch thieves of toilet paper in public restrooms (its system also prevents people from taking more than 60 centimetres of paper within a nine-minute period).

. October 3, 2017 at 3:20 pm

Airport Face Scans Could Be a Dry Run for a National Surveillance System

Sixteen years after Sept. 11, we’re all well-aware that in order to go through security at an American airport, we will have to use a driver’s license or passport to identify ourselves. But Congress is quietly laying groundwork to take things much, much further—to build out a face scanning system that identifies you when you walk into an airport and tracks your every move, until you board the plane.

The TSA Modernization Act is scheduled to be marked up in the Senate Commerce Committee this week. From the title it sounds like a harmless bill to get the airport security agency new computers. But an alarming section in the bill would give the Trump administration a green light to begin using biometrics to identify people in airports nationwide. And right now there’s only one technology fit for a biometric surveillance system: automated, real-time face scans.

This could mean scans of every person’s face throughout the entirety of every American airport—“at checkpoints, screening lanes, bag drop and boarding areas, and other areas.” Basically, it could go anywhere the Department of Homeland Security determines that the technology could improve security.

. August 4, 2018 at 1:48 am
. March 13, 2019 at 10:50 pm

US Government Will Be Scanning Your Face At 20 Top Airports, Documents Show

These same documents state — explicitly — that there were no limits on how partnering airlines can use this facial recognition data. CBP did not answer specific questions about whether there are any guidelines for how other technology companies involved in processing the data can potentially also use it. It was only during a data privacy meeting last December that CBP made a sharp turn and limited participating companies from using this data. But it is unclear to what extent it has enforced this new rule. CBP did not explain what its current policies around data sharing of biometric information with participating companies and third-party firms are, but it did say that the agency “retains photos … for up to 14 days” of non-US citizens departing the country, for “evaluation of the technology” and “assurance of the accuracy of the algorithms” — which implies such photos might be used for further training of its facial matching AI.

. September 3, 2019 at 2:12 pm

An even subtler idea was proposed by researchers at the Chinese University of Hong Kong, Indiana University Bloomington, and Alibaba, a big Chinese information-technology firm, in a paper published in 2018. It is a baseball cap fitted with tiny light-emitting diodes that project infra-red dots onto the wearer’s face. Many of the cameras used in face-recognition systems are sensitive to parts of the infra-red spectrum. Since human eyes are not, infra-red light is ideal for covert trickery.

In tests against FaceNet, a face-recognition system developed by Google, the researchers found that the right amount of infra-red illumination could reliably prevent a computer from recognising that it was looking at a face at all. More sophisticated attacks were possible, too. By searching for faces which were mathematically similar to that of one of their colleagues, and applying fine control to the diodes, the researchers persuaded FaceNet, on 70% of attempts, that the colleague in question was actually someone else entirely.

Training one algorithm to fool another is known as adversarial machine learning. It is a productive approach, creating images that are misleading to a computer’s vision while looking meaningless to a human being’s. One paper, published in 2016 by researchers from Carnegie Mellon University, in Pittsburgh, and the University of North Carolina, showed how innocuous-looking abstract patterns, printed on paper and stuck onto the frame of a pair of glasses, could often convince a computer-vision system that a male AI researcher was in fact Milla Jovovich, an American actress.
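The basic trick behind such attacks, the fast gradient sign method, fits in a few lines. Below is a toy sketch in which a linear scoring function stands in for a trained face-recognition network (the real attacks differentiate through a deep network; everything here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=100)   # weights of a toy linear "classifier"
x = rng.normal(size=100)   # an input "image", flattened to a vector

def score(x):
    # Higher score means the model is more convinced of a match.
    return float(w @ x)

# Fast gradient sign method: nudge every pixel a tiny amount (at most
# eps) in the direction that increases the score. For this linear
# model the gradient of the score with respect to x is simply w.
eps = 0.05
x_adv = x + eps * np.sign(w)

# The perturbation is imperceptibly small per pixel, yet the score
# moves by eps times the sum of |w|, which can be large.
print(score(x), score(x_adv))
```

Because every pixel contributes a little, a perturbation invisible to a person can shift the model's output by a lot, which is why printed glasses-frame patterns and infra-red dots can push a face across a decision boundary.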

. October 5, 2019 at 6:29 pm
. October 6, 2019 at 6:31 pm
. December 24, 2019 at 3:48 pm

Guo Bing, a legal academic in the eastern city of Hangzhou, likes to spend his leisure time at a local safari park. But when the park informed season-pass holders like him that admission would require a face-scan, Mr Guo objected. Late last month he filed a lawsuit, claiming the new rules violated his privacy. Facial-recognition technology is widely used in China. Doubtless to the relief of the government, which makes extensive use of it, there has been little public debate about it. State media, however, seized on Mr Guo’s case, trumpeting it as the first of its kind to be lodged in a Chinese court. Netizens have been hailing Mr Guo as a champion of consumer rights. A thread about his suit has garnered 100m views on Weibo, a social-media platform.

It is surprising that it has taken so long for the judiciary to get involved. Some 300 tourist sites in China use facial recognition to admit visitors. The safari park says doing so can shorten queues. Many office workers in Beijing’s main financial district clock in and out of work by scanning their faces. Some campuses and residential buildings use facial-recognition cameras to screen people entering. WeChat, a messaging and digital-wallet app, allows users to pay with their faces at camera-equipped vendors. Facial-recognition systems are ubiquitous at traffic intersections, in railway stations and airports (visitors to a public-security expo are pictured being scanned).

. January 17, 2020 at 4:31 pm
. January 28, 2020 at 12:05 pm
. February 1, 2020 at 7:24 pm
. February 14, 2020 at 7:54 pm
. February 21, 2020 at 6:39 pm
. March 12, 2020 at 12:23 pm
. July 1, 2020 at 5:40 pm
. July 1, 2020 at 5:48 pm

Airports will also emphasise hygiene. “I think the move to minimising contact during any travel experience will just push us over the edge to having a contactless journey,” says John Holland-Kaye, Heathrow’s chief. “Once you get into the terminal, you’ll scan your passport, have an image of your face taken, drop your bags,” and then stroll through checkpoints as cameras use facial recognition to open gates.

Some of this may sound far-fetched, but citizens of some three dozen countries can already use e-gates to get through passport control on arrival at Heathrow and many other airports, allowing them to go from gate to kerb without talking to another person. Security will still involve slowing down, but even there it should soon be possible to leave laptops and liquids inside the bag. Automation will reduce the need to touch trays. Hand-sanitiser is already everywhere. Once implemented, such changes are unlikely to be undone.

. July 10, 2020 at 9:36 pm
. September 16, 2020 at 6:23 pm

Portland passes broadest facial recognition ban in the US – CNN

. September 16, 2020 at 6:57 pm

Portland Passes Groundbreaking Ban on Facial Recognition in Stores, Banks, Restaurants and More – Slashdot

. November 9, 2020 at 7:18 pm

Toronto Eaton Centre owner found guilty of hiding facial recognition cameras in kiosks

. November 9, 2020 at 7:20 pm

Cadillac Fairview covertly collected images of millions of shoppers: Privacy commissioner | Calgary Herald

. November 22, 2020 at 4:23 pm

Face recognition isn’t just for humans — it’s learning to identify bears and cows, too – CNN

. November 22, 2020 at 4:31 pm

Wrongful arrest exposes racial bias in facial recognition technology – CBS News

. January 4, 2021 at 11:04 pm
. February 1, 2021 at 6:50 pm

New Site Extracts and Posts Every Face from Parler’s Capitol Hill Insurrection Videos – Slashdot

. February 1, 2021 at 6:50 pm
. February 1, 2021 at 7:45 pm
. March 10, 2021 at 11:31 am

Hacked Surveillance Camera Firm Shows Staggering Scale of Facial Recognition

A hacked customer list shows that facial recognition company Verkada is deployed in tens of thousands of schools, bars, stores, jails, and other businesses around the country.

. March 16, 2021 at 9:15 pm

Tech Companies Are Limiting Use Of Facial Recognition By Law Enforcement : Short Wave : NPR

. April 28, 2021 at 9:37 pm

Legal Chatbot Firm DoNotPay Adds Anti-Facial Recognition Filters To Its Suite of Handy Tools – Slashdot
