Researchers have paired a neural network with a facial-detection system originally built for a “dog hipsterisation” app to help manage and protect grizzly bears
Facial recognition is problematic for humans. When it works, it invades privacy and eases us into a surveillance state. When it doesn’t work, innocent people get falsely arrested by police. But that’s people. For bears, it’s all good – and facial recognition is now being used to help research, monitor and protect the animals using a neural network-based system called BearID.
Melanie Clapham tracks grizzlies. Normally, that means methodically examining photographs or physically tagging the animals, as the University of Victoria researcher’s work on grizzly behaviour depends on being able to pinpoint a specific individual.
But that’s not easy because bears have few distinctive markings – they’re all brown and fluffy – and can dramatically change appearance from one season to the next. “They moult their coats in the summertime,” says Clapham. “And in the autumn, before they go into hibernation, they can put on a third of their body weight.”
If individual bears are watched closely, it’s possible to track them through such changes. However, that becomes more difficult when you’re monitoring many bears over a wide area or not seeing the same bear frequently enough. “If you’re not observing them constantly, it can be difficult to pick out the same bears even between spring and fall,” she says.
Clapham knew there had to be a solution, and realised that automation and machine learning could be part of it. To find out, she joined a group called Wildlabs.
Based in Cambridge, Wildlabs is a network of 4,305 members including field conservationists, researchers and technology experts. “Academics often have really specific research questions they’re answering, and a high level of expertise and time to build their own stuff,” explained Stephanie O’Donnell, community manager at Wildlabs.
Wildlabs was founded by NGOs, including the WWF, and tech groups such as Google.org and Arm in the hopes of bridging the gap between the two worlds. “Conservationists who work in the field find it really hard to find other conservationists to talk about technology,” O’Donnell says. “Field conservationists are trying to manage big protected areas or monitor a whole ecosystem while dealing with challenges around human-wildlife conflict or climate change. They need technology to do different things.”
The BearID project was a bit of both, combining practical management with academic research. O’Donnell recalls Clapham getting in touch just two hours after she had heard from Ed Miller and Mary Nguyen, a pair of Silicon Valley software developers who were working on a similar idea as part of a bear-watching project called Explore.org, which has webcams watching grizzlies at Brooks Falls in Katmai National Park, Alaska. “I was like, ‘you guys must be working together already’,” says O’Donnell. But they weren’t, so she helped link them up. “That interaction is exactly what Wildlabs was created to do.”
Brought together by Wildlabs, the two projects combined their resources and their datasets to set up BearID, and, over several years, developed software that would analyse images to learn how to recognise one grizzly from another.
The system itself is designed in two sections. First, there’s a facial-detection tool, which looks at the image, recognises a bear’s face, and makes measurements between key aspects, such as eyes and the tip of the nose. Rather than design such a program from scratch, the team used a pre-made system from Dlib, a library of machine-learning algorithms and tools, that was designed to recognise dog faces in order to give them hipster glasses and moustaches — yes, you read that correctly. Silly apps like the “dog hipsteriser” aren’t necessarily a waste of time in the right hands.
“We used elements of that network to help us detect bear faces initially, which gave us a bit of a kickstart, though we did go back and retrain it on bear faces,” Clapham says. That made the job of labelling bears’ faces easier, and it let the team add hipster glasses and moustaches to bears, which is as delightful as it sounds.
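The detection stage’s “measurements between key aspects, such as eyes and the tip of the nose” can be sketched as simple distances between landmark points. The landmark names and pixel coordinates below are invented for illustration; a real detector such as the one the team adapted returns many more points per face:

```python
import math

# Hypothetical landmark coordinates (in pixels) for one detected bear face.
# These names and values are illustrative only.
landmarks = {
    "left_eye": (120.0, 95.0),
    "right_eye": (180.0, 97.0),
    "nose_tip": (150.0, 160.0),
}

def distance(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Inter-landmark measurements like these can be used to rotate and scale
# (align) the face crop before it is passed to the recognition network.
eye_span = distance(landmarks["left_eye"], landmarks["right_eye"])
eye_to_nose = distance(landmarks["left_eye"], landmarks["nose_tip"])

print(round(eye_span, 1), round(eye_to_nose, 1))  # 60.0 71.6
```

Aligning every face crop the same way means the recognition network sees eyes and nose in roughly the same place in every image, which makes its job easier.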
The other half of the system is facial recognition. That began with labelling the photos, marking each bear, in order to use that data to train a machine-learning system. Human facial-recognition systems are taught on millions of images, but there simply aren’t enough bears for such detailed training.
The network is shown a selection of images that have been correctly labelled to learn how to tell bears apart. “We don’t tell the network what to look for in a bear’s face, we just present the labelled data,” says Clapham. “Over time, the network learns what is stable about that bear’s face and uses that to distinguish between different individuals.”
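The idea of learning “what is stable about that bear’s face” can be sketched as a toy nearest-neighbour lookup: each face is turned into a vector, and a new image is labelled with the identity of the closest known vector. The bear names and three-number vectors below are invented; a system like BearID compares learned 128-dimensional embeddings rather than hand-written triples:

```python
import math

# Toy "embeddings" standing in for the face vectors a trained network
# produces. Bear names and values are invented for illustration.
gallery = {
    "bear_A": [0.9, 0.1, 0.2],
    "bear_B": [0.1, 0.8, 0.3],
    "bear_C": [0.2, 0.2, 0.9],
}

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def identify(query):
    """Label a new face with the identity of the nearest known bear."""
    return min(gallery, key=lambda name: euclidean(gallery[name], query))

print(identify([0.85, 0.15, 0.25]))  # closest to bear_A
```

Training consists of adjusting the network so that images of the same bear land close together in this vector space while different bears land far apart.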
The first version of the system was trained on 5,000 images of 132 bears, half from Alaska and half from further south in the US and Canada. The dataset was split in two: 80% of the images were used for training and 20% to test the system’s accuracy. When a bear’s face is detected, a deep convolutional neural network reduces the image to 128 measurements, or dimensions.
But even at this early stage, the system works. Across the two sites, at Knight Inlet in British Columbia and Brooks River in Alaska, the BearID system could identify animals with 84% accuracy. While we don’t know exactly what the deep-learning network is doing to differentiate bears, it intriguingly struggled on the same animals that Clapham had difficulty telling apart. In the future, additional data such as geography and animal size may also be considered by the system to improve its accuracy.
So far, the system has only been used on existing photos of bears, those that have been taken with handheld cameras. The next stage of the project is to use the automated system to identify and label bears spotted via remote camera traps, which may require more training for the network as the perspective and angles differ. “That’s where this technology will become very useful for research and monitoring of grizzly bears, and hopefully other species too,” explains Clapham.
Another future goal is having the system track newly spotted individuals. At the moment, the network needs to be trained on each and every bear, with new animals added manually. “We really need to be able to distinguish between individuals that it’s already seen and those that it hasn’t seen yet,” she says. “That will probably take some retraining. This is really the first step.”
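Distinguishing bears the system has already seen from ones it hasn’t is often handled with a distance threshold: match the query to the nearest known individual only if it is close enough, otherwise flag it as a candidate new animal. The threshold and vectors below are invented, and this is one common approach rather than BearID’s confirmed method:

```python
import math

# Open-set recognition sketch: nearest known bear wins only if the match
# is within a distance threshold. Values here are invented for illustration.
known = {
    "bear_A": [0.9, 0.1],
    "bear_B": [0.1, 0.9],
}
THRESHOLD = 0.3

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def identify_open_set(query):
    name = min(known, key=lambda n: euclidean(known[n], query))
    if euclidean(known[name], query) <= THRESHOLD:
        return name
    return "unknown"  # candidate new bear, to be labelled and added later

print(identify_open_set([0.85, 0.15]))  # close to bear_A
print(identify_open_set([0.5, 0.5]))    # far from every known bear
```

Anything labelled “unknown” could then be queued for a human to confirm and add to the gallery, which matches the manual-addition workflow Clapham describes.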
This automated bear-spotting system could change research around the furry beasts. Once it works with camera traps, it can help count bears for population measuring. “To be able to do that, you need to be able to recognise individuals, because you capture and recapture the same bears over time,” Clapham says. Such data is important for managing species that are facing a decline, and is currently gathered using genetic tagging, which is much more time-consuming and difficult.
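The capture-recapture counting Clapham describes has a classic formula behind it, the Lincoln-Petersen estimator (a standard method, though the article doesn’t name it): if n1 individuals are identified in a first survey, n2 in a second, and m of those appear in both, the population is estimated as n1 × n2 / m. The numbers below are invented:

```python
# Lincoln-Petersen capture-recapture estimator, a standard population
# estimate (not named in the article). Survey counts below are invented.
def lincoln_petersen(n1, n2, m):
    """Estimate population size from two surveys with m re-identified bears."""
    if m == 0:
        raise ValueError("need at least one recaptured individual")
    return n1 * n2 / m

# e.g. 40 bears seen in the first survey, 35 in the second, 14 seen in both
estimate = lincoln_petersen(n1=40, n2=35, m=14)
print(round(estimate))  # 100
```

This is exactly why re-identifying individuals matters: the estimate hinges on knowing how many bears were seen in both surveys, which is what BearID automates.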
Clapham is most excited about using the automated system for studying their behaviour, which is the focus of her own research. BearID can help track individual bears across landscapes, helping to study their movement in response to food availability and human development. “We’re interested to see how this could be used to reduce conflicts between people and bears,” she says. And, she explains, it can hopefully be extended to other types of bears, letting researchers and conservationists track any species.
In the end, facial recognition may be problematic for people, but, with ideas such as BearID, it could prove a lifeline for grizzly bears.
Read more: www.itpro.co.uk