Counter Terror Business - Surveillance & Biometrics /features/surveillance-biometrics en How facial recognition tech could change the police force /features/how-facial-recognition-tech-could-change-police-force <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/face.jpg?itok=xNk9yLTD" width="696" height="392" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p><strong>Facial recognition technology at its most basic level has existed since the 1960s, when an American team experimented to see if a computer could use a rudimentary scanner to map a person’s hairline, eyes, and nose.</strong></p> <p>Today, the technology has advanced massively and is used in a variety of public spaces to improve security across the UK.</p> <p>Back in April, the government announced a £55 million investment set to expand facial recognition technology, including mobile units that can be deployed on high streets to identify individuals wanted by the police, as part of a crackdown on retail crime.</p> <p>The investment will be made over the next four years, while the £4 million for mobile units will be spent over the next year.</p> <p>Prime Minister Rishi Sunak said the increase in funding was “sending a message” to criminals.</p> <p>“Whether they are from serious organised criminal gangs, repeat offenders or opportunistic thieves – who think they can get away with stealing from these local businesses or abusing shopworkers, enough is enough,” he said.</p> <p><strong>The story so far</strong></p> <p>Facial recognition technology is currently used by only two police forces in the UK: the Met and South Wales Police.</p> <p>They are testing the innovation with the help of the National Physical Laboratory (NPL), which provides cutting-edge measurement in science, engineering and 
technology.</p> <p>The NPL test plan was specifically designed to help identify any impact this technology may have on any protected characteristics, in particular race, age and sex.</p> <p>Last year, the NPL published a report on the two forces’ use of facial recognition technology, which it said gave a better understanding of the settings at which the algorithm can be operated with no statistically significant difference in performance across demographic groups.</p> <p><strong>The Met and South Wales Police</strong></p> <p>The Met has published examples of how the technology has been helping to reduce crime. For example, on 9 April, 12 arrests were made with the assistance of Live Facial Recognition (LFR) technology.</p> <p>Officers arrested a man who breached his sexual harm prevention order following an alert at a previous deployment in Clapham.</p> <p>An investigation following this alert found the man was sending explicit images to children. He has been charged with two counts of sexual communication with a child and has been remanded in custody.</p> <p>Lindsey Chiswick, the Met’s director of intelligence, said they are “guided by data” as part of ‘A New Met for London’.</p> <p>She said: “The data for our deployments is available to the public and shows the technology is outperforming what an independent study predicted.”</p> <p>Chiswick added that reports into law enforcement use of this technology found the public are “mostly supportive.”</p> <p>She commented: “It’s vital we bring communities in London with us so we are continuing work with independent advisory groups and invite them to deployments.”</p> <p>The forces say the use of facial recognition technology helps free up police time and apprehend criminals. There are three main types of facial recognition technology. 
The aforementioned Live Facial Recognition tech compares a live camera feed of faces against a predetermined watchlist to find a possible match that generates an alert.</p> <p>Retrospective Facial Recognition (RFR) is used after an event, and compares still images of faces of unknown subjects against a reference image database in order to identify them.</p> <p>Finally, Operator Initiated Facial Recognition (OIFR) is a mobile phone application of facial recognition, which compares a photograph of a person’s face taken on a mobile phone against the predetermined watchlist to help an officer identify a subject.</p> <p>All three types are being used by both police services in conjunction with the NPL, with South Wales Police deploying the technology in much the same way as the Met.</p> <p>When the force announced at the start of 2023 that it would implement the technology in its policing work, Chief Constable Jeremy Vaughan said: “My priority will always be to protect the public while relentlessly pursuing those people determined to cause harm in our communities.”</p> <p>He added: “I believe the public will continue to support our use of all the available methods and technology to keep them safe and thanks to the work of the National Physical Laboratory and the results of its independent evaluation I believe we are now in a stronger position than ever before to be able to demonstrate that the use of facial recognition technology is fair, legitimate, ethical and proportionate.”</p> <p><strong>The concerns</strong></p> <p>The public reaction to facial recognition technology being used by the police has been mixed. 
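</p>
<p>At the core of all three modes is the same operation: comparing a representation of a face against a watchlist and raising an alert only when the similarity clears a threshold. The sketch below illustrates the idea; the tiny four-dimensional embeddings and the cosine-similarity measure are assumptions for demonstration only, not the algorithm the forces actually run.</p>

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the index of the best watchlist entry whose similarity
    with the probe clears the threshold, or None (no alert raised)."""
    best_idx, best_score = None, threshold
    for idx, ref in enumerate(watchlist):
        score = cosine(probe, ref)
        if score >= best_score:
            best_idx, best_score = idx, score
    return best_idx

# Hypothetical 4-dimensional embeddings for illustration; real systems use
# vectors with hundreds of dimensions produced by a neural network.
watchlist = [(1.0, 0.0, 0.0, 0.0),
             (0.0, 1.0, 0.0, 0.0)]

print(match_against_watchlist((0.9, 0.1, 0.0, 0.0), watchlist))  # 0
print(match_against_watchlist((0.0, 0.0, 1.0, 0.0), watchlist))  # None
```

<p>In this picture, the only difference between LFR, RFR and OIFR is where the probe image comes from: a live camera feed, a retrospective still, or an officer’s phone.</p>
<p>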
A 2022 joint survey by the Ada Lovelace Institute and the Alan Turing Institute found that while only 12 per cent of people said they had a good knowledge and experience of the technology, 86 per cent of participants believed that the police’s use of it is beneficial.</p> <p>However, facial recognition technology does tend to conjure images of a dystopian society, and many members of the public are concerned about the ethical and privacy issues surrounding it. One of the groups opposing the use of facial recognition tech by the police is Big Brother Watch.</p> <p>Director Silkie Carlo said in response to the government’s latest funding into the tech: “Criminals should be brought to justice, but papering over the cracks of broken policing with Orwellian tech is not the solution. “It is completely absurd to inflict mass surveillance on the general public under the premise of fighting theft whilst police are failing to even turn up to 40 per cent of violent shoplifting incidents or to properly investigate many more serious crimes.</p> <p>“Rather than resourcing police to actively pursue people who pose a risk to the public, the government’s investment in facial recognition cameras for retail offences relies on shoplifters walking in front of marked police cameras and as such will effectively target the lowest hanging fruit.”</p> <p>The risk of wrongful arrest is another of the main concerns surrounding the use of facial recognition by law enforcement.</p> <p>Both police forces say that if someone is not on a watchlist, their biometric data will never be stored and will automatically be deleted.</p> Wed, 01 May 2024 15:44:07 +0000 Robyn Quick 16912 at /features/how-facial-recognition-tech-could-change-police-force#comments Using facial recognition to fight crime /features/using-facial-recognition-fight-crime <div class="field-item even"><img typeof="foaf:Image" 
src="/sites/default/files/styles/696x462_content_main/public/adobestock_346105770.jpg?itok=x_vX1iKu" width="696" height="280" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p>Policing minister Chris Philp recently announced at the Conservative Party conference that he wanted police officers to have access to a wider range of databases beyond the police national database, which only includes people who have already been arrested. The announcement could mean police having access to passport photos to use for facial recognition in an attempt to fight crime. Philp said: “I’m going to be asking police forces to search all of those databases — the police national database, which has custody images, but also other databases like the passport database.”<br>A Home Office spokesperson told the BBC that facial recognition has already been used to help catch criminals and that the technology could also be used to help search for missing or vulnerable people.<br>The spokesperson said: “Facial recognition, including live facial recognition, has a sound legal basis that has been confirmed by the courts and has already enabled a large number of serious criminals to be caught, including for murder and sexual offences.”</p> <p><strong>Criticism</strong></p> <p>However, these plans have been widely criticised. Biometrics and surveillance camera commissioner Professor Fraser Sampson has said that the plans risk damaging public trust.<br>Professor Sampson told the BBC: “The state has large collections of good quality photographs of a significant proportion of the population - drivers and passport holders being good examples - which were originally required and given as a condition of, say, driving and international travel.” 
“If the state routinely runs every photograph against every picture of every suspected incident of crime simply because it can, there is a significant risk of disproportionality and of damaging public trust.”</p> <p>A group of 65 parliamentarians and 31 rights and race equality organisations have called for an urgent stop to the use of facial recognition surveillance by the police and private companies. Signatories to the statement include David Davis, Diane Abbott, Christine Jardine, Ed Davey and Caroline Lucas. The statement says: “The signatories to this call are rights organisations, race equality organisations, technology experts, and parliamentarians. “We hold differing views about live facial recognition surveillance, ranging from serious concerns about its incompatibility with human rights, to the potential for discriminatory impact, the lack of safeguards, the lack of an evidence base, an unproven case of necessity or proportionality, the lack of a sufficient legal basis, the lack of parliamentary consideration, and the lack of a democratic mandate.</p> <p>“However, all of these views lead us to the same following conclusion: “We call on UK police and private companies to immediately stop using live facial recognition for public surveillance.” Silkie Carlo, director of Big Brother Watch, said: “This important call from MPs to urgently stop live facial recognition represents the greatest involvement parliamentarians have ever had in Britain’s approach to facial recognition surveillance. “With the Government now planning to turn all of our passport photos into mugshots for facial recognition scanning, yet again absent any democratic scrutiny, this intervention could not come at a more important time.</p> <p>“This dangerously authoritarian technology has the potential to turn populations into walking ID cards in a constant police line-up. 
“The UK’s reckless approach to face surveillance makes us a total outlier in the democratic world, especially against the backdrop of the EU’s proposed ban. “As hosts of the AI summit in autumn, the UK should show leadership in adopting new technologies in a rights-respecting way, rather than a way that mirrors the dystopian surveillance practices of Saudi Arabia and China. There must be an urgent stop to live facial recognition, parliamentary scrutiny and a much wider democratic debate before we introduce such a privacy-altering technology to British life.”</p> <p><strong>Legislation and bans </strong></p> <p>The EU is considering a ban on AI-powered facial recognition surveillance under the new AI Act, and other jurisdictions around the world have already banned it. In September, 120 civil society organisations and 60 experts called for a global stop to facial recognition surveillance. Ella Jakubowska, senior policy advisor at European Digital Rights (EDRi), said: “With the upcoming Artificial Intelligence Act, the European Union has the chance to become a world leader in protecting people from public facial recognition and other biometric surveillance. European Parliamentarians have spoken loud and clear in support of strong bans. “Worryingly, EU governments continue to push back, citing vague claims of ‘safety’ and ‘security’ without providing any objective evidence. They want an unlimited margin of discretion to subject our faces, our bodies and our communities to these dystopian uses of technology, despite a complete lack of democratic mandate.”</p> <p><strong>Use cases </strong></p> <p>South Wales Police have hit the headlines a few times for their use of facial recognition technology. Ahead of Harry Styles’ concerts in Cardiff on 20 and 21 June, fans were warned that they could be scanned by live facial recognition cameras deployed in the area by South Wales Police. The cameras were to be used to identify people wanted for priority offences. 
South Wales Police stated: “It’s being deployed specifically to seek out wanted individuals. Fully appreciate the concert has a young audience, however concert-goers won’t be the only people in the city centre during this time.”</p> <p>At a Beyoncé concert earlier in the year, the force said the technology would be used “to support policing in the identification of persons wanted for priority offences… to support law enforcement… and to ensure the safeguarding of children and vulnerable persons”. South Wales Police have used the technology at previous events. When it was used at a rugby match, 108,540 faces were scanned, resulting in the arrest of two people. South Wales Police has an LFR FAQ page on its website. It states: “The specific purpose for Live Facial Recognition deployment is: To support Policing in the identification of persons wanted for priority offences, to support law enforcement including the administration of justice (through arrest of persons wanted on warrant or unlawfully at large/recall to prison), and to ensure and promote the safeguarding of children and vulnerable persons at risk.”</p> <p>The website also lists occasions where the force has used LFR; as well as the events already mentioned, the technology was used at Pride Cymru in August 2022 and the Wales Airshow in July 2023. 
The website says: “Live Facial Recognition technology is used as an efficient and effective policing tactic to prevent and detect crime, and protect the most vulnerable in our society.”</p> Mon, 12 Feb 2024 17:11:07 +0000 Robyn Quick 16758 at /features/using-facial-recognition-fight-crime#comments Utilising facial recognition to keep the public safe /features/utilising-facial-recognition-keep-public-safe <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/adobestock_343570247.jpg?itok=SAkVfppw" width="696" height="391" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p><strong>Facial recognition has been in the news for featuring at high-profile events such as Harry Styles concerts. But these articles in the mainstream press often highlight the privacy concerns surrounding the technology. Without a high-profile success case, it’s hard to convince the public of the technology’s viability.</strong></p> <p>Ahead of Harry Styles’ concerts in Cardiff on 20 and 21 June, fans were warned that they could be scanned by live facial recognition cameras deployed in the area by South Wales Police. The cameras were to be used to identify people wanted for priority offences.<br><br>South Wales Police stated: “It’s being deployed specifically to seek out wanted individuals. Fully appreciate the concert has a young audience, however concert-goers won’t be the only people in the city centre during this time.”</p> <p><strong>How can it be used?</strong><br>Live facial recognition (LFR) works by comparing faces with a watchlist, using artificial intelligence. 
The police stated that if you are not on a watchlist, the biometric data collected won’t be stored and will immediately be deleted.<br><br>At a Beyoncé concert earlier in the year, the force said the technology would be used “to support policing in the identification of persons wanted for priority offences… to support law enforcement… and to ensure the safeguarding of children and vulnerable persons”.<br><br>South Wales Police have used the technology at previous events. When it was used at a rugby match, 108,540 faces were scanned, resulting in the arrest of two people.<br><br>South Wales Police has an LFR FAQ page on its website. It states: “The specific purpose for Live Facial Recognition deployment is: To support Policing in the identification of persons wanted for priority offences, to support law enforcement including the administration of justice (through arrest of persons wanted on warrant or unlawfully at large/recall to prison), and to ensure and promote the safeguarding of children and vulnerable persons at risk.”<br><br>The website also lists occasions where the force has used LFR; as well as the events already mentioned, the technology was used at Pride Cymru in August 2022 and the Wales Airshow in July 2023.<br><br>The website says: “Live Facial Recognition technology is used as an efficient and effective policing tactic to prevent and detect crime, and protect the most vulnerable in our society.”</p> <p><strong>Expected increase</strong><br>Biometrics and surveillance camera commissioner Professor Fraser Sampson recently commissioned an independent gap analysis by Professors Pete Fussey and William Webster. Fussey and Webster highlighted that the use of such technology is likely to increase. 
They said: “The Policing Minister expressed his desire to embed facial recognition technology in policing and is considering what more the government can do to support the police on this. Such embedding is extremely likely to include exploring integration of this technology with police body worn video”.</p> <p><strong>Persuading the public</strong><br>So if use of the technology is to increase, how can the public be persuaded to accept it?<br><br>In his annual report, published in February, Sampson claims that the extent to which the public will tolerate facial recognition will depend on whether they believe that measures are in place to make sure that it is used lawfully and responsibly.<br><br>There are two main ways to achieve this. The first is to update the legislation. Legislation is currently in the works in the form of the Data Protection and Digital Information (No.2) Bill, which is still at the early stages of its journey through Parliament. There needs to be legal oversight over how and when the technology is used and by whom. For example, it may be acceptable to the public for the police to use LFR at concerts, but the public are less likely to welcome it when used by shops, schools or even private users.<br><br>There was criticism when Sports Direct announced that the use of LFR had cut crime in its shops. Fifty MPs and peers supported a letter opposing the use of LFR by Frasers Group.<br><br>The legislation also needs to cover how the data collected will be stored and used, who will have access to it and when it should be deleted.<br><br>The other factor in changing the public’s opinion is to keep the public informed. Forces need to be clear about when they are using the technology and how, so the public can trust them.<br><br>Another aspect to consider is that, so far, there have been no high-profile success stories. 
Perhaps if the public became aware of an occasion or occasions where the technology had been used to identify a threat to the public and potentially prevent a crime, they would be more accepting of its use.<br><br>One high-profile use case is set to be the Paris Olympics next year. Real-time cameras are set to use AI to detect suspicious activity, such as abandoned luggage and unexpected crowds. However, a new law allows police to use CCTV algorithms to pick up anomalies such as crowd rushes, fights or unattended bags, while outlawing the use of LFR to trace “suspicious” individuals. It may be that introducing technology like this could be a stepping stone to getting the public to trust LFR.</p> Tue, 03 Oct 2023 13:53:34 +0000 Freya 16575 at /features/utilising-facial-recognition-keep-public-safe#comments Ensuring legitimacy and accountability /features/ensuring-legitimacy-and-accountability <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/facialrec_camera_2.jpg?itok=Ri1uO0Sz" width="696" height="464" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p><strong>Many police forces and other security services have been using facial recognition technology for a while now. However, it remains a hotly debated topic that raises privacy concerns and struggles to win over the public.</strong> In February, Professor Fraser Sampson, the biometrics and surveillance camera commissioner, published his annual report in which he mentioned facial recognition technology. The Commissioner is responsible for overseeing police use of DNA and fingerprints in England, Wales and Northern Ireland, and for encouraging the proper use of public space surveillance cameras. 
The report, which was submitted to the Home Secretary in November, sets out Professor Sampson’s findings in relation to his statutory responsibilities, and other observations about the use of biometrics and overt surveillance. Among other topics, it also covers facial recognition technology. Professor Sampson said: “The areas of biometrics and surveillance are becoming both increasingly important and increasingly inter-related. In recent years we have seen an explosion of surveillance technology in the public and private realms, with devices such as drones and body worn video, dashcams and smart doorbells. At the same time, there have been enormous advances in the power of AI to exploit the vast amount of surveillance data now being produced. “I believe that many of the issues raised in my report show that we urgently need to wake up to the opportunities presented, and the threats posed by, the explosion of capability in AI-driven biometric surveillance. If we fail, we risk missing out on the potential benefits it can offer and exposing ourselves to the potential dangers it poses. “Now more than ever, we need a clear, comprehensive and coherent framework to ensure proper regulation and accountability in these crucial areas.” <strong>Legislation</strong> Sampson notes that the police are using biometric surveillance technology such as facial recognition, though there remains uncertainty around the regulatory framework for ensuring legitimacy and accountability if and when they do use such technology. 
He outlines the two sides to the debate: “Biometric surveillance technologies can undoubtedly be intrusive to privacy and raise other human rights considerations, but there is no question that they can also be powerful weapons in the fight against serious crime and safeguard other fundamental rights such as the right to life and freedom from degrading or inhumane treatment.” This debate will likely continue for a long time, with supporters of both sides strongly arguing their case and competing for support among the public. Sampson claims that the extent to which the public will tolerate facial recognition will depend on whether they believe that measures are in place to make sure that the technology is used lawfully and responsibly. Parliament is considering legislation for reform, and Sampson points out the need to address questions surrounding the legitimate role for new technology such as facial recognition in biometric surveillance by the police and law enforcement: “The ramifications of AI-driven facial recognition in policing and law enforcement are [ … ] profound enough to be taken seriously and close enough to require our immediate attention.” <strong>Concerns</strong> The revised Surveillance Camera Code of Practice was approved by Parliament in January 2022 and addresses the use of public space surveillance, including the use of facial recognition technology, by the police and local authorities. Sampson has advised how the code of practice can be useful if adopted across government departments to address some of the concerns about surveillance companies and their practices. 
In his report, Sampson points out some concerns that have been raised around the use of facial recognition technology, including the potential for racial and gender bias; accuracy of the technology; a need for greater transparency and governance in the use of LFR; accuracy of reporting of false positives in the media; proportionality arguments, particularly with reference to the rate of ‘success’ compared to the number of faces scanned; and the legal basis for deployment of the technology, together with the need for independent authorisation. There is also a concern around whether the technology can be hacked for nefarious purposes. There has also been much discussion of where the technology comes from. China is the world’s leading exporter of the technology and there is concern that foreign governments may have access to the data generated by the technology that is exported. A recent survey by the commissioner found that 18 of the 39 police forces who responded say that their external camera systems use equipment about which there have been security or ethical concerns (including Dahua, Hikvision, Honeywell, Huawei and Nuuo), and at least 24 respondents say the same of their internal camera systems. Sampson said: “It is abundantly clear from this detailed analysis of the survey results that the police estate in the UK is shot through with Chinese surveillance cameras. It is also clear that the forces deploying this equipment are generally aware that there are security and ethical concerns about the companies that supply their kit. “There has been a lot in the news in recent days about how concerned we should be about Chinese spy balloons 60,000 feet up in the sky. I do not understand why we are not at least as concerned about the Chinese cameras 6 feet above our head in the street and elsewhere. 
“Parliament has already acted to curtail the use of equipment made by several Chinese manufacturers from some areas of public life where security is key. Myself and others have been saying for some time that we should, both for security and ethical reasons, really be asking ourselves whether it is ever appropriate for public bodies to use equipment made by companies with such serious questions hanging over them.” <strong>Criticisms</strong> There are a few well-publicised cases where the use of facial recognition technology has been criticised. For example, speaking to the BBC, Clearview CEO Hoan Ton-That revealed that the company has run nearly a million searches for US police. The founder also revealed that Clearview now has 30bn images scraped from platforms such as Facebook, which have been taken without users’ permission. The company has been fined several times in Europe and Australia for breaches of privacy. The technology allows a law enforcement customer to upload a photo of a face and then find matches in a database of billions of images it has collected. The company is banned from selling its services to most US companies, after being taken to court in Illinois for breaking privacy law. However, this ban does not apply to the police. Facial recognition technology is also being used by some governments to curb dissent and target protesters. A Reuters review of more than 2,000 court cases in Russia has revealed how the technology is being used to identify opponents of the regime. In September, the Iranian government announced that it was planning to use facial recognition technology on public transport to identify women who are not complying with laws on wearing the hijab. This month, it was announced that the Iranian government had started to install cameras to identify women not wearing the hijab. In Scotland, a council has been criticised by data watchdog the Information Commissioner’s Office (ICO) for using facial recognition technology in nine schools. 
<strong>Successful use cases</strong> On the other hand, there are examples of where the technology has been used to apprehend criminals. In South Africa, six men were arrested for a series of heists after being identified through facial recognition technology. The suspects were found after facial recognition analysis was carried out on the CCTV footage from the stores they robbed. This example adds to the argument that facial recognition technology can help keep the public safe. In Australia, a recent survey has found that 72 per cent of the 4,000 people asked want more facial recognition at airports to speed up the customs process. Adam Schwab, CEO and co-founder of Luxury Escapes, which carried out the research, told news.com.au: “Facial recognition technology is just one of many ways Australian travellers, and the travel industry, continue to look for ways to make travel safer, more efficient and less stressful for all.” In airports, use of facial recognition technology is twice as fast as fingerprint scanning, and is also not subject to passenger error. Facial recognition technology can also be used to search for missing people. In 2020, Indian police used a facial recognition app to reunite thousands of missing and trafficked children with their families. Thousands of children go missing every year and many are trafficked to work in eateries, handicraft industries, brick kilns, factories or into begging and brothels. The technology was used to reunite more than 1,500 children with their families. It has even been used in casinos to bar entry to gamblers who have requested to be excluded. <strong>Justification</strong> Sampson argues that for the use of facial recognition technology to be justified, it needs to be proportionate, i.e. we need to know how many people have been arrested as a result of its use, compared to how many people have been scanned. 
The National Physical Laboratory recently published independent research into the Met’s deployment of facial recognition. The study, which was entitled ‘Facial Recognition Technology in Law Enforcement’, tested the accuracy, in operational conditions, of the algorithm used by the Met in terms of different demographics. The research found that there are settings at which the algorithm can be operated with no statistically significant difference in performance across demographic groups. It was also found that when used at a threshold setting of 0.6 or above, correct matches (True Positive Identification Rate) were 89 per cent. The incorrect match rate (False Positive Identification Rate) was 0.017 per cent. The chance of a false match, therefore, is just 1 in 6,000 people walking past the camera. When used at a threshold setting of 0.6 or above, any differences in matches across groups were not statistically significant - meaning performance was the same across race and gender. With regards to Retrospective Facial Recognition, the true positive identification rate for high quality images was 100 per cent. The Met says it will use “Facial Recognition Technology as a first, but significant, step towards precise community-based crime fighting.” According to the Met: “Live Facial Recognition (LFR) enables us to be more focused in our approach to tackle crime, including robbery and violence against women and girls.” Lindsey Chiswick, director of intelligence for the Met, said: “Live Facial Recognition technology is a precise community crime fighting tool. Led by intelligence, we place our effort where it is likely to have the greatest effect. It enables us to be more focused in our approach to tackle crime, including robbery and violence against women and girls. “This is a significant report for policing, as it is the first time we have had independent scientific evidence to advise us on the accuracy and any demographic differences of our Facial Recognition Technology. 
“We commissioned the work so we could get a better understanding of our facial recognition technology, and this scientific analysis has given us a greater insight into its performance for future deployments. “We know that at the setting we have been using it, the performance is the same across race and gender and the chance of a false match is just 1 in 6000 people who pass the camera. All matches are manually reviewed by an officer. If the officer thinks it is a match, a conversation will follow to check. “The study was large enough to ensure any demographic differences would be seen. However, [the NPL] has also been able to extrapolate these figures to reflect results more representative of watch list size for previous LFR deployments.” While there are some success stories, it is clear there is still a long way to go to gain the public’s trust in the use of facial recognition technology. However, there are ways to do this, and they involve being open about, legislating for, justifying and managing the technology.
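The study’s headline numbers can be sanity-checked with a short calculation. The sketch below is illustrative only: the `identification_rates` helper and its example scores are invented for demonstration, not the NPL’s actual methodology; only the 0.6 threshold and the 0.017 per cent false match rate come from the reported figures.

```python
# Illustrative sketch: how an operating threshold such as 0.6 turns raw
# similarity scores into the identification rates quoted by the NPL study.

def identification_rates(watchlist_scores, other_scores, threshold=0.6):
    """TPIR: fraction of genuine watch-list faces scoring at/above the threshold.
    FPIR: fraction of everyone else incorrectly scoring at/above it."""
    tpir = sum(s >= threshold for s in watchlist_scores) / len(watchlist_scores)
    fpir = sum(s >= threshold for s in other_scores) / len(other_scores)
    return tpir, fpir

# Invented example scores: two of three watch-list faces match,
# one of four ordinary passers-by triggers a false alarm.
tpir, fpir = identification_rates([0.9, 0.7, 0.5], [0.1, 0.2, 0.65, 0.3])
print(tpir, fpir)

# The reported FPIR of 0.017 per cent corresponds to roughly one false
# match per ~5,900 passers-by, which the Met rounds to "1 in 6,000".
print(round(1 / (0.017 / 100)))  # -> 5882
```

Note the asymmetry: the TPIR is measured only over people actually on the watch list, while the FPIR is measured over everyone else who walks past the camera, which is why even a tiny percentage matters at street scale.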
Any use needs to be lawful, justified and responsible.</p> Wed, 26 Apr 2023 08:27:31 +0000 Freya 16367 at /features/ensuring-legitimacy-and-accountability#comments Facial Recognition Technology in the fight against terror /features/facial-recognition-technology-fight-against-terror <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/adobestock_188158741.jpg?itok=3U_lKFi8" width="696" height="499" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p><strong>By Tony Porter, former surveillance camera commissioner for England and Wales and current chief privacy officer at Corsight AI</strong></p> <p>In his most recent review of London’s Preparedness for a Major Terrorist Attack, published in March 2022, security expert Lord Toby Harris deemed London “significantly better prepared for a terrorist attack” compared to 2016. Despite the encouraging progress, Lord Harris notes that the last decade has seen the threat of terrorism shift from organised groups of attackers to lone extremists acting independently. Notably, police believe that the perpetrator of the 2017 Manchester bombing acted alone at the time of the attack. Like finding a needle in a haystack, the human eye remains ill-suited to identifying one suspect in a crowd. Technologies like Facial Recognition and Visual Search have long been at the forefront of the counterterrorism conversation, with the Met beginning operational use of Live Facial Recognition in 2020.</p> <p>Beyond comparing subjects with existing criminal databases, advancements in AI allow surveillance systems to run visual searches monitoring patterns of irregular behaviour, for example someone leaving a bag unattended for an extended period or returning to a site regularly to take photographs.
This information can then be used as the basis on which to perform actions, e.g. to notify officers to conduct a stop and search. Such use cases of facial recognition can play a key role in improving the efficiency of law enforcement in identifying and pre-empting potential terror attacks.</p> <p><strong>Outdated concerns</strong><br /> Unsurprisingly, the deployment of Live Facial Recognition (LFR) by law enforcement has stoked furious opposition from campaign groups over concerns around the technology’s accuracy, privacy and bias, with fourteen organisations, including Big Brother Watch, Liberty and Black Lives Matter UK, writing an open letter to Metropolitan Police Commissioner Sir Mark Rowley requesting an end to the use of facial recognition technologies by the police force. Published in September last year, the letter characterises the technology as “privacy-eroding, inaccurate and wasteful.”</p> <p>Concerns around inaccuracy, bias across gender and ethnicity, and the violation of privacy rights have historically taken centre stage in the public discourse around Facial Recognition Technology (FRT). Yet these outdated narratives disregard FRT’s advancements in accuracy and bias elimination. They also fail to recognise the important checks and balances that are built into the way the technology is deployed. </p> <p>Most mature facial recognition solutions on the market prioritise privacy and ethics, providing recommendations to governments on further legislation that can help ensure FRT’s proportionate and responsible use. Furthermore, nobody has ever been arrested or charged based simply on the decisions of a machine. There is always a ‘human in the loop’ to evaluate any potential matches flagged by the software.
This individual will always be accountable for the decision on what happens next, just as has always been the case for any police officer investigating any type of crime.</p> <p>Individual views will differ on the exact circumstances in which FRT should be used. However, we should keep in mind that there is widespread public support for its use in counter terrorism operations, where it can save numerous innocent lives. </p> <p><strong>Facial recognition is highly accurate</strong><br /> In its most recent independent analysis of leading AI facial recognition software, the National Institute of Standards and Technology (NIST) observed unprecedented performance, with a variance of just 0.0001 between race and gender groups. To put this number into perspective, the acceptable false capture rate in Automatic Number Plate Recognition by UK law enforcement runs at +/- 3% on nationally accepted standards. </p> <p>Ultimately, FRT is a tool at our disposal to help us filter through vast amounts of information, yet it should not be the only deciding factor when identifying suspects. Nearly all existing sensitive technologies apply a dual verification process; by designing FRT protocol around the concept of placing this ‘human in the loop’ as mentioned above, an operator can exert their judgement when reviewing FRT matches, thereby accounting for both human and AI biases. Vendors and distributors of FRT must work closely with counterterror agencies to ensure that operatives are adequately trained in how to spot instances of bias, while taking the appropriate measures to safeguard privacy rights.</p> <p><strong>Reframing the debate</strong><br /> The public is increasingly aware of unethical and disreputable nation states seeking to dominate the AI market. We need look no further than what is happening with CCTV cameras from China to understand the importance of trust and ethics in surveillance.
However, to ascribe toxic human qualities of racism and intrusion to FRT unfairly stigmatises the technology behind it. The fact remains that the speed and accuracy of FRT have come a long way since its inception, with these advancements showing no sign of slowing down.</p> <p>Following the horrific events of the 2017 Manchester bombing, a new piece of legislation known as “The Protect Duty” is soon to come into force, which imposes an obligation on the public and private sector to assess and take steps to mitigate the risk of terror attacks. Judicious application of video surveillance technology can help fulfil this duty by acting as a filter for law enforcement, helping draw attention to patterns of suspicious behaviour that warrant investigation. Given the potential of FRT in preventing avoidable loss of life, it seems irresponsible to dismiss such innovations based on concerns addressable through legal safeguards and processes. </p> <p><strong>Safeguarding privacy</strong><br /> It is essential that privacy remains a top priority when developing and using FRT. Yet like any other policing measure, legislative safeguards around legality, necessity and proportionality can be implemented to guarantee citizen rights and wellbeing. In the 2020 case R(Bridges) v Chief Constable of South Wales Police, the Court of Appeal recognised a sufficient legal framework within the UK legal system to enable the use of LFR. The court also stipulated police requirements for the lawful use of LFR (adherence to Public Sector Equality Duty requirements, continuous oversight of its DPIA, and management of the watch list and positioning of the cameras). Such rulings are essential in establishing the precedent and rules around the use of FRT.</p> <p>The FRT sector welcomes such international standards and government oversight.
With NIST and international standards organisations currently working to harmonise their approaches to Trustworthy AI, the onus now falls upon lawmakers and regulators to establish certainty: to determine what FRT should enable in society and who should be able to use it, and to construct the rules (laws) which enable and constrain such use, as well as hold it to account. Only then can FRT surveillance fulfil its true potential, not just in safeguarding our physical wellbeing, but also our rights and peace of mind.</p> Fri, 10 Feb 2023 14:27:14 +0000 Freya 16253 at /features/facial-recognition-technology-fight-against-terror#comments Drones & Counter Drone technology /features/drones-counter-drone-technology <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/adobestock_244474406.jpg?itok=infAsga4" width="696" height="232" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p>On 19th July, CTB365 held a Drones and Counter Drone Technology webinar. One of the panellists, Jackson White, business development director, RF Datalinks and Marketing (SPX Comtech), answers some of the questions that it wasn’t possible to answer during the webinar. Jackson joined the British Army at 16, entering the Royal Corps of Signals in 1990. After serving during the Cold War in Germany, Jackson specialised in tactical and strategic communications, deploying domestically and globally in the war on terror, serving in several roles. Jackson has served operationally on the UK mainland and in America, the Balkans, Africa and the Middle East. Since leaving the Army, Jackson has completed a degree in Cyber Security and has combined this with his operational experience during his 15 years in industry as a business development manager at Enterprise Control Systems Ltd. and Esterline (Racal Acoustics).
He then worked as a sales and marketing director for Getac before returning to Enterprise Control Systems as business development director. Drones are likely to become much more utilised as weapons in warfare, as they are currently in Ukraine. How can we respond to drones being used as a method of attack in war? We’re currently witnessing the Ukrainian Armed Forces being faced with military UAS threats. Both military and commercial UAS present their own challenges to defeat, from repurposed commercial UAS through to weaponised military UAS. It’s therefore fair to assume that all military forces will use drones regardless of their budget. The question is how we then defend against and counter that threat within a security and military construct. The Counter-UAS challenge is at a tactical level, which often lacks doctrine and a concept of employment. Mission success depends on entry capabilities being scaled and field-upgraded in line with developing doctrine and tactics in response to the threat evolutionary cycle, ensuring total lifecycle costs are minimised and that flexible Counter-UAS solutions remain fit for purpose in the future. Solution manufacturers must focus on staying one step ahead by combining advancements in radars, radio frequency (RF) detection, electro-optical/infra-red imagery and jammers in a multi-sensor approach. What is being done at national and international level to regularise the status of UAS and counter-UAS technology in combat zones? We’re seeing a huge variation both internationally and within our national borders, mainly to do with the operation and the operational commander. For instance, if someone is involved in security and policing operations and wants to carry out a defeat of a drone to remove it from the airspace, the approach will be different to a military user on the battlefield wanting to take out a state actor or state-sponsored drone.
In one place there will be significant legislation and rules about the use of RF defeat, whilst in the other – such as full-on warfare – there is actually no regulation. So one of the things we always have to do is talk with these users to understand their own legislative framework. However, those frameworks have been changing and it’s remarkable how quickly some of these have evolved in countries across the world in the space of seven years. To effectively assist in dealing with the threat, national and international legislation is going to have to keep changing to enable the true defeat, rather than just the detection, of a threat. But equally, the technology has to change, which will in turn make it easier for the legislators to agree to the use of the defeat. Ultimately, it works and must be considered from both ends: the moving goalposts of legislation and the practical changes in techniques. What is the effect of using an SDR-based device on a UAS – does the operator then control the UAS? Given the challenges presented by highly complex UAS waveforms, and the limitations of older DDS-based jamming solutions, the most effective Counter-UAS RF Defeat systems have had to undergo radical technology changes. This is the change to a software-defined radio (SDR) source generation technique. The key elements of the SDR technique had already been developed as a next-generation countermeasure waveform, but it wasn’t needed or implemented until the complex waveforms in the UAS domain appeared. The SDR-based source generation technique was of course triggered by the identification of this new threat, and then further refined to achieve maximum effectiveness in the inhibition and jamming of these complex signals. There is no doubt that the proliferation of this new threat initiated the introduction of SDR-based jamming techniques. So, if we take a high-level look at an SDR waveform, there are a number of key aspects to consider.
Most importantly, the SDR waveforms can be developed to disrupt the command and control links of UAS platforms in response to the ConOps of the operational user.</p> <div class="field-item even"><a href="https://365.counterterrorbusiness.com/2106/drones-counter-drones" target="_blank" title="nofollow">Read More</a></div> Thu, 27 Oct 2022 10:38:44 +0000 Freya 16093 at /features/drones-counter-drone-technology#comments Video surveillance vs civil liberties /features/video-surveillance-vs-civil-liberties <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/adobestock_419881690.jpg?itok=e-Fwfxsd" width="696" height="392" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p>Simon Randall, CEO of Pimloc Limited, looks at both sides of the video surveillance debate. When new government policy created in the aftermath of the Manchester Bombing comes into force in early 2023, companies operating publicly accessible locations (PALs) will have a legal duty to implement security measures that protect the public from potential terror attacks. The “Protect Duty” legislation contains a raft of new demands and measures, including making a “watch list” of known terrorists which will be made available to businesses such as banks, airlines and hotels. The list will contain the names, dates of birth and last known addresses of known terrorists plus, importantly, pictures of their faces. The government hasn’t stipulated how businesses should use the watch list, but it’s likely that many PAL organisations will use Live Visual Facial Recognition (LVFR) surveillance technologies.
USAGE EXAMPLES Many English cities, and their respective police services, already make significant use of LVFR, as do PALs such as railway stations and airports, as well as other commercial and public sector organisations. However, when the Protect Duty comes into force, we can expect to see a further proliferation of video surveillance use. And, as more CCTV and other surveillance cameras are installed on business premises, more issues about personal privacy, ownership, and the right to use visual information will come under scrutiny. Arguably, few people, if any, would disagree with measures designed to keep them, their family, and friends safe. Conversely, many, and particularly those involved in ensuring civil liberties, have raised legitimate concerns about the consequences of using live facial recognition technologies and describe them as “Orwellian surveillance” tools. PROTECT DUTY It is important to remember that the Protect Duty legislation was drafted after consultation with a variety of organisations, industries and campaigners from all sections of the British public. As such, the new regulations and recommendations provide a response to the demands and feelings of a population who’ve experienced a significant rise in terrorist incidents in recent years. The fact is that video surveillance is here to stay; the technologies work and have been of immense value in monitoring known terrorists and criminals as well as locating people wanted for the most serious and violent offences. PRIVACY However, the debate about privacy, safety and personal rights will continue, and rightfully so. There are many key questions that must still be debated, answered, and most importantly, addressed by all parties concerned, from government and technology suppliers to law enforcement agencies, PALs and, increasingly, commercial user organisations. Until now, the focus – and challenge – has been on policing and the public sector.
But as new government legislation prompts commercial organisations to make more use of video, their practices will, and should, come under greater scrutiny. CCTV and video surveillance have undoubted appeal for the commercial sector, especially for PALs which, particularly in the aftermath of the Manchester Bombing inquiry, the public sees as responsible for ensuring the safety of visitors, guests and concert goers. Having more video coverage means PALs, such as venues, can monitor events and, if they’re using the watch list, identify potential bad actors in real time. Should there be even the possibility of an incident, video provides useful evidence, proves the PAL was taking every possible precaution, and can go a long way to ensuring the organisation’s reputation doesn’t take a massive public hit. Unlike the police, government bodies and the military, however, most commercial companies will have neither the experience nor the in-depth understanding of the law, which is written to ensure proportionality. For them, managing the delicate balance between public safety, civil and individual liberties, and the organisation’s own rights, responsibilities and reputation will be anything but straightforward. THE BASICS All too often, the debate has been centred on the ethics of surveillance, civil liberties and other big issues. That’s not going to get us anywhere. We need to take a step back and have a nuanced discussion about the basics, the factors that will improve management and effectiveness, at the same time as building in checks and balances that mitigate any potential risks to personal privacy posed by LVFR technologies. What do I mean by the basics? I’d start by comparing the benefits of using CCTV for safety purposes against the degree to which it intrudes on individual privacy. Remember, the law dictates proportionality.
Then I’d make a step-by-step assessment of every aspect of the way in which the organisation uses, stores, manages and secures the data contained in both video footage – that is, faces – and the watch list database. One of the ways in which organisations can protect the privacy of people captured on video is facial blurring. Facial blurring technologies have been used by security and law enforcement agencies for some time because they protect people’s privacy while still allowing individuals of interest to be identified. For many organisations in these sectors, anonymising faces is already an integral part of the storage process. The technology is becoming more sophisticated, with some software now capable of automatically detecting and blurring faces in images and video. FACIAL BLURRING The government has said that it is working with businesses and public sector organisations to implement software that can automatically blur facial images captured from CCTV, LVFR and other sources. This would mean that police, public sector services and businesses would still be able to identify individuals on their watch lists, but their identities would be protected from public view. Using automated facial blurring allows police and security forces to focus their efforts on tracking and apprehending individuals, without having to worry about redacting video manually before it is shared, which is a slow, cumbersome and expensive process. Mechanisms such as the watch list, combined with the use of facial blurring, are just two of the ways in which the government is trying to strike a balance between security and privacy. There’s still a long way to go, and a lot of talking to be done, before concerns about privacy and equality are addressed. Whatever differences there may be on either side of the debate, we can all agree that public safety, especially that of our children, is our paramount concern.
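As a rough illustration of what automated anonymisation software does under the hood, the sketch below pixelates a face region in an image array. It is a minimal, hypothetical example: `pixelate_region` and its `(top, left, bottom, right)` box format are invented for demonstration, and a real deployment would pair this with an actual face detector and proper video handling.

```python
import numpy as np

def pixelate_region(frame, box, block=16):
    """Anonymise a region of an image by coarse pixelation.

    `frame` is an H x W (greyscale) or H x W x 3 (colour) array;
    `box` is (top, left, bottom, right), e.g. from a face detector."""
    top, left, bottom, right = box
    region = frame[top:bottom, left:right].astype(float)
    for y in range(0, region.shape[0], block):
        for x in range(0, region.shape[1], block):
            tile = region[y:y + block, x:x + block]
            tile[...] = tile.mean(axis=(0, 1))  # flatten each tile to its mean colour
    out = frame.astype(float).copy()
    out[top:bottom, left:right] = region
    return out
```

Pixelation is deliberately lossy: within the box every `block`-sized tile collapses to a single value, so a face becomes unrecognisable while the rest of the frame, and the overall scene context, is left untouched.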
Compromise is essential and, with technologies like facial blurring at our disposal, it is possible to find a balance between protecting an individual’s safety and their privacy.</p> <div class="field-item even"><a href="http://www.pimloc.com" target="_blank" title="nofollow">Read More</a></div> Thu, 27 Oct 2022 10:33:20 +0000 Freya 16092 at /features/video-surveillance-vs-civil-liberties#comments DRONES AND COUNTER DRONE TECHNOLOGY /features/drones-and-counter-drone-technology <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/drones_airports_adobestock_284496239.jpg?itok=pB_g8RAy" width="696" height="448" alt="" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p>Whether the operator is a negligent enthusiast or an individual intent on an act of terrorism, drones can pose a significant security risk. Drone detection systems and other counter-drone technologies now play a major role in national security and the fight against terrorism. There are many examples which demonstrate the capability of drones to save lives, solve problems and boost the economy, but while this industry develops, the use of drones by malicious actors has skyrocketed. With the police responsible for taking enforcement action when it is believed that the requirements of the law have not been met, counter-drone technology, which can detect and, when needed, jam, capture or disable unauthorised drones, has become an essential weapon in the arsenal to help keep the public safe. Sponsored by ECS &amp; TCI, the latest CTB365 webinar, Drones and Drone Technologies, provided food for thought.
Hosted by counter terror expert Philip Ingram MBE, the webinar delivered a fascinating insight into how police forces are coping with a wide variety of drone misuse, from simple illegal flying near airports to smuggling contraband into prison establishments. James Bingham, lead intelligence analyst at the National Police Chiefs’ Council’s Counter Drones unit, set the police scene with an examination of the capabilities of the latest counter-drone systems it uses. Bingham was joined by TCI application engineer Peter Savage, who presented a session called Manual Drone Detection &amp; Geolocation vs Automatic, while Paul Taylor of ECS, David Beckett of TCI and David Eldridge of Chess Dynamics presented sessions which highlighted counter-UAS integration challenges and technology to help overcome them. Following registration, the two-hour webinar can be viewed free of charge.</p> <div class="field-item even"><a href="https://365.counterterrorbusiness.com/2106/drones-counter-drones/register" target="_blank" title="nofollow">Click here to register</a></div> Fri, 21 Oct 2022 15:25:11 +0000 Freya 16067 at /features/drones-and-counter-drone-technology#comments Behavioural Analysis 2022: sharing best practice /features/behavioural-analysis-2022-sharing-best-practice <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/ba22-rgb_colour-dark_w_dates.png?itok=1RIdblxR" width="696" height="548" alt="" title="Behavioural Analysis 2022: sharing best practice" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p>Despite the fact that we ask the general public to 'see something, say something' and to report suspicious behaviour and unattended bags, there is a reluctance by many to truly embrace behavioural analysis as an effective screening process.
There has been an unhealthy over-reliance on technology and excessive concern about subjective profiling.</p> <p>The reality is that every security agency in the world differentiates based on the perceived threat an individual poses. The security services identify terrorist plots because they focus their attention on persons or groups of concern rather than keeping every citizen under surveillance; customs and immigration agents screening passengers arriving from overseas do not search everybody the same way, yet identify wrongdoers because of their targeted approach; and, those securing tourist attractions, sports stadia and entertainment complexes all understand the profiles of those who could cause disruption.</p> <p>The Behavioural Analysis series of conferences was launched by Green Light back in 2018 when 136 security professionals and academics gathered at Cardiff’s Principality Stadium, the home of the Welsh Rugby Union, for a two-day event. A year later, at the invitation of Mall of America (whose security team had attended the Cardiff conference), 141 participants met in the USA and then, following the commencement of the coronavirus pandemic, the 2020 live event was transformed into an online conference which 211 tuned in to. As the restrictions on travel are lifting, Green Light has seized the opportunity to bring interested parties together again – this time in the academic setting of the University of Northampton on 8 and 9 June 2022.</p> <p>The university campus setting demonstrates the link between the conference material – academic presentations and case studies of industry best practice – and the goals of the event. The Behavioural Analysis series exists in order to provide security professionals with a greater understanding of the science behind the approach and, ultimately, proof of concept.</p> <p>At each iteration of the event, delegates have heard from those who have implemented effective behavioural analysis programmes. 
In the past, hotel chains, sports stadiums, casinos, places of worship, airports, shopping malls and even the Eurovision Song Contest have shared their experiences. Those gathering this year in Northampton will hear presentations from different police forces – including the Guardia Civil, who perform passenger screening duties at Spanish airports; the British Transport Police, which implements a programme specifically designed to address the issue of violence against, and harassment of, women and girls on the UK’s rail networks; and the Royal Canadian Mounted Police (RCMP), which will be providing a drug interdiction perspective.</p> <p>The organisers have always been keen to ensure that those who have themselves intercepted those with negative intent can share their experiences. And this year, delegates will hear the personal account of a door supervisor, Avi Tabib, who, on 30 April 2003, undoubtedly saved the lives of many people enjoying an evening out at Mike’s Place bar in Tel Aviv by identifying and then physically engaging with a suicide bomber hellbent on perpetrating an act of mass murder. The bomber did detonate his device and Avi was very seriously injured, but we are blessed that he survived and can demonstrate that sometimes one has to act and that simply reporting concerns is not always the correct course of action.</p> <p>Arguably, that’s what should have happened in May 2017 when a suicide bomber attacked the Manchester Arena at the end of an Ariana Grande concert; 22 people were to lose their lives, despite the fact that the bomber had been identified as a possible threat well before he detonated his device. Not only had security guards discussed him, but a member of the public had been sufficiently concerned that he even questioned the bomber himself.
The attack demonstrated that venues need to do far more to protect the lives of their guests and staff and it was Figen Murray, whose son Martyn Hett was killed in the attack, who took on the challenge to ensure that they did. Her Martyn’s Law campaign is set to bring about the necessary change in legislation.</p> <p>In January 2022, the UK government published its response to the Protect Duty public consultation, and legislation is being drafted to ensure that venues carry out proper risk assessments, ensure adequate training, implement effective protective security measures and develop robust plans as to how they would manage or respond to an actual terrorist attack. Figen Murray OBE will be presenting a keynote address at Behavioural Analysis 2022.</p> <p>Project Vigilant, first piloted by Thames Valley Police, is an initiative that uses a combination of uniformed and plain-clothes officers to carry out patrols in areas outside night clubs, bars and pubs, to identify people who may be displaying signs of predatory behaviour, such as sexual harassment, inappropriate touching and loitering. Lee Davies, previously a Detective Chief Inspector with Essex Police, was responsible for managing a multi-force operational response to high-risk organised crime groups, targeting firearms manufacture and supply, human trafficking, modern slavery, drug importation and supply and crimes in action. His presentation at Behavioural Analysis 2022 will draw together the way in which behavioural analysis has helped tackle these criminal activities and will drill down on his more recent consultancy role with Project Vigilant, which is very much based on behavioural analysis.
Furthermore, in respect of the night time industries, the figurehead for the sector – Michael Kill, the CEO of the Night Time Industries Association and the Chairperson of the UK Door Security Association – will provide the voice of the industry’s stakeholders and consider how behaviour detection techniques address their security concerns now and what we need to consider in respect of the sector’s plans for growth and diversification in the future.</p> <p>Behavioural Analysis 2022 might be taking place as we emerge from pandemic-generated restrictions and learn to live with the virus but, no sooner has one challenge started to diminish than another, more sinister one has emerged – Russia’s invasion of Ukraine. Whilst this conference focuses on identifying threats in crowded places, within the workplace and at venues where the general public gather, we cannot ignore the broader geopolitical landscape. With this in mind, the opening address is from criminologist and author of the book Terrorist Minds Dr. Sagit Yehoshua. In a presentation entitled, Zelensky, Putin &amp; Johnson: the good, the bad and the…profiling perspective, she will share with delegates her profiles of Volodymyr Zelensky, Vladimir Putin and, as the conference takes place in the UK, Boris Johnson, specifically focusing on what behavioural indicators there were for their current actions.</p> <p>The crux of the conference will, however, address the science behind behavioural analysis techniques. Presenters include Abbie Maroño, a lecturer in psychology at the University of Northampton and director of BRINC, who will be exploring lower body indicators of stress. Although lower body movements and gestures are highly communicative, they are often overlooked. Given that displays of the lower body are ‘less contaminated’ by social and cultural restriction, and less likely to be monitored by the observer, they may be more accurate indicators of one’s internal state.
Abbie Maroño will discuss how to train professionals, particularly those in a clinical or forensic setting, to recognise valuable nonverbal cues of emotional distress in the lower body in an unobtrusive way.</p> <p>Dr David Keatley, Associate Professor in Criminology at Australia’s Murdoch University, will be presenting on timeline analyses. His presentation will examine research into threat detection related to criminal cases. Using timeline analyses (e.g. behaviour sequence analysis, indicator waves, crime script analysis), his presentation will outline how this work has helped with early threat detection whilst also assisting with cold case reviews (including the 1965 bombing of Canadian Pacific Air Lines Flight 21, one of the largest unsolved mass murders in Canadian history) and major crimes investigations.</p> <p>From Luxembourg, Angelique Laenen, a psychologist with the Court of Justice of the European Union, will explore indicators as to when verbal aggression might become physical. And, from the Netherlands, An Gaiser, Senior Manager Forensic Integrity &amp; Compliance with KPMG Nederland, will be busting the myths surrounding behaviour detection. Those passionate about profiling may well have watched the TV series Lie To Me, and many are convinced that we can identify liars through an analysis of nonverbals. The real world is a little different!</p> <p>Microexpressions do exist, but most current academic research indicates that their interpretation is being used, incorrectly and potentially dangerously, to indicate guilt. The reality is that microexpressions, which are highly unlikely to be identified in a crowded environment or at a fleeting glance, are simply indicators of stress. Potential guilt or negative intent can only be determined through investigative interviewing techniques. 
Likewise, neurolinguistic programming is being peddled as a means of detecting the construction of lies; rigorous academic research, such as that conducted by the University of Amsterdam, debunks this. An will also be demonstrating how academic research can positively influence workplace compliance issues and threat identification; the insider threat is, after all, a major concern.</p> <p>In this vein, there will also be a presentation entitled Keep the Red Flags Flying. Mathias Reveraert, a researcher at Belgium’s Universiteit Antwerpen, will discuss the results of research into insider threat identification, where the goal was to discover potential ‘red flags’ that could point to an imminent insider threat incident. The study employed the Delphi Technique to compare and contrast the opinions of experts on insider threat mitigation. The presentation will include an overview of the different phases of insider threat development and will specifically drill down on the red flags and good practices that were determined to be priorities for identification.</p> <p>Whilst Green Light is itself a provider of behavioural analysis consultancy and training, Behavioural Analysis is not a sales pitch for the company’s own services. Indeed, the company does not even present itself and welcomes its competitors as participants and presenters. The aim, after all, is to ensure a broader acceptance of behavioural analysis as an effective screening tool and to provide proof of concept. 
Whilst ‘profiling’ is often deemed a dirty word, there is nothing wrong with it!</p> <p><em><strong>Register <a href="http://www.behaviouralanalysis.com" target="_blank">here</a> for Behavioural Analysis 2022.</strong></em></p> <p><em><strong>Readers of Counter Terror Business are eligible for a 15 per cent discount; simply enter the code CTB15 during checkout.</strong></em></p> Thu, 21 Apr 2022 14:26:55 +0000 Michael Lyons 15806 at /features/behavioural-analysis-2022-sharing-best-practice#comments Facial Recognition: Facing up to terrorism /features/facial-recognition-facing-terrorism <div class="field-item even"><img typeof="foaf:Image" src="/sites/default/files/styles/696x462_content_main/public/eye-2926215_1920_7.jpg?itok=NtNapaut" width="696" height="487" alt="" title="Facial Recognition: Facing up to terrorism" /></div><div class="field-item even"><a href="/features/surveillance-biometrics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Surveillance &amp; Biometrics</a></div><p>The EU Commission intends to define rules for the public use of artificial intelligence (AI) by setting precise boundaries for systems which may present risks to the rights to data protection and privacy.</p> <p>AI systems intended to be used for remote biometric identification of persons in public places will be considered high risk and will be subject to a third-party conformity assessment, including documentation and human oversight requirements by design. There will, however, be ‘serious’ exceptions to this prohibition, such as terrorism investigations, finding missing children and public safety emergencies, where officers will need to take urgent action.</p> <p>However, there are still misconceptions about the validity of facial recognition – both the technology itself and its deployment – which may lead some police authorities and organisations to limit or withdraw the use of Facial Recognition Technology (FRT) altogether. 
This article seeks to explore the use of FRT to combat terrorism, and how organisations operating in the supply chain ought to develop and utilise FRT to enhance public confidence and align with the EU’s new proposals.</p> <p><strong>The need for FRT to combat terrorism</strong><br>In a hostile world, terrorism risks are increasing. These risks pose a significant threat not only to national security, but to political and social stability and economic development. The utilisation of facial recognition solutions can play a key role in improving the ability of police forces, intelligence agencies and organisations to respond to and prevent major attacks, in a way that minimises intrusiveness for citizens.</p> <p>In general, FRT is a biometric surveillance aid which uses a camera to capture an image of an individual’s face, mainly in densely populated places such as streets, shopping centres and football stadiums. The system then produces a similarity score when it recognises a resemblance between the captured facial image and an image held within a criminal database. If a match is made, an alarm will prompt the security operator to do a visual comparison. The operator can then verify the match and radio police officers to have them conduct a stop, if one is needed. It is important to note that the technology does not establish individual ‘identity’ – that is the job of humans.</p> <p>However, many terrorists are not known to any database and can move around populated spaces largely unnoticed. With the advancement of AI, these surveillance systems can now monitor patterns of irregular behaviour, such as someone leaving a bag unattended for a long period of time or returning to a site regularly to take photographs. This information can then be used as the basis on which to perform actions, e.g. to notify officers to conduct a stop search or to record the footage. 
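The watchlist-screening workflow described above – a similarity score against a database, a threshold-triggered alarm, then human verification – can be sketched in outline. This is an illustrative sketch only: real FRT systems derive face embeddings from trained deep networks, whereas the embeddings, watchlist entries and threshold below are invented for demonstration.

```python
import math

# Illustrative sketch only: real FRT systems derive face embeddings from
# deep networks; the embeddings, threshold and watchlist here are invented.

def cosine_similarity(a, b):
    """Similarity score between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def screen_face(captured, watchlist, threshold=0.8):
    """Return (entry_id, score) for the best watchlist match at or above
    the alert threshold, else None. An alert only flags similarity for
    human review - it does not establish identity."""
    best_id, best_score = None, -1.0
    for entry_id, embedding in watchlist.items():
        score = cosine_similarity(captured, embedding)
        if score > best_score:
            best_id, best_score = entry_id, score
    if best_score >= threshold:
        return best_id, best_score   # alarm: operator performs a visual check
    return None                      # no match: nothing is flagged

# Hypothetical four-dimensional embeddings, for illustration only
watchlist = {
    "subject-001": [0.9, 0.1, 0.3, 0.2],
    "subject-002": [0.1, 0.8, 0.5, 0.1],
}
alert = screen_face([0.88, 0.12, 0.29, 0.21], watchlist)
```

In a real deployment the threshold trades false alarms against missed matches, and the privacy safeguards discussed in this article (what is retained, who can view an alert) sit around exactly this decision point.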
Security officers need access to this type of intelligence to secure the perimeter of their facility and ultimately save lives.</p> <p><strong>Privacy remains a top priority</strong><br>It is essential, in this particular case, that privacy remains a top priority when utilising FRT. Privacy advocates and AI sceptics suggest FRT can be hijacked for nefarious purposes, including unlawful surveillance. But what these sceptics may not be aware of are the measures that can be put in place to ensure the privacy of individuals captured by Automatic Facial Recognition (AFR) cameras is protected. These can include anonymising passers-by in the camera’s field of view without obscuring movements, or encrypting original footage so that only authorised users can access sensitive data.&nbsp;</p> <p>Moreover, it is the responsibility of FRT developers to implement internal policies that clearly stipulate they will not partner with end users who do not recognise the importance of privacy and security in the implementation and operation of their systems. It is equally important that these customers are properly prepared, trained and competent to use FRT lawfully.</p> <p>It must also be said that, where FRT is being used under a Directed Surveillance Authority, the UK has one of the strongest and most globally respected regulatory authorities in the Investigatory Powers Commissioner. This provides oversight, restriction of collateral intrusion and greater accountability for surveillance operations of this nature.</p> <p><strong>Accuracy is improving</strong><br>In addition, while face masks have helped reduce the spread of Covid-19, they have also become a significant security threat. Security officials have raised concerns that facial recognition cameras will not be able to identify terrorists because they can blend into crowds and hide their faces with a mask. 
In fact, as recently as July 2020, NIST flagged that even the best of the facial recognition algorithms studied failed to correctly identify a mask-wearing individual as much as 50 per cent of the time.</p> <p>However, these systems no longer require controlled environmental conditions to perform at their best: extreme angles, occlusion and contrasting lighting can now be compensated for. It must be pointed out that, as with most technology, FRT is advancing at a rapid rate. Indeed, as of January 2021, facial recognition algorithms can correctly identify individuals up to 96 per cent of the time, regardless of whether they are wearing protective facial coverings.</p> <p>Moreover, however flawless a technology is when it is designed and produced, it can of course be abused when operated by an oppressive end user. Where inadequately regulated in a democracy, such dysfunction is a short ride away from dystopia. Developers must therefore work closely with end users, such as police departments, to understand the user requirement and the legitimacy of the endeavour. 
They must work collaboratively where necessary to enable and support client compliance with statutory obligations and to build appropriate safeguards where vulnerabilities may arise.</p> <p>Indeed, anyone who develops machines which have an impact upon society carries a responsibility to ensure that they are only used as a force for good and to the benefit of the societies and communities which our technology may help to shape.</p> <p><strong>Closing thought</strong><br>The entire supply chain must welcome the recent declaration made by the European Union to establish a pan-European Data Governance Act and put further measures in place to ensure FRT is deployed and utilised in an ethical way.</p> <p>It is our hope that this legislation will provide much-needed statutory leadership in the establishment of clear rules and guidance by which technologies such as FRT can be more confidently designed, produced and operated in a manner which maintains trust in safer societies. This will allow overstretched and under-resourced law enforcement agencies to fight terrorism and other serious crime without one hand tied behind their backs.</p> <p><em><strong>Written by Tony Porter, former Surveillance Camera Commissioner and Chief Privacy Officer at Corsight AI – a leading Facial Recognition Solutions provider with unparalleled speed, accuracy and privacy protection.</strong></em></p> <p><em><strong>Tony works across Corsight’s senior team to assist in further developing FRT to achieve best practice and legal compliance, and to be best in class.</strong></em></p> <div class="field-item even"><a href="http://www.corsight.ai" target="_blank" title="nofollow">www.corsight.ai</a></div> Sun, 01 Aug 2021 20:12:20 +0000 Michael Lyons 15463 at /features/facial-recognition-facing-terrorism#comments