
Friday, February 24, 2023

German Constitutional Court Strikes Down Predictive Algorithms for Policing

Noting especially the focus on algorithms. Is the ruling meant to control that level of specification? Specifications of prediction?

German Constitutional Court Strikes Down Predictive Algorithms for Policing

By Euractiv, February 17, 2023

Surveillance cameras at a German police station. 

In its ruling, the German Federal Constitutional Court struck down acts providing a statutory basis for police to process stored personal data through automated data analysis, in the case of Hesse, or automated data interpretation, in Hamburg.

The German Federal Constitutional Court declared the use of Palantir surveillance software by police in Hesse and Hamburg unconstitutional in a landmark ruling.

The ruling concludes a case brought by the German Society for Civil Rights (GFF) last year, hearings for which began in December. The plaintiffs argued that the software could be used for predictive policing, raising the risk of mistakes and discrimination by law enforcement. 

The German state of Hesse has been using the software since 2017, though it is not yet in place in Hamburg. The technology is provided by Palantir, a US data analytics firm which received early backing from intelligence agencies, including the CIA, FBI and NSA. 

The case was brought on behalf of 11 plaintiffs and rested on the argument that the software programme – named 'Hessendata' – facilitates predictive policing by using data to create profiles of suspects before any crime has been committed.

From Euractiv

View Full Article   

Saturday, February 04, 2023

Interpol Policing the Metaverse

Never seen this before: Interpol involved in this kind of activity. Has it been common? Note the mention that they are creating their own VR space.

Interpol working out how to police the metaverse, in the BBC, by Marc Cieslak & Tom Gerken

Interpol secretary general Jurgen Stock says the global police agency is investigating how the organisation could police crime in the metaverse.

The metaverse is the widely-discussed, but not yet realised, concept that in the future people will be represented by 3D avatars in their online lives.

Interpol has built its own virtual reality (VR) space, where users can do training and attend virtual meetings.

Mr Stock said it is important for the agency to not get left behind.

"Criminals are sophisticated and professional in very quickly adapting to any new technological tool that is available to commit crime," he said., "We need to sufficiently respond to that. Sometimes lawmakers, police, and our societies are running a little bit behind., "We have seen if we are doing it too late, it already impacts trust in the tools we are using, and therefore the metaverse. In similar platforms that already exist, criminals are using it."

The environment, which can only be accessed through secure servers, enables police officers to experience what the metaverse could be, giving them a sense of the crimes that could occur, and how they could be policed.... ' 

Wednesday, December 28, 2022

Will a Spot Robot Make LA Safer?

No, and it won't even get the chance to.  See SF's attempt to do something similar. 

See Spot Spy? New Generation of Police Robots Faces Backlash

Los Angeles Times, Libor Jany; Gregory Yee, December 21, 2022

Critics are eyeing the deployment of more sophisticated robots by U.S. police forces with suspicion, fearing their use for surveillance and potential threat to privacy and safety. For example, the Los Angeles Police Department intends to purchase a Spot robot from manufacturer Boston Dynamics to reportedly gather information in a "narrow set" of hazardous circumstances. This provoked opposition, as did recent attempts by San Francisco police to deploy weaponized robots in certain scenarios. The University of South Carolina's Geoff Alpert said the core issue is not whether police should use robots, but how police overseers should craft policies to guide their use. The University of California, Davis' Elizabeth Joh said local officials' "piecemeal efforts" to rein in police use of robots have mostly failed to keep pace with the technology's evolution, citing "increasing reliance by police on machine-made decisions."

Full Article  

Sunday, December 11, 2022

San Francisco Now Says No to Armed Robots

Mentioned the earlier decision here a week ago, and predicted it would NOT last in SF's policing environment.  It has now quickly been reversed.  

San Francisco decides killer police robots aren’t such a great idea

Explosive robots were approved for the SFPD arsenal, then the protests started.

By Ron Amadeo, 12/7/2022, 1:26 PM, in Ars Technica

The robot police dystopia will have to wait. Last week the San Francisco Board of Supervisors voted to authorize the San Francisco Police Department to add lethal robots to its arsenal. The plan wasn't yet "robots with guns" (though some police bomb disposal robots fire shotgun shells already, and some are also used by the military as gun platforms) but to arm the bomb disposal robots with bombs, allowing them to drive up to suspects and detonate. Once the public got wind of this, the protests started, and after an 8–3 vote authorizing the robots last week, now the SF Board of Supervisors has unanimously voted to (at least temporarily) ban lethal robots.  ... ' 

Thursday, December 01, 2022

San Francisco to allow police 'killer robots'

Depending on the definition, I doubt this.

San Francisco to allow police 'killer robots'   in the BBC

By Ben Derico & James Clayton,  BBC News, San Francisco

San Francisco's ruling Board of Supervisors has voted to let the city's police use robots that can kill.

The measure permits police to deploy robots equipped with explosives in extreme circumstances. Dr Catherine Connolly, from the group Stop Killer Robots, said the move was a "slippery slope" that could distance humans from killing. The city's police - the SFPD - told the BBC they do not currently operate any robots equipped with lethal force. They said, though, that there may be future scenarios in which lethal force could be used by a robot.

A spokesperson for the police said "robots could potentially be equipped with explosive charges to breach fortified structures containing violent, armed, or dangerous subjects". They also said robots could be used to "incapacitate, or disorient violent, armed, or dangerous suspects who pose a risk of loss of life".

Advocates for the measure said it would only be used in extreme situations. Opponents, however, say the authority could lead to further militarisation of the police force.

The measure passed, with an amendment on Tuesday specifying that officers could only use robots wielding deadly force after employing alternative de-escalation tactics. The board also stipulated that only a limited number of high-ranking officers could authorise its use. This type of lethal robot is already in use in other parts of the United States.

In 2016, police in Dallas, Texas, used a robot armed with C-4 explosive to kill a sniper who had killed two officers and injured several more. The SFPD said the department does not currently own any robots outfitted with lethal force, but said the measure might be needed in the future.  "No policy can anticipate every conceivable situation or exceptional circumstance which officers may face. The SFPD must be prepared, and have the ability, to respond proportionally," a spokesperson said.

The federal government has long dispensed military grade equipment, camouflage uniforms, bayonets and armoured vehicles to help local law enforcement.  But a California state law passed this year now requires city police forces to inventory military-grade equipment and seek approval for their use. Dr Catherine Connolly, from the campaign group Stop Killer Robots, said the move could "make humans more and more distant from the use of force and the consequences of the use of force". She also said the measure could make it "easier to make decisions to use lethal force in the first place".

Sunday, March 14, 2021

More Singapore Law Boosting Sensors

It's inevitable that policing will be advanced by cameras, sensors, and automation.

Singapore Eyes More Cameras, Technology to Boost Law Enforcement  By ZDNet

Singapore intends to utilize more cameras and technology to support law enforcement and first responders, by harnessing sensors, video analytics, artificial intelligence (AI), automation, and drones.

Singaporean police already have deployed nearly 90,000 cameras in public spaces, and Minister for Home Affairs and Law K. Shanmugam said neighborhood police centers and police posts have been repurposed with automated self-help kiosks, so citizens can access police services round the clock.

Efforts also are underway to build smart fire stations that use sensors and automation to enable operational response, decision-making, and manpower management.

An AI-driven system would transmit data in emergencies to officers before their arrival at the location.

Shanmugam also cited the use of drones and robots to effect security at Covid-19 isolation facilities, to reduce exposure risk for frontline officers. ... " 

Wednesday, February 24, 2021

Spot Dogs at Work for NYPD

See that the New York City Police Department has again utilized one of the Boston Dynamics 'Spot' dogs, apparently for a situation with potential human danger involved. The impressive look of 'dog-like' droids is coming to life. Is this the future of policing? Some of the inhabitants seem unsure.

The NYPD deploys a robot dog again   

Boston Dynamics’ little robot makes another appearance in New York City   By Bijan Stephen in TheVerge

The cyberpunk dystopia is here! (If you weren’t aware: I’m sorry. You’re living in a cyberpunk dystopia.) The latest sign — aside from corporations controlling many aspects of everyday life, massive widespread wealth inequality, and the recent prominence of bisexual lighting — comes in the form of robot dogs deployed to do jobs human police used to. Yesterday, as the New York Post reports, the NYPD deployed Boston Dynamics’ robot “dog” Spot to a home invasion crime scene in the Bronx.    ... " 

Wednesday, December 09, 2020

Amazon Wants Smart Neighborhoods

Amazon has received much hammering of late on its smart home capabilities: giving openings to hackers, listening in on private conversations, and, through its neighborhoods, overpolicing the streets. Now they are giving indications that they still want to move forward to make people safer. I will be involved. Why not?

Amazon Sidewalk will create entire smart neighborhoods. Here's what you should know

Launching soon via Echo speakers and all sorts of other devices, Amazon's low-bandwidth IoT network lets your smart home stretch beyond Wi-Fi range.

Ry Crist in CNet


Sunday, March 15, 2020

Airport Facial Recognition

More examples of regulating facial recognition in new contexts.

ACLU is suing the US gov for blocking airport facial recognition probe
By Ryan Daws, Editor at TechForge Media

The American Civil Liberties Union (ACLU) is suing the US government for blocking a probe into the use of facial recognition in airports.

Homeland Security announced in December that it was ditching plans to scan the faces of every person arriving at airports in favour of scanning only those who are not US citizens or permanent residents. ... "

Sunday, March 08, 2020

Geofencing Warrant Policing

Had not heard the term 'geofence' in law before; passed this along to lawyers I know to get their opinion on current uses of the idea.

Google location data led police to investigate an innocent cyclist
The incident shows the very real problems with geofence warrants.
Jon Fingas, @jonfingas   in  Engadget

Thursday, March 05, 2020

Adapting Policing from Your Doorbell

Continue to follow this, as a user and a believer in using technical collaboration to make the world safer. Some interesting adaptations to construct community-watch capabilities via Ring.

Ring update gives you more control over police video requests
It also helps you limit the people and devices that access your account.

By Jon Fingas, @jonfingas

Ring is acting on its promises to improve privacy and security in 2020. The Amazon brand has introduced a Control Center in the Ring mobile app that aims to deliver more control over access and sharing. Most notably, there's a toggle to opt out of law enforcement video requests -- you don't have to wait to receive one before making a decision. Ring is unsurprisingly encouraging customers to leave it on (it has police partnerships to maintain) in the name of neighborhood security, but it's at least acknowledging that some users are uncomfortable with serving as de facto eyes for police officers. ... "

Thursday, February 27, 2020

Next Level of Police Enabling Technology

An obvious next step. I see this being done with our local police nearby; will look into it and see if I can get a demonstration.

Axon Rolls Out the Next Level of Police Technology: Live-Streaming Body Cameras   The Washington Post     By Tom Jackman

Police body-camera supplier Axon has deployed live-streaming cameras to the Cincinnati Police Department, allowing officers to show dispatchers or commanders crises as they unfold in real time, and helping rescuers find officers in trouble. The system automatically turns the camera on when a gun is drawn, emergency lights are activated, or a Taser is powered up. While the cameras will not include facial recognition software, they will have face-detection capabilities so police can quickly find video segments with people and react faster when footage is required for wider dissemination. Said Barry Friedman, a New York University law professor and founder of the Policing Project, “Body cameras go into sensitive places. With streaming, it won’t just be the officer, but somebody else. There have to be serious limits as to whom the video is streamed.”....'
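The auto-activation behavior described is essentially a trigger rule over equipment events. A minimal sketch in Python; the event names here are illustrative placeholders, not Axon's actual interface:

```python
# Events that start the live stream, per the described behavior:
# a drawn gun, activated emergency lights, or a powered-up Taser.
# These names are assumptions for illustration only.
TRIGGERS = {"gun_drawn", "emergency_lights_on", "taser_armed"}

def should_start_stream(events):
    """Return True if any current event is a streaming trigger."""
    return bool(TRIGGERS & set(events))
```

The set-intersection form makes it easy for a department to add or remove triggers as policy changes.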

Monday, January 27, 2020

Met Police to Use Live Facial Recognition Cameras in London

Not surprising at all to me, given the broad use of cameras already in the UK.   And despite all the complaints regarding privacy, this will be the future of such systems, tied to live facial recognition.

Met Police to Use Live Facial Recognition Cameras in London
The Guardian
Vikram Dodd

London's Metropolitan Police next month will deploy computer-linked live facial recognition cameras on city streets, despite outcries from civil liberties advocates and experts' doubts of the technology's efficiency. Essex University's Pete Fussey estimated the cameras are verifiably accurate in identifying wanted suspects in only 19% of cases, versus the Met's claim of 70% effectiveness. The Met said the cameras would be connected to a suspect database, and if someone is detected who is not in the database, that person's information will be deleted. However, if the system flags someone who is wanted, an officer will speak to that individual. The Met promised London Mayor Sadiq Khan the system will not be connected to other official databases, or used by authorities to monitor all of London or track someone down.  ... '
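The match-then-delete flow the Met describes can be sketched as a single decision function. Everything below (the embedding representation, `match_fn`, the threshold value) is an assumption for illustration, not the Met's actual system:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    embedding: tuple   # face embedding from the camera (illustrative)
    timestamp: float

def process_detection(det, watchlist, match_fn, threshold=0.8):
    """Sketch of the described flow: compare a detection against the
    suspect watchlist. On a match an officer is alerted to speak to
    the individual; non-matches have their data deleted."""
    best = max((match_fn(det.embedding, w) for w in watchlist), default=0.0)
    if best >= threshold:
        return "alert_officer"
    return "delete_record"
```

The key property, per the Met's claim, is that the "delete_record" branch is taken immediately for anyone not in the database.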

Monday, December 02, 2019

China Requiring Facial Scans

More use of facial data gathering. While it could be used for convenience and general identification, say for voting or providing services, it can then also be used for policing. Then its less-than-perfect accuracy comes into play. Again, I think it's inevitable that these capabilities will expand.

China now requires face scans for all new mobile phone accounts
Reuters in VentureBeat

Above: People walk past a poster simulating facial recognition software at the Security China 2018 exhibition on public safety and security in Beijing, China October 24, 2018.

Image Credit: REUTERS/Thomas Peter
(Reuters) — China on Sunday put into effect new regulations that require Chinese telecom carriers to scan the faces of users registering new mobile phone services, a move the government says is aimed at cracking down on fraud.

The rules, first announced in September, mean millions more people will come under the purview of facial recognition technology in China.  ... "

Sunday, December 01, 2019

Drones and Facial Recognition in the UK

Another example that will likely be criticized as being easy to mis-use.

Police to Use Facial Recognition Drones to Help Find the Missing
BBC News (U.K.)
By Ken Macdonald

Police Scotland has launched a new aerial drone system to help search for missing people. The remotely piloted aircraft system (RPAS) employs advanced cameras and neural networks to identify individuals it is seeking, from up to 150 meters (492 feet) away. RPAS incorporates facial recognition software that learns as it goes and is able to differentiate between persons, animals, or vehicles, from a handful of pixels in a moving color image. Two policemen are required to operate the drone; one to fly it, the other to operate the facial recognition software. Police Scotland developed the system with colleagues from the multinational Thales Group and the University of the West of Scotland.  .... "

Friday, November 22, 2019

Locating Shooters Using Smartphone Videos

Seems quite a move forward, basically triangulation from smartphone microphones.  Examples and maps at the link.  Has been released as open source code for testing.

Carnegie Mellon System Locates Shooters Using Smartphone Videos
Carnegie Mellon University  By Byron Spice

Researchers at Carnegie Mellon University (CMU) have developed a system that can accurately locate a shooter based on video recordings from as few as three smartphones. The researchers tested the Video Event Reconstruction and Analysis (VERA) system using three video recordings from the 2017 mass shooting in Las Vegas that left 58 people dead and hundreds wounded; it correctly estimated the shooter's actual location. The system uses machine learning to match up video feeds and calculate the position of each camera based on what it is seeing; it also tracks the time delay between the sound of a shock wave caused by a bullet’s passage through the air, and the sound of the muzzle blast from the gun. Using video from three or more smartphones allows the direction from which the shots were fired to be triangulated. Said CMU’s Alexander Hauptmann, "When we began, we didn't think you could detect the crack with a smartphone because it's really short, but it turns out today's cellphone microphones are pretty good."  ... " 
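The acoustics described - the gap between a bullet's shock-wave crack and the muzzle blast, combined with multilateration across several phones - can be sketched with a toy model. The straight-line geometry and the muzzle velocity below are simplifying assumptions for illustration, not VERA's actual method:

```python
import numpy as np

C_SOUND = 343.0    # speed of sound, m/s
V_BULLET = 900.0   # assumed muzzle velocity, m/s (illustrative)

def range_from_gap(dt):
    """Toy model: if the bullet flies straight at the microphone,
    the crack arrives at d/V_BULLET and the blast at d/C_SOUND,
    so dt = d/C - d/V  =>  d = dt * C * V / (V - C)."""
    return dt * C_SOUND * V_BULLET / (V_BULLET - C_SOUND)

def trilaterate(mics, ranges):
    """Linearized least-squares position fix from >= 3 microphone
    (x, y) positions and their estimated ranges to the shooter."""
    (x0, y0), r0 = mics[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(mics[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol
```

With three or more phones at known positions, the ranges intersect at (roughly) one point, which is the triangulation the article describes.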

Monday, June 17, 2019

Police Ask for Registration of Doorbell Cameras

Been intrigued by components of the smart home that can collaborate with a neighborhood for security. Been following the Ring doorbell capabilities in this space for several years, with an aim to create a 'smart neighborhood' by sharing video among users of the device. In our area there are video contributions every few days. Video clips are stored, but no recognition analysis is done. This USA Today article shows how police departments are asking for registration of cameras to aid them as well. The statistics mentioned are interesting.

Monday, June 18, 2018

Drones Detecting Physical Behavior

Another case of complex pattern recognition, in live video. Here the drone classifies a set of human behaviors and then labels them as a 'brawl' - in English, a violent, multi-person fight, with or without weapons. This could then be integrated further into a model of a crowd, and used to identify individuals. No indication that the drone would do anything other than alert the authorities. Will see how accurate this is, and how its integration into police decision-making is envisioned.

AI Drone Learns to Detect Brawls      in IEEE Spectrum  by Jeremy Hsu

Researchers at the University of Cambridge in the U.K., working with colleagues at the Indian Institute of Science, Bangalore and India's National Institute of Technology, Warangal, have used deep learning to develop a drone surveillance system that automatically detects small groups of people fighting each other. The system uses computer vision software that runs in real time to detect violent individuals, says the University of Cambridge's Amarjot Singh. The researchers trained deep learning algorithms to recognize violent actions by identifying body and limb poses in staged video footage of interns mimicking violence. Singh replaced some of the neural network layers at the front-end with fixed parameters, and used supervised learning toward the back-end, exchanging some of the deep learning process with human engineering input. This allowed the resulting ScatterNet Hybrid Deep Learning (SHDL) network to learn more quickly with less data and less available computing power. The researchers are securing permission from Indian officials to test the system at two upcoming music festivals, and Singh is working to incorporate crowd modeling into the deep learning models.  ... "
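As a toy illustration of the pose-based idea - not the SHDL network itself - a hand-written heuristic over assumed 2D limb keypoints might flag a fully extended, raised arm as punch-like:

```python
import math

def limb_angle(a, b, c):
    """Angle in degrees at joint b formed by points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def looks_like_punch(pose):
    """Toy heuristic standing in for the learned classifier: an arm
    extended nearly straight (elbow angle > 150 deg) and raised to
    shoulder height or above. `pose` maps joint names to (x, y),
    with y increasing downward as in image coordinates."""
    elbow = limb_angle(pose["shoulder"], pose["elbow"], pose["wrist"])
    raised = pose["wrist"][1] <= pose["shoulder"][1]
    return elbow > 150 and raised
```

The actual system learns such pose-to-label mappings from annotated video rather than using fixed rules, which is what lets it generalize across viewpoints and body shapes.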