San Francisco is first US city to ban facial recognition

Image caption: A poster simulating facial recognition software at the Security China 2018 exhibition in Beijing, 24 October 2018. High-definition cameras "map" faces in a crowd and compare them to existing images.

Legislators in San Francisco have voted to ban the use of facial recognition, the first US city to do so.

Local agencies, such as the city's transport authority and law enforcement, will not be allowed to use the emerging technology.

Additionally, any plans to buy any kind of new surveillance technology must now be approved by city administrators.

Opponents of the measure said it would put people's safety at risk and hinder efforts to fight crime.

Those in favour of the move said the technology as it exists today is unreliable and represents an unnecessary infringement on people's privacy and liberty.

In particular, supporters of the ban argued that the systems are error prone, particularly when dealing with women or people with darker skin.

"With this vote, San Francisco has declared that face surveillance technology is incompatible with a healthy democracy and that residents deserve a voice in decisions about high-tech surveillance," said Matt Cagle from the American Civil Liberties Union in Northern California.

"We applaud the city for listening to the community, and leading the way forward with this crucial legislation. Other cities should take note and set up similar safeguards to protect people's safety and civil rights."


++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Image caption: Mr Bah claims Apple's facial recognition technology has mistakenly identified him as a thief.

A student is suing Apple Inc for $1bn (£0.77bn), claiming that its in-store AI led to his mistaken arrest.

Ousmane Bah, 18, said he was accused of stealing from Apple Stores in four US states, and arrested at his home in New York last autumn.

He believes Apple's algorithms linked video footage of the thief with his name, leading to the charges.

Apple has told the BBC that it does not use facial recognition technology in its stores.

Mr Bah claims that a detective reviewed security footage from the time of one of the crimes and found the thief looked "nothing like" him.

Mr Bah had previously lost his provisional driving licence, which he believes may have been used by the thief during the robberies. The licence is not meant to be used for identification purposes, and does not include a photograph.

Mr Bah believes that Apple's algorithms are now trained to connect his name to images of the thief.

A detective with the New York Police Department allegedly told Mr Bah that the thief probably used Mr Bah's driving licence as identification during one of the robberies. The detective reportedly said that this may have caused Mr Bah to be charged with thefts committed at Apple Stores in New York, Delaware, New Jersey and Massachusetts, according to court papers.

Mr Bah said one of the charges was for the theft of Apple Pencils from a store in Boston - a city he had never visited. On the date of the robbery, he says he was attending his senior prom in New York.

Mr Bah claims that travelling to different states to respond to charges filed against him has affected his college attendance, and his grades have suffered as a result.

Apple's Face ID technology caused a stir when it was launched on the iPhone X in 2017, with commentators concerned that users' biometric data could be hacked if they used the feature. As far as is known, this is the first case against Apple that claims its facial recognition technology has been used to identify customers who have visited its stores.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Amazon: Facial recognition bias claims are 'misleading'

Image caption: The study author, Joy Buolamwini, has spent years researching bias in algorithms and machine learning.

Amazon has defended its facial-recognition tool, Rekognition, against claims of racial and gender bias, following a study published by the Massachusetts Institute of Technology.

The researchers compared tools from five companies, including Microsoft and IBM.

While none of the tools was 100% accurate, the study found that Amazon's Rekognition performed the worst when it came to recognising women with darker skin.

Amazon said the study was "misleading".

The study found that Amazon had an error rate of 31% when identifying the gender of images of women with dark skin.

This compared with a 22.5% rate from Kairos, which offers a rival commercial product, and a 17% rate from IBM.

By contrast, Amazon, Microsoft and Kairos all successfully identified images of light-skinned men 100% of the time.


The tools work by assigning each prediction a probability score that indicates how confident the system is that its assessment is correct.

Facial-recognition tools are trained on huge datasets of hundreds of thousands of images.

But there is concern that many of these datasets are not sufficiently diverse to enable the algorithms to learn to correctly identify non-white faces.
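To make figures like the study's 31% concrete, the sketch below shows one way a per-group error rate could be computed, assuming a labelled benchmark in which each image carries a known gender and demographic group, and a classifier that returns a predicted gender with a confidence score. The classify_gender function, the sample format and the group labels are illustrative placeholders, not any vendor's API or the study's actual code.

from collections import defaultdict

def error_rates_by_group(samples, classify_gender):
    """Compute the gender-classification error rate for each demographic group.

    `samples` is an iterable of (image, true_gender, group) tuples and
    `classify_gender` is any function returning (predicted_gender, confidence).
    Both are placeholders standing in for a real benchmark and a real classifier.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for image, true_gender, group in samples:
        predicted, confidence = classify_gender(image)
        totals[group] += 1
        if predicted != true_gender:
            errors[group] += 1
    # e.g. 31 wrong answers out of 100 darker-skinned female faces -> 31% error rate
    return {group: errors[group] / totals[group] for group in totals}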

Clients of Rekognition include a company that provides tools for US law enforcement, a genealogy service and a Japanese newspaper, according to the Amazon Web Services website.

Use with caution

In a blog post, Dr Matt Wood, general manager of artificial intelligence at AWS, highlighted several concerns about the study, including that it did not use the latest version of Rekognition.

He said the findings from MIT did not reflect Amazon's own research, which had used 12,000 images of men and women of six different ethnicities.

"Across all ethnicities, we found no significant difference in accuracy with respect to gender classification," he wrote.

He also said the company advised law enforcement to use machine-learning facial-recognition results only when the confidence of the result was listed at 99% or higher, and never as the sole source of identification.
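The 99% figure corresponds to a threshold parameter that the Rekognition API already exposes. The minimal sketch below, using the boto3 SDK for Python, shows how a caller might apply that threshold when comparing two images; the file names are placeholders, and this illustrates the kind of thresholding described above rather than Amazon's own deployment guidance.

import boto3

# Placeholder image bytes - in practice these might come from files or S3.
with open("probe.jpg", "rb") as f:
    probe_bytes = f.read()
with open("reference.jpg", "rb") as f:
    reference_bytes = f.read()

client = boto3.client("rekognition")

# SimilarityThreshold=99 asks Rekognition to return only matches it scores
# at 99% similarity or higher, mirroring the confidence level Amazon says
# it recommends to law-enforcement users.
response = client.compare_faces(
    SourceImage={"Bytes": probe_bytes},
    TargetImage={"Bytes": reference_bytes},
    SimilarityThreshold=99.0,
)

for match in response["FaceMatches"]:
    # Even a high-similarity match should only be a lead for a human to
    # verify, never the sole basis for identification.
    print(f"Possible match, similarity {match['Similarity']:.1f}% - refer for human review")

The same idea applies to Rekognition's search calls against a stored face collection, which accept an analogous match threshold.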

Image caption: Amazon's facial-recognition tool analysing the US star Oprah Winfrey and reporting that she is 76.5% likely to be male.

"Keep in mind that our benchmark is not very challenging. We have profile images of people looking straight into a camera. Real-world conditions are much harder," said MIT researcher Joy Buolamwini in a Medium post responding to Dr Wood's criticisms.
In an earlier YouTube video, published in June 2018, MIT researchers showed various facial-recognition tools, including Amazon's, suggesting that the US TV presenter Oprah Winfrey was probably a man, based on a professional photograph of her.
"The main message is to check all systems that analyse human faces for any kind of bias. If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free," Ms Buolamwini said.

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Face recognition police tools 'staggeringly inaccurate'


Police must address concerns over the use of facial recognition systems or they may face legal action, the UK's privacy watchdog says.

Information Commissioner Elizabeth Denham said the issue had become a "priority" for her office.

An investigation by campaign group Big Brother Watch suggested the technology flagged up a "staggering" number of innocent people as suspects.

But police have defended the technology and say safeguards are in place.

Which police forces are using facial recognition?

Big Brother Watch submitted freedom of information requests to every police force in the UK.

Two police forces acknowledged they were currently testing facial recognition cameras.

The Metropolitan Police used facial recognition at London's Notting Hill carnival in 2016 and 2017 and at a Remembrance Sunday event.

Its system incorrectly flagged 102 people as potential suspects and led to no arrests.

In figures given to Big Brother Watch, South Wales Police said its technology had made 2,685 "matches" between May 2017 and March 2018 - but 2,451 of those, roughly 91%, were false alarms.

Leicestershire Police tested facial recognition in 2015, but is no longer using it at events.

How does it work?

Police facial recognition cameras have been trialled at events such as football matches, festivals and parades.

Live facial recognition cameras scan faces in a crowd and map their features, and the software compares them against a watch list of existing images, such as custody photographs.

Any potential matches are flagged for a police officer to investigate further.
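As a rough illustration of that flag-for-review step, the sketch below compares a face embedding taken from a live camera frame against a watch list and returns anything scoring above a match threshold as a candidate for an officer to check. The embedding format, the threshold value and the function names are assumptions made for illustration, not a description of any force's actual system.

import numpy as np

MATCH_THRESHOLD = 0.6  # illustrative value; real deployments tune this carefully

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_matches(frame_embedding, watch_list):
    """Return watch-list entries scoring above the threshold, best first.

    Anything returned here is only a candidate match - in the deployments
    described above, a police officer reviews each alert before any action
    is taken, and faces that generate no alert are discarded.
    """
    candidates = []
    for person_id, listed_embedding in watch_list.items():
        score = cosine_similarity(frame_embedding, listed_embedding)
        if score >= MATCH_THRESHOLD:
            candidates.append((person_id, score))
    return sorted(candidates, key=lambda item: item[1], reverse=True)

Keeping the threshold decision separate from the similarity calculation makes it straightforward to audit how many alerts a given setting would generate, which is exactly the false-alarm figure debated above.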

How have the police forces responded?

South Wales Police has defended its use of facial recognition software and says the system has improved with time.

"When we first deployed and we were learning how to use it... some of the digital images we used weren't of sufficient quality," said Deputy Chief Constable Richard Lewis. "Because of the poor quality, it was identifying people wrongly. They weren't able to get the detail from the picture."

It said a "number of safeguards" prevented any action being taken against innocent people.

"Firstly, the operator in the van is able to see that the person identified in the picture is clearly not the same person, and it's literally disregarded at that point," said Mr Lewis.

"On a much smaller number of occasions, officers went and spoke to the individual... realised it wasn't them, and offered them the opportunity to come and see the van.

"At no time was anybody arrested wrongly, nobody's liberty was taken away from them."

'Checks and balances'

The Metropolitan Police told the BBC it was testing facial recognition to see whether it could "assist police in identifying known offenders in large events, in order to protect the wider public".


Media caption: In your face - China's all-seeing surveillance system.

"Regarding 'false' positive matches - we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts," it said in a statement.

"All alerts against the watch list are deleted after 30 days. Faces in the video stream that do not generate an alert are deleted immediately."

But Big Brother Watch said it was concerned that facial recognition cameras would affect "individuals' right to a private life and freedom of expression".


It also raised concerns that photos of any "false alarms" were sometimes kept by police for weeks.

"Automated facial recognition technology is currently used by UK police forces without a clear legal basis, oversight or governmental strategy," the group said.

What does Big Brother Watch want?

Big Brother Watch wants police to stop using facial recognition technology. It has also called on the government to make sure that the police do not keep the photos of innocent people.

Information Commissioner Elizabeth Denham said police had to demonstrate that facial recognition was "effective" and that no less intrusive methods were available.

"Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public," said Ms Denham.

The Home Office told the BBC it plans to publish its biometrics strategy in June, and it "continues to support police to respond to changing criminal activity and new demands".

"When trialling facial recognition technologies, forces must show regard to relevant policies, including the Surveillance Camera Code of Practices and the Information Commissioner's guide," it said in a statement.




Comment by mr1stroke on May 15, 2019 at 12:48pm
Just about everyone in NY who has been arrested is doomed; they automatically use the eye technology for facial recognition.
