
Sunday, February 03, 2019

IBM Facial Recognition Dataset Tests Bias

An increasing number of 'test' datasets now target key aspects of machine learning. If generally accepted, they offer a way to certify algorithms and their associated processes.

IBM Facial Recognition Dataset Aims to Remove Gender, Skin Bias 

Computer Business Review    By Conor Reynolds

IBM will provide a new million-image dataset to the global research community, to help developers eliminate gender and skin-type biases from facial-recognition software. The Diversity in Faces dataset uses publicly available images from the YFCC-100M Creative Commons dataset, annotated using 10 facial coding schemes, as well as human-labeled gender and age notes. The coding schemes include facial symmetry, facial contrast, pose, and craniofacial (bone structure) areas, in conjunction with conventional age, gender, and skin-tone schemes. IBM's John R. Smith said, "The [artificial intelligence] systems learn what they're taught, and if they are not taught with robust and diverse datasets, accuracy and fairness could be at risk." The researchers said the new dataset is designed so facial recognition "performance should not vary for different individuals or different populations."
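The kind of bias test such a dataset enables can be sketched very simply: score a model's predictions separately for each annotated demographic group and compare the accuracies. The function and data below are hypothetical, not part of IBM's dataset or tooling; a minimal sketch assuming each record carries a group label, a prediction, and a ground-truth value.

```python
# Hypothetical sketch of a per-group accuracy check, the basic idea behind
# testing a classifier for demographic bias with an annotated dataset.

def accuracy_by_group(records):
    """records: iterable of (group, predicted, actual) tuples.
    Returns a dict mapping each group to its prediction accuracy."""
    totals, correct = {}, {}
    for group, pred, actual in records:
        totals[group] = totals.get(group, 0) + 1
        if pred == actual:
            correct[group] = correct.get(group, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

# Toy, made-up predictions for two annotated groups.
records = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 0), ("group_a", 1, 1),
    ("group_b", 0, 1), ("group_b", 0, 0), ("group_b", 1, 0), ("group_b", 0, 0),
]
acc = accuracy_by_group(records)
gap = max(acc.values()) - min(acc.values())  # a large gap signals bias
```

In practice the groups would come from the dataset's gender, age, and skin-tone annotations, and the comparison would use the vendor's actual recognition scores rather than toy labels.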
