From: Fast Company

Apple Is Studying Mood Detection Using iPhone Data. Critics Say the Tech Is Flawed

New information about an ongoing study by UCLA and Apple shows that the iPhone maker is using facial recognition, speech patterns, and an array of other passive behavioral tracking to detect depression. The report, by Rolfe Winkler of The Wall Street Journal, raises concerns about the company’s foray into a field of computing called emotion AI, which some scientists say rests on faulty assumptions.

Apple’s depression study was first announced in August 2020. Earlier information suggested the company was using only certain health data points, such as heart rate, sleep, and how a person interacts with their phone, to understand their mental health. But according to The Wall Street Journal, researchers will monitor people’s vital signs, movements, speech, sleep, and typing habits—even the frequency of typos—in an effort to detect stress, depression, and anxiety. Data will come from both the Apple Watch and the iPhone, using the latter’s camera and microphone. Data obtained through Apple’s devices will be compared against mental health questionnaires and cortisol-level data (ostensibly retrieved from participants’ hair follicles).

Read the whole story: Fast Company
