My iPhone 14 Pro is the first device I’ve owned that uses facial recognition, both for logging in to websites and for unlocking the phone itself. I’ve made some observations about the phone’s ability to know that I’m me when I’m doing anything other than staring at it:
Facing the phone but with side-eye, as if your cubicle neighbor were farting: WORKS
Facing the phone with eyes closed gently, as if trying to will away a fart of your own: WORKS
Facing the phone but with eyes clenched shut, as if trying to suppress something more than a fart: WORKS
Looking up, as if a bat were flitting about your living room and you were attempting to assess possible rabidity: DOESN’T WORK
Picking your nose (1st knuckle): WORKS
Picking your nose (2nd knuckle): DOESN’T WORK
Wearing a hooded sweatshirt, a balaclava, and large sunglasses (the kind that make you look like those close-up pictures of houseflies): WORKS
Wearing novelty glasses with the eyeballs on springs: WORKS
Wearing a real or fake eyepatch: WORKS
Wearing a facemask for disease-prevention reasons: WORKS (if it’s powder blue, but not if it’s covered in sequins arranged like the Lithuanian flag)
Attempting to lick something sticky from the screen: DOESN’T WORK
Your results may vary.