My New iPhone Is Making Me Look Uglier

The selfie camera has gotten too good.

Illustration: Ben Kothe / The Atlantic. Source: Getty.

This past spring, I participated in the sacred tradition that comes around once every few years: I got a new iPhone. The speaker on my old one had broken, forcing my hand. But let’s be clear. I didn’t care about the speaker. The real reason you upgrade an iPhone, of course, is to get a better camera.

Within a couple of weeks of unboxing my new iPhone 14 Pro, however, I noticed something odd happening. I’d take a selfie, think I looked great, and lock my phone, satisfied. Later, I’d open my camera roll to find that the same photo was different than I remembered. My skin no longer looked smooth, the way it had on my old phone, and even in the preview on my new one before I snapped the photo. Instead, every selfie seemed to intensify my imperfections. I could see the budding wrinkles on my 30-something forehead and the faint red glow of the eczema patches around my eyes. Startled, I began questioning my appearance. Then I began questioning my device.

Other new iPhone owners have done the same: “I’ve noticed that my skin looks awful on this new camera,” read one post on Reddit. A commenter complained that the iPhone 14 “turns you into [an] ugly panda with dark circles.” A woman on TikTok posted a plea, asking that someone from the Apple “community” please tell her “how to fix this raggedy colorless front camera.” Another called it a “travesty.” Hundreds of posts and comments across the internet complain about the selfie camera, and debate exactly what could be causing its problems.

The iPhone selfie camera is now so good that it is perhaps too good. On social media, people slather themselves in beauty filters; remote workers go through entire Zoom meetings forgetting that their and others’ skin might be blurred and brightened by the software. You can upload your face to a generative-AI tool and, in seconds, get a dozen glossy professional headshots of yourself, wearing clothes you don’t even own. The new Apple camera, by contrast, offers a cold dose of reality: You have blackheads! And acne! And frown lines!

In recent years, complaints about the selfie camera seem to pop up whenever people upgrade their iPhones. The launch of the new iPhone 15 this fall set off another round of whining. A few models in particular—the 13, 14, and 15—dominate internet grumbling about how selfies now look too detailed (and worse, in the eyes of would-be posters). Another recurring theme is that selfies look better in the preview, before the person presses the shutter.

All three of these iPhones have a 12-megapixel front-facing camera, compared with the 7-megapixel camera on my old phone. But the reason that selfies are now so detailed isn’t the megapixel count. (The iPhone 12 also has a 12-megapixel selfie camera, but I haven’t seen many complaints about it.) Apple didn’t comment on what, if anything, might have changed beginning with the iPhone 13, but noted that the device has gotten more advanced at processing images after they are taken. An iPhone 14 and above can perform 4 trillion operations per photo to enhance the details and render a more natural skin tone, and not all of these changes are previewed in the Camera app before you press the shutter. The goal is to make your final photos as accurate as possible, Apple said.

Neither the old iPhone selfies nor the new ones are necessarily more accurate. “A photograph taken on a consumer device isn’t a true record, necessarily, of what someone looks like in the real world,” Emily Cooper, a professor of optometry at UC Berkeley who has studied selfies, told me. Think about a hotel that offers a small magnifying mirror in the bathroom. The face in the magnified mirror is no less real than the one staring back at you in the regular one. Some people on social media have suggested that the way Apple processes its photos “oversharpens” them, emphasizing detail in an unnatural way.

A camera is fundamentally a tool for documenting the world, but it is also pretty subjective. And what makes a photograph “good” depends on what you want to do with it. If you’re taking a photo of your eyelid eczema to send to your doctor, you probably want an extreme level of detail. If you’re taking a selfie in front of the Eiffel Tower to send to your boyfriend, you probably don’t want every blemish on your skin in high-def. Apple’s software is post-processing selfies en masse, but “there’s no one universal algorithm that will make every picture better for the purpose it’s intended for,” Cooper said.

It’s hard to build a camera that’s just right. Five years ago, the iPhone presented the opposite problem. In 2018, Apple’s newly launched XR and XS models took photos that made people look suspiciously good. The phones were accused of artificially smoothing skin, in what came to be known as “beautygate.” Apple later said that a software bug was behind these unusually hot photos, and shipped a fix. “Do you want a nicer photo or a more accurate representation of reality?” Nilay Patel, the editor in chief of The Verge, wrote in his review of the XR. “Only you can look into your heart and decide.”

The answer to Patel’s question seems to be that people want something in the middle—not too hot, but not too real either. People are chasing a Goldilocks ideal with the selfie camera: They want it to be real, authentic, and messy, just not too real, authentic, or messy.

“When someone thinks of a perfect selfie, they don’t think of having no pores,” Maria-Carolina Cambre, an education professor at Concordia University in Montreal, told me. “And they don’t think of having every single pore visible. It’s neither one of those extremes.” For years, Cambre and a colleague ran selfie focus groups in Canada, discussing the style of photography with more than 100 young people. They found that people examine selfies in a very specific way, which they termed the “digital-forensic gaze.” People inspect such images closely, pinching in to look for details and for evidence of any filtering. They look for flaws and inconsistencies. “This is the paradox,” she told me. “Everything is optimized, but the best selfies look like they haven’t been optimized. Even though they have.”

Every smartphone tackles this selfie challenge in a slightly different way. But because devices mediate so much of our self-perception at this point, switching them out can knock us off balance. I spend far more time curled up on the couch, scrolling through my phone’s photo albums, than I do pondering my reflection in the mirror. Perhaps my old iPhone, with its meager front-facing camera, had for years misled me about what I actually look like. Do people see me more like the smoother selfies on my old iPhone, or the more high-def ones on my new phone?

Cooper, the optometry professor, suggested I send screenshots of myself to people who know me, and ask them. Basically everyone confidently said that the more detailed version of my photo was more accurate. But there was one exception: my mom. She thought the softer, prettier version was more true to me. Thanks, Mom.

Caroline Mimbs Nyce is a staff writer at The Atlantic.