2026-01-14 06:44:00
It is a truth universally acknowledged, that chimpanzees are scary as hell. Fortunately, we’re decades past the days when we’d dress them up in clothes and have them entertaining children1, but it seems like there are still too many people who say “they’re so like us!” and want to bring them home and teach them sign language. Even though there’s no chimpanzee so adorable that they don’t seem to be just a few seconds away from going nuts and ripping you apart with their bare hands.
That’s exactly the fear that Primate taps into, it’s exactly what it promises in the trailers, and it’s almost exactly what it delivers. The movie asks two questions: “Are you deeply, innately terrified of chimpanzees?” and “Why the hell not?”
If you don’t have that deep-seated fear of Our Closest Relatives, then maybe Primate won’t work for you. I don’t know; I’m not even a little bit Catholic, and I still think The Exorcist is one of the scariest movies I’ve ever seen. All I can say for sure is that seeing a reasonably convincing-looking chimp among a bunch of hapless young people had my heart pounding even before the movie got cooking. And I spent the bulk of the movie reacting exactly like you’d expect someone bad at horror movies to be reacting: jumping, holding my hands over my eyes or various threatened body parts, crossing my legs, trying to curl up into the fetal position in my reclining theater seat.
Primate is very violent and often gory, and it doesn’t shy away from showing the gore the best it can. But the thing I like best about it: it’s not torturous.
The very first scene of the movie has a cameo from a recognizable actor that I didn’t expect to be in it. I won’t spoil who it is, but I will say that just seeing them in the movie helped set the tone for everything that was to follow. The scene has one of Primate’s most gruesome effects, so it primes you to anticipate even more explicit things at any moment. But at the same time, even though it’s not a comical scene, it’s an implicit reminder that it’s just a movie, and you shouldn’t be taking any of it too seriously.2
I appreciated that, since I’d sat down thinking, “hell yeah! Terrifying killer chimpanzee! Let’s do this!” but during the trailers, the opening credits, and all through the movie’s post-cold-open setup, I was getting more and more anxious about what exactly I’d gotten myself into. Was this going to be too intense for me? Was it going to be like Good Boy, where I was all-in on the premise without understanding how easily manipulated I am, especially when animals are involved?
One of the dominant images in the marketing — including the wall-sized standee display at the theater I went to — has Ben the chimp dressed in his clothes, sitting sullenly yet menacingly in a corner, his beloved teddy bear nearby. Had I not sufficiently anticipated how this wasn’t going to be a monster movie so much as Old Yeller?
So I was pleasantly surprised, repeatedly, when I saw how deftly Primate walks the line between giving moments maximum impact, and pumping the brakes so that it doesn’t feel like too much. We see a few still photos of Ben as an adorable baby chimp, and we get a couple of scenes showing him being a playful, rambunctious member of the family, but the movie mercifully never dwells on that aspect of the story for too long.
There are a few moments that emphasize this is an innocent animal who’s sick, and invite you to feel sympathy for him, but the movie quickly moves on from that, letting the idea land before turning it into something else. It’s tricky to keep a movie with this premise moving along and functioning as a horror movie, without concentrating too much on the tragedy of it.
I think it helps that I never bought it 100%. I was never tricked into thinking that Ben was an actual chimpanzee, instead of practical effects or a performer in a suit. I’ve only seen a few clips from the press junket interviews, but I still don’t know whether they used an actual chimp anywhere during the production or not. I’m honestly happier not knowing. Every time I felt that twinge of anxiety, thinking, I’m fully aware that this is a horror movie, and that no animals or humans were actually harmed, but should I feel genuinely miserable right now?, there’d be a shot of the chimpanzee that felt just over-the-top enough to reassure me that it was all just a movie.
And as for the human animals, Primate does a similarly good job of keeping the violence, and the suspense of anticipating future violence, at its peak, without taking it so far that it just becomes miserable and torturous.
It’s never a horror comedy, but it does completely understand how horror movies like this — the “bunch of characters picked off one by one by a ruthless killer” variety — are always just a hair’s breadth away from black comedy. One of the kills involves a character falling off an impossibly steep cliff, and it’s supposed to be sudden and shocking and tragic, but the movie also shows you, in close-up, exactly what happens when the character hits the ground. It got a laugh from me, just for being so ignoble, and for the surprise of I can’t believe they went there.
But the bulk of Primate isn’t really shocking or surprising; it mostly plays out with the beats that you’d expect, to the degree that I wouldn’t have been all that surprised if they’d jump-cut to the Cabin in the Woods control room, and “RABID CHIMPANZEE” was written on the whiteboard.
What makes the movie work isn’t necessarily its shocking deviation from what you’d expect, but how well it executes on its beats. The main exception is that there are a few scenes that take great advantage of the fact that the character of the father is deaf; I’d expected the need for silence and sign language to play an even bigger part in the story, but still, the way that it was used for suspense here was excellent.
And Primate handles its human characters the same way it handles the chimpanzee: staying aware of how the audience probably feels in the moment, keeping a steady hand on the controls, turning up the shock and horror when it has the most impact, then turning it back down or defusing it before it gets too intense. Certain characters have plot armor, a couple of really gruesome injuries seem to be magically healed a few minutes later, and a suggestion that a character is going into shock raises the tension in the moment but is forgotten a few scenes later. And it guesses how much the audience likes or dislikes certain characters, and mostly assigns them correspondingly gruesome kill scenes.
For a long time — watching Cloverfield was the first time I really noticed it — I took it for granted that horror movies like this had to maintain a certain distance between the audience and the characters, or else the whole thing seemed ghoulish. It was asking you to get invested in characters that were doomed to die horribly.
After I saw Final Destination: Bloodlines last year, I started to re-think that assumption. That movie worked so well mostly because it perfectly understood how comedy and horror are similar but not identical, but I think it really landed because it also understood how to get audiences invested in the characters without being too somber or over-thinking their inevitable fate.
Primate kind of splits the difference, constantly turning the Sympathy Dial for each of its characters up and down for maximum effect from moment to moment. I don’t think it’s as good as Bloodlines, either in character development, or in the “fun” aspect of seeing characters get killed in horrible ways. But also, that’s an unfairly high bar. Primate delivers on exactly the kind of horror and suspense you’d expect from its premise, while being deft enough to keep it all from feeling too by-the-numbers.
2026-01-14 03:00:00
This morning I woke up thinking about One Battle After Another. This is at least partly due to the fact that Letterboxd has launched a barrage of year-end retrospectives and lists across multiple social media platforms. And the blitz is no doubt working as it’s intended, to put works back into the public consciousness around awards season.
After I watched the movie, I declared that I admired it more than I liked it. I debated whether it was sacrilege not to include it in my list of favorite movies of last year, but then I called that list “my favorites” for a reason. My biases are built in, not just in terms of genre and tone1, but in how I expect movies to work: the narrative communicates a basic idea, and everything else is in service of that.
My overall opinion of One Battle After Another hasn’t changed that much. It hasn’t expanded or transformed inside my memory. But it has undeniably stuck there.
In particular, all of these disparate images have stuck: they’re part of the overall narrative, but I remember them more as individual moments than as parts of a story. A woman being unceremoniously shot in the head right as she exits a convenience store. A teenager sitting in a shop scrolling through his phone while chaos is erupting around him.
And of course, the shots from a car on a surreally rolling highway, which seemed to go on and on, long after the narrative-processing part of my brain had said, “yes, we get it. We understand that this is a car chase. Let’s move on.”
Lately, I’ve become increasingly aware of my own imperfect memory, and how often the ideas or images that I’ve retained over the years have kept shifting and mutating the longer they’ve been removed from their original context.
To describe it in a less stuffy way: someone online was discussing the movie Mimic, which I don’t remember liking very much, except for one specific scene. I thought it was comically overwrought at the time, but over the years I’ve grown to appreciate it, now that my understanding of cinema has matured.
The scene has some kind of SWAT team rappelling down into a building through a skylight at night, and immediately getting wrecked by the monsters inside. At the end of the scene, a guy is desperately trying to pull his buddy back up to safety, but he’s too late. He pulls up half a body, and then the camera pulls back to show him looking up into the heavens in anguish, artfully cutting away right as he begins to scream.
People who appreciate fine cinema have probably already recognized the problem: that scene isn’t from Mimic. It is from The Relic, which is technically a different movie, even though they’re both dark blue/gray, CGI-heavy monster movies released in 1997. For what it’s worth, I remember Mimic being the superior movie, but The Relic being more fun, because it felt like it had no aspirations beyond being a cheesy monster movie, while Mimic was aiming for art.
That last idea is what made this interesting to me as more than my having yet another senior moment (and more than just a reminder that I need to watch Mimic again). Not only was I unfamiliar with the work of Guillermo del Toro back then, but I simply wouldn’t have gotten it. Mimic famously had so much interference from the Weinsteins that del Toro disowned it, but even taking that into account, I had such rigid definitions of good/bad/so-bad-it’s-good that I wouldn’t have been able to appreciate what del Toro was even trying to do.
It was only several years later that I could see how he, much like Peter Jackson and Sam Raimi, combines all of his disparate influences into a kind of melange that is almost completely unconcerned with value judgments, or “high art” vs “low art.” It’s not even as simple as, for instance, bringing aspects of B-movie horror into high fantasy adventure, like The Lord of the Rings trilogy. It’s more like taking every piece of art and entertainment you’ve experienced as a whole and synthesizing something new from that. Recognizing that we’re all walking around with this giant network of fragmented memories that are all constantly mixing with each other, regardless of their original context.
With that specific scene in The Relic, I can appreciate it now for being so impactful that I still remember it, decades later. Even though I can’t remember anything else about the movie, and even though I’d forgotten that it existed as a separate movie in the first place.
And my memory of why it’s impactful has changed over time, too. At the time, I thought it was “an unintentionally iconic scene in a bad movie that was trying to be a good movie.” Over the years, that turned into, “the iconic scene revealing the intention of filmmakers to do a contemporary take on B-movie horror.”
After rewatching that clip — in particular, the janky monster effects, and the manic, borderline-incomprehensible editing — I’m reluctant to release an official statement on intention, and declare exactly where this movie falls on the spectrum from “failed art” to “fun, self-aware schlock.”2 But the one thing I can say comfortably is that it is somewhere on that spectrum.
It’s a pastiche of a certain type of movie, whereas Mimic is more like the creation of someone who counts monster movies and B-movie horror among everything that’s influenced him. Del Toro wasn’t necessarily trying to iterate on those influences, but the influences can’t help but show up in everything he makes. Ironically, The Relic is more of an imitation than Mimic.
At least in my memory. Which is the main idea I’m trying to get at here. I almost always think of movies, TV, games, all of the media that I watch or play and then try to pick apart, as being fixed in time. It was made in this context, it communicates these ideas, and even if that changes over time, that’s due to my gradually getting a better understanding of what it was communicating in the first place. Either understanding what it was trying to say at the time, or understanding how far it deviated from its intention.
But that’s not how art works.
Both The Relic and Mimic are very much products of 1997 — the color palette, the reckless disregard for whether CG or practical effects look like they’re actually present in the scene, the late-90s Dante’s Peak/Volcano trend of releasing two movies that can be easily confused with each other and converge into one movie in your memory.
But the version of me that watched those movies is still stuck in 1997, too. That guy had no clue how to read anything that didn’t fit into a familiar slot. He was still holding onto years of school asking for a single correct interpretation of art, and years of critics and reviewers giving everything a grade to define exactly how bad or good it is. (I’m still not sure exactly how he was able to appreciate the genius of Big Trouble In Little China and Deep Rising; I’m just grateful that he was).
Which is all a reminder of just how changeable and ephemeral our engagement with art is. The art itself is permanent — for instance, I never would’ve expected The Relic or Mimic to become relevant again, 28 years later — but each experience of it is locked to a specific time and context. Meaning that we’ve all got this giant network of ideas and images, bits of dialogue, vivid emotional reactions, and memories swirling around in our heads, and each person’s is as unique as the person is.
For me, it’s a signal to be more mindful that art isn’t the one-on-one communication that I like to think it is. It’s more like throwing ideas and images out into the universe, hoping that the right people will be able to reassemble them into something meaningful to them. (For my fellow computer nerds: it’s less like TCP, and more like UDP).
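For my fellow computer nerds who want the analogy spelled out: here’s a minimal, purely illustrative Python sketch — nothing from any real project, and the localhost address and port are arbitrary placeholders. TCP insists on a connection with one specific receiver who acknowledges the conversation; UDP just flings datagrams into the void and hopes somebody happens to be listening.

import socket

# UDP: fire-and-forget. No connection, no guarantee anyone is listening,
# no guarantee the datagram ever arrives -- you just put it out there.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"an image, free of its original context", ("127.0.0.1", 9999))
udp.close()

# TCP: one-on-one. connect() refuses to proceed unless a specific receiver
# is actually there to acknowledge it.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    tcp.connect(("127.0.0.1", 9999))
    tcp.sendall(b"a carefully intended meaning")
except ConnectionRefusedError:
    print("nobody listening; the message never goes out at all")
finally:
    tcp.close()

(Run it with nothing listening on that port and the UDP send still “succeeds,” which is kind of the whole point of the analogy.)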
It also gives me even greater appreciation for artists who can tap into that. Especially David Lynch. I’ve been a fan since Blue Velvet3, but it’s always been with as much frustration as appreciation. I need it to make sense, even while I’m fully aware that it’s not supposed to.
Today I saw a clip from an interview with Amanda Seyfried, and despite all the things I’ve seen her in — things that made perfect sense, and which had meaningful ideas that I was perfectly able to interpret — the image that immediately came to mind was a scene in Twin Peaks: The Return, where her character is drugged up and blissed out, riding in a convertible, looking at the sky as if it’s all new and wonderful and everything is perfect, in yet another storyline that never seemed to resolve itself.
Is the idea that Laura Palmer’s experience is cyclical? That there will always be innocents who are corrupted, no matter how much people like Agent Cooper try to save them? Is it that there’s no such thing as an “innocent,” and the idea itself is infantilizing and patronizing? Is it that all our lives are a combination of bliss and tragedy, and the beautiful thing is our capacity for wonder?
I don’t know; it could be all of those things, none of those things, or nothing at all. But it’s one of at least a dozen images from that series that I will never forget, a perfect expression of a feeling that I’d never be able to articulate. And it’s floating around in my memory, along with the rest of The Entirety of Human Expression (Chuck’s Version), largely free of context, steadfastly refusing to be reduced down to a simple this means that interpretation.
And that ambiguity is probably the exact reason why it won’t ever be reduced or diminished. My memory of context is faulty and unreliable; I actually don’t remember any of the other scenes in that character’s storyline. But if I had been able to sum it all up, to get that closure of what it was intended to mean, it would likely have disappeared completely. In much the same way as the first two seasons of Twin Peaks seem to deflate if you think of them just as being about the mystery of who killed Laura Palmer.
The ambiguity isn’t necessary for making an image memorable, obviously. It’s just that Lynch had a unique gift for embracing everything that makes an image have the profound impact of a dream, and he was fearless and unconcerned with being misunderstood or misinterpreted.
Ultimately, I’ve still got my biases, and I’ll probably always prefer narrative-heavy movies and TV. But it’s useful to be mindful that that’s just one approach to making cinematic art, not the standard that everything in the medium should aspire to. In fact, it seems a little precious to obsess over every detail and every line of dialogue, making sure that everything flows perfectly and conveys exactly the intended meaning, if I’m just going to end up forgetting most of it anyway.
2026-01-14 02:00:00
My coworker at LucasArts, through no fault of his own, ended up having to share an office with me at the beginning of my obsession with Soul Coughing. That meant he had to listen to El Oso countless times, since using headphones at work hadn’t yet become common practice for whatever reason.
I got a kind of karmic payback, though, at my next job: my office-mate was really into Dummy by Portishead. I never outright disliked it, but I did keep waiting for it to win me over, even on familiarity alone, and it never quite did. It was more that hearing the opening of “Sour Times” got to be like hearing the noon siren over the loudspeakers in San Francisco. “Oh, it must be Tuesday again.”
Fast forward several years, and I’m listening to a cover of “Wandering Star” and wondering why on earth it sounds so familiar. So I listen to the entirety of Dummy again, and I realize that hey, this is a really good album. My coworker was right all along.
Radiohead is a tougher sell for me. I have really tried to get into them, I swear, but it just never sticks. Usually, when somebody tells me about a band they’re really into, I just nod and move on, but Radiohead is different. I really want to like them, because occasionally I’ll hear something that convinces me they were doing something genuinely special. Like the track “Fitta Happier” from Quakers, which samples the Pride of Arizona marching band doing a medley from Kid A and OK Computer.1
I was convinced that I would only like covers, and I’d never be able to get into the originals. But then I heard a bit of “Fake Plastic Trees” somewhere, and I thought, “Hey, I really like this song, I wonder who sings it?” So until I’m able to fully catch up to where everyone else was in the late 90s and early 2000s, I can at least say that yes, I do in fact like exactly one Radiohead song.
2026-01-07 02:00:00
If you were to see me in person, no doubt the first thing that would come to mind would be “now that guy is a dancer!”
Sadly, you’d be mistaken. But I sure do like looking at dancers, especially in movies. My intense love of movie musicals like The Band Wagon, Singin’ in the Rain, and An American In Paris was yet another one of those things that went blissfully unquestioned in all the years before I came out. But the big movie musicals have mostly gone out of favor, despite the efforts of Steven Spielberg, and when they do pop up, they’re usually emphasizing the animation or the CGI or Lin-Manuel Miranda’s wordplay, instead of live humans dancing.
So even though I’m not a huge fan of Jungle as musicians, I do appreciate when their videos pop up in my YouTube recommendations, because they’re all showcases for a recurring troupe of contemporary dancers. I think my favorite is “Keep Me Satisfied,” because the choreography feels to me like a combination of Michael Jackson’s and Gene Kelly’s. The guys at the beginning are doing poses right out of Kelly’s “gotta dance” ballet.
At least, that’s how I see it. I’m pretty much completely ignorant of contemporary dance. Which is probably a huge part of why I like watching it. Because it’s inherently expressive, and because I don’t even have the language to describe it, I can’t overthink it or over-analyze it. It either fails to connect with me at all, or it knocks me flat on my ass.
Which is precisely what happened with “It’s Not That Serious,” directed and choreographed by Ricky Ubeda and set to the song “Sympathy” by Vampire Weekend.
I saw a clip of it on Instagram, starting at the bit where the man and the woman are looking into each other’s eyes, and all of a sudden, a woman in a sweater vest and neckerchief pops up in the foreground and appears to be doing choreography from an entirely different type of video. Is this some kind of Tik Tok1 thing where people are dancing in front of music videos or something?
Anyway, I was immediately captivated by that short clip, and I had to find the rest of it. What is this for? Is it from a movie? Apparently not; as far as I can tell, it exists only to be awesome. At the time I’m writing this, I first saw it about 3 hours ago, and I’ve already watched it around 6,000 times. My husband — who had to suffer through my obsession with the Kenzo World video starring Margaret Qualley — came upstairs and asked, “uh oh, is there a new earworm?”
I don’t even pay much attention to clothes in videos, but I’m obsessed with what everybody’s wearing in this one, like they’re all from some international prep school for only the hottest children of diplomats. I love every single thing about it, is what I’m saying.
And from what little I know about dance, this video seems to have all of it. Modern dance to hip hop to boxing to Bollywood and probably a dozen styles I could never hope to recognize. The only frame of reference I have is the one time I saw Punchdrunk perform “The Drowned Man” in London, and while it was mostly lost on me as immersive theater, seeing the dancers right there in front of me performing with such precision and energy after 2+ hours took my breath away.
I’m glad I don’t know enough about it to understand it, because every once in a while I just get to see people pouring themselves into the creation of something like this, and it makes me think, “No, this world is a good place, actually.”
2026-01-05 10:59:01
Today on Spectre Collie: shameless objectification of professional actors!
Well, not really shameless; it is embarrassing to realize how much you’ve been programmed by the media, and how you’ve still got these hard-coded ideas of what’s supposed to constitute “hot,” even though they really don’t make any sense at all.
The thing that prompted this realization was watching Game Night. I mentioned that its self-aware asides felt like vestiges from the early 2000s, unnecessary in something that was already working extremely well as a contemporary screwball comedy. But I didn’t mention the part that felt even more dated, which was the sub-plot about the friend Ryan, played by Billy Magnussen.
His storyline is pretty charming. He brings a date to game night who’s supermodel hot but vapid, if not outright dim-witted. We quickly learn that none of his relationships last much longer than a week, and he’s always bringing a new woman to the party, and they’re all supermodel hot and vacuous and basically interchangeable. Eventually, he brings a coworker played by Sharon Horgan as a platonic date, to be his ringer so he can actually win a game for once. Over the course of the night, he learns that the problem with his relationships has always been that he’s trying to be the smart one, when he’s actually the pretty but dim-witted himbo.
The thing that immediately stood out to me was how old-fashioned the stereotype of “hot, dumb girlfriend” was. Superman did it with its version of Eve Teschmacher, but most of its characters deliberately have the corniness turned up to maximum, to capture the Silver Age comic book feel. And Game Night calls back to old-fashioned screwball comedies, so it feels similar there. “This is a deliberate throwback, instead of anything trying to feel contemporary.”
It felt a little jarring in Game Night, though, since it was throwing the image of “this is the kind of super hot woman a shallow guy would be attracted to” into a movie where Rachel McAdams, Kylie Bunbury, and Sharon Horgan are presented as “baseline attractive.” Not “plain,” but more that they’re so normal that it’s not worth commenting on.1
Where Game Night does comment on it is with the men. In addition to Magnussen being the “dumb, hot one,” Jason Bateman’s character is repeatedly contrasted against his brother played by Kyle Chandler, who is cooler, wealthier, and much, much hotter.2 Meanwhile, the whole storyline with Lamorne Morris’s character is that he feels insecure when he finds out his wife (Bunbury) slept with a high-profile celebrity while they were on a break.
Even though the movie trots out stereotypically hot women for effect, I didn’t pick up any sense that it was saying, “these women are so hot that you should feel bad about yourself.” Even Horgan’s character, who’s set up the most to be insulted by Billy’s not being attracted to her, doesn’t seem to care in the slightest.
My gut reaction when seeing Billy’s girlfriends was that it was odd for the movie to present these women as unusually hot when Rachel McAdams was standing right there. But thinking back on the roles McAdams has played (that I’m familiar with), she’s gone from glammed way up to way down all throughout her career. Early on, she was the stereotypical, impossibly hot blonde; she was the head of the Mean Girls. In the upcoming Send Help, she’s the frumpy office worker who gets ignored or dumped on by everyone including her terrible boss. In between, she’s been at just about every level of glamour from unattainable ideal girlfriend to relatable ideal girlfriend to femme fatale to relatable housewife and mom.
McAdams seems to treat it all as drag, which is something I think Scarlett Johansson does really well, too. Being a super-glamorous movie star (like in Asteroid City) is something she can do, but she never seems all that interested in making that her whole thing.
It’s funny to me that so much attention was paid to how the Avengers marketing concentrated on showing Black Widow as a sex object — and for good reason, because they totally did — but in my mind, the MCU has gone even harder in objectifying its male characters. I’m sure I’m not the only person who saw the teasers for the Thor movies, highlighting every single curve of Chris Hemsworth’s physique, and thought, “is it okay for me to be looking at this in public?!” And the actual movies famously linger on shots of Chris Evans, coming out of the Super Serum chamber or grabbing onto a helicopter, with only slightly more subtle horniness than all the shots of Grace Kelly in Rear Window.
It seems increasingly rare overall for movies to call out how hot a character is, at least compared to how it was in my formative movie-going years. And interestingly, the examples I can think of are evenly distributed between men and women. Outside of the MCU: Emma Stone gawking over a shirtless Ryan Gosling in Crazy, Stupid, Love. Jon Hamm having an entire storyline in 30 Rock about how he’s so hot, he doesn’t have to be smart or actually good at anything. The recent movie Eternity, which had just about every single character saying repeatedly and at length how impossibly good-looking Callum Turner is.
For the ladies, I can only think of Barbie, where everyone is unusually beautiful, and still Greta Gerwig, via Helen Mirren’s narration, comments on how incongruous it is to see Margot Robbie saying that she doesn’t feel pretty. And the Jumanji movies, where Karen Gillan is presented as a hyper-idealized, sexy, bad-ass video game character. Both of them feel like explicit or implicit commentary about the objectification of women, and both of them also have male characters who are even more explicitly objectified. Look at how hot and ripped The Rock, Ryan Gosling, and Simu Liu are. Just look at ’em!
The one that stood out the most to me was Emily Blunt in Edge of Tomorrow. The movie introduces her character — over and over and over again, since it’s about time loops — in mid-workout. It’s so over the top that I wouldn’t have been surprised to hear old-timey striptease music and a crowd of men cheering and hooting. And it was difficult for me to process, after decades of movies and TV explicitly presenting actors that might as well have arrows pointing at them with signs that read THIS IS HOT, to see a Real Actress being shown in much the same way as the Barb Wire poster. My lizard brain was saying “you’re showing me a 9 and telling me I’m looking at a 15.”3
While I’m in the middle of a blog post about objectifying actors, I do want to stress that I’m talking about how Hollywood presents people, not offering my actual opinions or suggesting that there is any objective truth in hotness ratings. Ultimately, my main point is that it’s almost entirely bullshit. And only “almost entirely,” since a big part of the point of those scenes was to show how much body-building prep work Blunt had done for the part.
But the part that makes Edge of Tomorrow interesting to me is that while the scenes are over-the-top, they’re not gratuitous. You learn that Blunt’s character was previously caught in the same type of time loop as our protagonist, and the experience turned her from someone who was presumably a rank-and-file soldier, into a total bad-ass super-hero. I suspect that casting Blunt against type was the whole point.
And I suspect that I’ve spent so long being trained how to “read” relative attractiveness in movies and television that I got hung up on it and missed the point entirely.
The overall impression I get is that Hollywood has very slowly, almost imperceptibly changed since the days of Weird Science. I’m purposefully ignoring a ton of trash, and pretty much every single thing on reality TV, but among the stuff that I think still has relevance to art and popular culture:
Which seems like a kind of progress, I guess? I hope it’s obvious that I’m in no way claiming that Hollywood has Fixed All The Problems, or that it isn’t every bit as crass, sleazy, misogynistic, and manipulative as it’s always been. But it does feel like the bar has been slightly nudged upwards; it’s harder to get away with the same kind of objectification that was prevalent in popular movies during my teens and twenties.
And it seems rarer to have big stars who can coast on their looks. Glamor feels more like something that actors can put on or take off as they like.
As evidence for that bold claim: Ana de Armas, Florence Pugh, and Sydney Sweeney are three of the most objectively, flawlessly beautiful women I’ve ever seen. And they all have managed — by being extraordinarily talented — to take a variety of roles that depend on more than their being in full-glam mode. Contrast that with actors like Charlize Theron and Nicole Kidman, who had to take parts that deliberately obscured or downplayed their appearance before people seemed to take them seriously as actors, instead of just as beautiful people.
If my theory is true, and Hollywood is gradually getting better at not pointing at actors and screaming at the audience, this person is a 10, there’s a sinister side effect. That baseline level of “Hollywood ugly” is still active and likely will continue to be active forever, so we’re left with an unrealistic idea of what constitutes a 7. If you’re casting Rachel McAdams and Jason Bateman as average enough to be unremarkable, you’re perpetuating the idea that those of us who always considered ourselves average have actually been at best a 3.
One of the most commonly seen and savagely mocked cliches from the movies of my formative years: the mousy, bookish woman who takes off her glasses and lets down her hair and… gasp!… she looks like a movie star!
It’s mocked for good reason, because it’s almost entirely the product of male filmmakers reducing women to nothing more than what men find attractive. The gross “That’s What Makes You Beautiful” syndrome. I still remember catching a scene from some movie that tried to pass off Ginnifer Goodwin as “the plain, insecure one,” and I yelled at the television, “you go straight to hell” and angrily switched it off.
But I do have to wonder how much, if any, of that cliche could get “taken back” and turned into something positive. Because if you can somehow remove the sleaze from it, and the core idea that appearance is the last, crucial, step towards actual self-worth, you’re left with an idea that we don’t see enough of.
That it is all surface bullshit, and some of the most beautiful people in media look like that only because a ton of people work hard to make them look like that, and it’s something they can put on or take off at will. Reject the bit where the male lead sees the woman take her glasses off and realizes she’s perfect for him, and just think about the bit where the glamorous movie star is only a pair of glasses away from looking like what passes for average in Hollywood.
I’m never going to look like the Kens in Barbie, but then I really enjoy never spending any time in a gym, so it feels like it comes out even. And any time I remember I’m about the same age as Jon Hamm, it’s slightly reassuring to remember that even he doesn’t look like Jon Hamm all the time. Which is less healthy than appreciating that none of it matters in the slightest if you’re not making a living based on your appearance, but I’m a fundamentally insecure person who occasionally needs a little bit of shallow pettiness to get by.
But in terms of thinking about how media works, it’s interesting to see an industry largely built on glamour having to reassess and redefine how important glamour is.
2026-01-02 10:04:02
We spent the week of Christmas at a hotel in midtown Atlanta, situated right between the house where Margaret Mitchell wrote Gone With the Wind, and an intersection with pride flags painted in the crosswalks.
I knew that Atlanta has a large population of LGBT people, and it has for decades, but it was still jarring to see such an open, even mundane, display of pride every morning, in the middle of a place that I’ve always associated with repression. Shouldn’t that kind of thing remain safely contained within Little Five Points, at least?
Inside the hotel itself, there was little indication of place; it all looked like you could be at any mid-to-high-end hotel in any American city at Christmas time. Except for inside the bathroom, where there were two pieces of digital collage artwork (which I assume were in every room), combining photos of things that uniquely signify Atlanta: the Fox theater, Atlanta Falcons tickets, a pop top from a bottle of Coke, etc. One of them had the images on top of repeated text reading “Georgia on my mind,” which is fine, and “The city in the woods,” which is bizarre. I spent the first 25 years of my life living within an hour of Atlanta, and I never, ever heard it called that by anyone, ever.
But it was still familiar, in a way, since for as long as I’ve been alive, Georgia has been weirdly over-eager to make up stuff to be proud of. Why not pick “the city has an awful lot of trees?” Really it’s not that much weirder than building your cultural identity around soda, or pretending that people in the southeast are invested in professional sports anywhere near the level they’re invested in college football. And I mean, there’s not a lot to latch onto that isn’t problematic.
Especially for those of us who grew up in the 1970s and 1980s, it often felt like we’d been handed a heritage along with a shrug and a billion asterisks attached to it. You basically had three options:
It’s always seemed like a lot of work, trying to maintain some small level of performative Georgianess, just to have the feeling of being from somewhere. The alternative is being unmoored, a kind of blandly generic American. Is that what people from Delaware feel like all the time?
Even though I did live within an hour of Atlanta for the first 25 years of my life, we rarely went inside the city limits. We always lived in the suburbs, as the circles of white flight radiated outwards, and going into the city was either dangerous, or a nightmare to drive. The former was hugely overblown, even in the 70s and 80s, the product of white suburbanites being generally afraid of cities. The latter has always been the case, and the Atlanta area is still a nightmare to drive, even worse than anywhere I’ve been in LA.1
I’ve seen a lot more of the city as a visitor, and the overwhelming impression I keep getting every time we go back is “inauthentic.” Likely at least partly due to the fact that we tend to stick to touristy areas, but also inescapable. Playing to some made-up idea of what The South is supposed to be like. Like a hack stand-up comedian doing a “white people are like this, but black people are like this” bit, where the stereotypes are either impossibly dated, or just bizarre and completely unrelatable.2
But over this last trip, I realized that everything that I tend to think of as “inherently Georgian” is made-up bullshit, too. Not just in the sense that all culture is made up, but in the sense that my home state has a very long history of clinging to stuff that’s just fake. All of the “Lost Cause” bullshit from the Daughters of the Confederacy. The controversy over removing the Confederate flag from the Georgia state flag, even though it had only been added in the first place as a protest against the Civil Rights movement.
There’s long been a cynical inauthenticity to all of it, like people desperate to preserve a heritage built entirely on revisionist history. Even at its most charitable, it’s clinging to an overly romanticized version of a past that most of us don’t actually even remember.
After all, if cultural identity were actually based on cultural impact, then our hotel should’ve been filled with pictures of Outkast and the movie industry. Both hip hop and film production have been Georgia’s greatest cultural contributions this century, but the city and state both seem to be stuck selling themselves with peaches and sweet tea and biscuits and remnants of a 160-year-old war that our side lost.
It was odd coming from Los Angeles to Atlanta, knowing that the city where I live is constantly giving me reminders of the glitz and glamour of the movies, while the city where I’m from is where the movies are actually being made. But it’s also not surprising, since the reason Georgia has been attracting so much production3 comes down to tax breaks and lack of identity, two things that don’t often come up in tourist brochures.
More and more often, I’ll see a movie and only realize after the fact that it was filmed in Georgia, since it’s been Vancouverized. It’s been used for decades to represent “Any City, USA,” or at least any city in the eastern half of the US, and the productions almost never take advantage of any of the things that make Georgia towns and cities unique. My own home town has a “historic” old town area, which was used for filming a TV series… but only after they re-did it to look like New Orleans.
Which is all a long, meandering train of thought leading me back to pride flag crosswalks in midtown Atlanta. Sure, they’re every bit as performative as painting “y’all” in huge cheugy letters on the wall of your Southern Fusion Cuisine restaurant, or for that matter, painting the stars and bars on your truck. The difference, of course, is that the performance actually means something, and even more importantly, that it’s not aspirational. It represents the way things really are, not the way we want them to be.
Most obviously, it was a reminder to reject the perception that it’s unsafe to be gay in Atlanta, even though I spent most of my life convinced that it was. And more significantly, to reject the idea that this is some modern invention, the result of a ton of LGBT people moving into the city. They’ve always been there; the only recent development is that they’re more free to acknowledge it. The pride flag inherently symbolizes rejecting assumptions about the way things are, and accepting that you’ve got nothing to be ashamed of.
And then more significantly than that, seeing the crosswalks kept making me think “oh, this hotel is in the gay neighborhood.” Which is itself an outdated notion. I was reminded of the Castro in San Francisco, which feels like a gay theme park, full of all the flags and signifiers of “where all the gays are,” but too expensive for most of the city’s gay people to actually live there.
The boundary is mostly imaginary. Useful as a sign of visibility, an acknowledgement that “you’re not alone,” but not a real border of any significance. I remembered how, years ago, while we were in Piedmont Park in Atlanta, my then-fiancé asked a passing family to take a photo of us kissing under a mistletoe photo op, and I was mortified, but of course the family did it with no hesitation, and asked us to go back and kiss again so they could get a better shot. Meanwhile, the only place I’ve ever had strangers yell the f-slur at me was in the gay haven of San Francisco, California.
And as long as we’re breaking down fake, outdated notions, why not dispense with all of them? A lot of the things I know to be true about Georgia, and Atlanta specifically, are based entirely on things I heard from my parents, who lived and/or worked in the city for decades before I was born, and from other white suburbanites through the 1970s, 80s, and 90s. Even if my assumptions were true at one point — and that’s a pretty big if — they’re almost certainly out of date and irrelevant now.
Over the past few years, such a big deal has been made over “turning Georgia purple.” I’ve always had an image of the state as being islands of progressive Democrats in Atlanta and a little bit in Athens, surrounded by a sea of repressive Republicans. But as with every other place in the United States, there’s a spectrum of opinions and viewpoints everywhere.
I never in a million years would’ve expected my actual hometown, outside of Atlanta, to be predominantly Democratic, and yet that’s exactly how they voted in the previous election. Realistically, that’s because the demographics have shifted, and more non-whites have settled there; it’s not as if a bunch of conservative, small-town white people were suddenly convinced to change their minds. But even that is still clinging to the notion of borders and demographics and voting blocs, instead of people.
It all made me realize just how much my thinking has been poisoned by politics over the past couple of decades. The media dividing everyone up into districts and states and even wide geographical areas that we can all understand: this is what southerners are like, this is what The Heartland believes, these are the types of people who live in urban areas.4 Even in the brief periods when we get a respite between campaigns and fund-raising requests, it’s so thoroughly infected the ways we think about each other that we assume it extends to culture and everyday life as well.
Over the past ten years — okay, 20 — okay, maybe more like 55 — it’s felt harder and harder to envision a way forward. It’s easy to have high-minded, vague ideas about unity and progress, but the specifics are where I get hung up; how do you show grace to the people making inexcusable decisions, enabling the people doing irreparable harm?
Remembering that “we’re all just people,” and “there are more good guys than bad guys” and the like are fine as mantras to get you through the day, but how do you act on any of it? How do you push for unity without regressing to a place where you’re just forgiving, excusing, or even enabling the perpetuation of evil?
I don’t claim to have all the answers, but one thing I’ve realized is that my respect for politicians and politics in general has been chipped away to the point of non-existence5, but I’ve just shrugged and left it at that. Maybe a good first step is to ask myself: if I hate the cynicism, insincerity, and selfishness of politics so much, why do I still think of things in those terms? Why am I still clinging to ideas about what people and places are like, which were developed decades ago specifically to win an election, based on facts that are no longer relevant, assuming they were ever true in the first place? How about we all stop thinking like politicians and go back to thinking and behaving like human beings?