The Art of Putting On Airs

The paintings in Dickie Greenleaf’s studio are bad. Hilariously bad, so much so that the set dressers on Ripley, the Netflix adaptation of Patricia Highsmith’s The Talented Mr. Ripley, must have let out a real cackle when they were commissioned. Dickie, the Ivy-educated dilettante son of a New York ship-building titan who has moved to seaside Italy to fritter away his inheritance, claims, about his work, “I happen to be pretty good at it.” But he can’t see what the audience can: the droopy Modigliani knockoffs and derivative Cubist faces that would make Picasso wince. When Tom Ripley, who will soon focus his affection and envy and rage at Dickie, first looks at the paintings, he practically laughs into his sleeve. As he quickly comes to realize, it doesn’t matter that the paintings are tripe—Dickie himself is a piece of art worth copying.

The Ripley canon, which includes Highsmith’s quick-twisting 1955 novel, the sun-drenched 1999 Anthony Minghella film, and the new television series from Steven Zaillian, has always been about American reinvention. Not only does the peon and con man Tom end up refashioning himself as the rich and carefree Dickie, but Highsmith’s novel itself was a retelling of Henry James’s The Ambassadors. Each subsequent version has added another layer, transparent enough to still show the original but sturdy enough to stand on its own. And one facet of the story stands out more clearly than ever: This latest revival of Tom Ripley and his elaborate charade is of a piece with the 21st-century obsession with optimizing one’s personality into its most ideal and palatable version.

Tom starts out as a nobody, or at least as the kind of person whom Dickie would think of as a nobody. In the novel, Tom is so vaguely drawn that we don’t know precisely what he looks like or where he comes from, just that he’s been running a small-time con in Manhattan. Dickie, on the other hand, is a golden god, with blue eyes and crinkly blond hair and a decided ease about him. He’s always mixing drinks and stretching out his legs, buoyantly chatting with Italian townspeople and taking up new friends like Tom, whom he invites to move into his home almost immediately after they meet. Dickie has it—charisma, insouciance, a devil-may-care freedom, all of which Tom wants for himself.  

[Read: I gave myself three months to change my personality]

Eventually, Tom’s talent for ingratiating himself wears thin, and when he realizes he’s about to lose access to Dickie and his vibes, he bashes Dickie’s head in while they boat around San Remo, and formally takes over his identity. Tom spends the rest of the story signing into various Italian hotels as Signor Greenleaf, and writing letters and cashing checks with forged signatures to keep up the ruse that Dickie is alive but off the grid. And yet, his aim isn’t just to avoid the police or spend Dickie’s inheritance. If Tom merely wanted the Greenleaf money, he could have scammed Dickie out of it; if he’d wanted Dickie’s life, he’d have found a way to stay among his friends, and further stitch himself into their community of expat layabouts.

What he wants, however, is Dickie’s sense of self: his nonchalance, his elitist disdain for the second-rate, his innate effortlessness. And so that is what he takes, even though becoming Dickie exposes Tom to far more risk. He can’t exactly have Dickie’s face, and, depending on the version, it’s either more or less believable that he might freely jaunt around Italy using Dickie’s passport. But in every Ripley story, we see Tom practicing playing Dickie—his loucheness, his forceful voice, his sure and irrefutable stare—before fully embodying the role. Tom may be a murderer, but it’s tempting to admire the rigor and completeness of his transformation.


In many ways, Tom was ahead of his time; identity is more performance-based than ever. Public life, of which social media is one component, wants and needs fine-tuned, particularized selves to populate it. Those identities require two things: a face and a personality. Faces function as business cards, attached to us in the same way as our names. They’re everywhere, replicated ad infinitum across screens; as a result, perfect strangers appear as familiar as family members. The face is the canvas of the 21st century, a token that is both recognizable and subject to change.

Personalities—our likes, dislikes, tendencies, faults, strengths—are just as fervently distilled into consumable little bites online. The demands of the internet push users toward presenting themselves as a type, a category that can be marketed to and sent just the right kinds of ads.  The proliferation of online profiles, self-summaries where all the haze and contradictions of a person are left out, hastened this process; hashtags turned personal branding into a sport, and the algorithms took things from there.

Everyone knows just enough pop psychology to classify friends and strangers: People are toxic or maybe narcissists, simple baskets in which to dump entire ranges of human behavior. The absurdly quick emergence of new micro-trends means that users become, en masse, “bedrotters” or “coastal grandmothers” or “rat girls.” For a brief period of time this winter, a mass of young women on social media proclaimed that their entire personality now revolved around bows. And if you struggle to lock down the exact parameters of who you are, don’t worry: Numerology, astrology, or enneagram tests might help sort you into some just-vague-enough group and explain away any less-than-admirable traits.

[Read: Prestige TV’s new wave of difficult men]

Buying into any of these typecasts entails some performance, yes, but they’re also meant to be interpreted as an indicator of precisely who one wants to be. The quick-change artist is someone who can’t be left behind or out of date, someone who’s always freshly appealing to their audience without losing their essence. Culture has learned a lesson from literature’s most prominent con man: A personality morphs and stabilizes depending on the circumstances, and the most common form of art being performed right now is the continual reinvention of the self.

Those performances can also take place behind closed doors. In every version, Tom stays in character as Dickie even when he’s alone, evidence of his commitment to his new identity. Tom seems to evaluate the quality of his performance not on whether it might get him caught but on whether his artistry can convince even himself. In Highsmith’s novel, he thinks of himself as a tabula rasa, and considers his entire pre-Dickie life a different performance. When he occasionally has to “play” Tom again, he hams it up: “He could stoop a little more, he could be shyer than ever, he could even wear horn-rimmed glasses and hold his mouth in an even sadder, droopier manner,” Highsmith writes. The new Ripley deadens Tom into a blank-eyed cipher who comes alive only when he’s in his new life, fingering antiques and glorying in the fine furnishings and little trimmings that someone like Dickie can afford and that someone like Tom can only appreciate.

To press the point that Tom’s new personality is worthy of appraisal as a showpiece, Ripley is buffered with abundant long, steady, black-and-white shots of Renaissance paintings and chiseled architectural embellishments. This Italy isn’t just a playground for American expats, as it is in the Minghella film—it’s also the premier site for viewing the work of the masters. When Tom first arrives, Dickie insists that they go on a pilgrimage to the closest Caravaggio, but after they arrive at the church, Dickie barely glances at it. Art is a signifier for him, and nothing more. But when Tom finds the time and funds and social capital to be a man of leisure, he really takes it all in, and tours the country’s greatest artistic sites. In particular, he dives into the work of Caravaggio—the 17th-century painter who also notoriously murdered a rival—to see what it might tell him about how to proceed when your art and your life are in conflict.

Ripley’s focus is on how we decide what is worthy of being called art. Dickie’s ridiculous, egregious paintings are a demonstration that money, time, and desire still can’t manufacture talent. His girlfriend, Marge, is writing a book that we hear about in scraps, a facile “American in Europe” memoir written in prose so mealy-mouthed, it makes Tom cringe. But when Tom continues Dickie’s painting, he improves it, and the letters he writes are better than anything Marge turns out. For hours and hours, we watch as Tom picks up Dickie’s body language, hotel preferences, and sartorial leanings, mimicking them with perfect accuracy—an art form of a different order.

Like Dickie’s paintings, which riff on the ideas of the Modernists, Tom’s performance isn’t especially original. But like Highsmith before him, he’s taken something sublime and transformed it for his own purposes, with a result that is, arguably, better than the original. In Ripley, just after Tom moves into Dickie’s house, he’s left alone for an evening. He sorts through Dickie’s clothes—a fine suit, Brooks Brothers shirts, even his underwear—and puts it all on. Gazing at himself in the mirror, Tom embodies Dickie for the first time, and asks: “You like art, Tom? Well, you’re in the right place.” As he performs, Picasso’s The Guitar Player—the actual painting, which Dickie owns—hangs just beside him in the frame, standing parallel to Tom’s own art. Both are classics.




The Growing Incentive to Go Nuclear

From now on, any state with genuine fears for its own security is bound to consider building nuclear weapons. In the nearly eight decades since the bombings of Hiroshima and Nagasaki, careful diplomacy and multinational collaboration have limited the number of nuclear-armed countries to nine. But that count is likely to rise—ironically because of American policies designed to prevent nuclear escalation with Russia. Recent events have shown how much deference even superpowers give to countries with nuclear weapons, and how grievously Ukraine has suffered for lacking them.

Last Saturday, Iranian forces launched a large air assault on Israel. They used a range of systems, including relatively simple drones as well as cruise missiles and ballistic missiles. The apparent goal was to overwhelm Israeli air defenses so that at least some of the missiles and drones could get through and hit their target. Iran’s move seems to have been inspired by devastating Russian aerial attacks on Ukrainian infrastructure in recent months—and indeed was larger than any that Russia had launched on a single night.

Ultimately, though, the operation was a bust. Israeli officials estimated that 99 percent of the Iranian attack force was intercepted; U.S. officials told The Wall Street Journal that half of Iran’s ballistic missiles crashed or failed to launch. A 7-year-old girl was critically injured, but there were no other casualties. Few if any targets of military value were hit. (Last night, Israel retaliated accordingly, with a limited strike on an Iranian military base. Both sides seemed to play down the significance of the move.)

Excellent Israeli air defense and faulty Iranian equipment aren’t the only reasons the Iranian attack failed. An extraordinary coalition of other states—the United Kingdom, France, Jordan, and, most important, the United States—put their own aircraft into action and destroyed many, probably most, of the incoming drones and missiles.

[Anne Applebaum: Why did the U.S. defend Israel but not Ukraine?]

The United States has made a number of strategic miscalculations since Russia invaded Ukraine in 2022, but the single greatest may be the message that the Biden administration just sent about nuclear weapons. The U.S. showed that it would protect a nuclear-armed friend, Israel, from an as-yet-nonnuclear enemy (Iran); at the same time, Washington has refused to consider using its forces to defend a nonnuclear friend (Ukraine) against a nuclear-armed Russia.

Other governments will deduce that states with nuclear weapons can barbarically attack America’s friends and bully U.S. leaders into abandoning them. The British government has underscored that sentiment by basically admitting that, precisely because of fears of escalation with Russia, Ukraine won’t get the same help that Israel did. Even if the U.S. and its allies were more coy about their calculations, their conduct will encourage a wave of nuclear proliferation in the coming years.

Indeed, escalation worries have made the U.S. timid about helping Ukraine in ways far short of the direct defense it provided Israel. Even when grudgingly going along with Kyiv’s requests for advanced weaponry, the Biden administration has imposed limits on how and where that equipment can be used.

The immediate aid that the U.S. and its allies provided to Israel hints at how much more of a difference they could make in the Ukrainian war effort. They could, for instance, institute a no-fly zone over the western part of Ukraine. This would protect vital infrastructure in half of Ukraine from missiles and drones—while also using the West’s own power of deterrence to keep out manned Russian aircraft. Furthermore, by protecting the western half of Ukraine, the U.S. would allow the Ukrainian military to concentrate its air-defense efforts in the east. The beleaguered Ukrainians would have fewer variables to worry about and could use their precious stocks of anti-air ammunition more efficiently.

Instead, the Biden administration is allowing Russia to use the threat of nuclear weapons as cover for its effort to conquer a sovereign neighbor by force. Ukraine is not just any nonnuclear state; it is a state that gave up its nuclear weapons because the U.S. and Russia firmly promised in 1994 to respect its territorial integrity.

In their passivity, the U.S. and its allies are acquiescing in the destruction of the post–World War II nuclear order—which in many ways was a great success. Since the Second World War, the two major nuclear powers have never used their nuclear weapons to win wars—even when, as with the U.S. in Vietnam or the Soviet Union in Afghanistan, they were losing in conventional warfare. And although a small number of other states, including China, India, Pakistan, Israel, and North Korea, have built nuclear arsenals, many more governments with the capacity to develop nuclear weapons have so far declined to do so.

[From the July/August 2022 issue: We have no nuclear strategy]

The global order is becoming less stable in other ways. The Biden administration’s weak response to Russia is bad enough; a second Trump administration could follow a still more destructive policy of telling even close, longtime allies that they can’t count on American support. When Donald Trump said publicly earlier this year that he would encourage Russians to do “whatever the hell they want” with European NATO member states that don’t spend enough on defense, he was signaling to leaders in Europe and around the world that the North Atlantic Alliance is in jeopardy.

Other countries will take note—and begin to arm themselves for a more dangerous world. South Korea, for one, is quietly discussing the prospect of developing nuclear weapons. It’s also talking about constructing a new generation of nuclear-powered submarines, even though it has an agreement with the U.S. not to do so. Many governments will make similar calculations.

We have reached a dangerous moment. In its desperate attempts to de-escalate tensions with Russia, the Biden administration is reinforcing the message around the world that nuclear weapons provide security and freedom of action. When countries are presented with a clear choice between being shielded from attack and being left to their fate, no one should be surprised at which option they’ll take.


The New Rules of Political Journalism

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

In our digitally chaotic world, relying on the election-reporting strategies of the past is like bringing the rules of chess to the Thunderdome.


New Rules

This past weekend, I was on a panel at the annual conference of the International Symposium on Online Journalism, in beautiful downtown Austin. Several journalists discussed the question: Are we going to get it right this time? Have the media learned their lessons, and are journalists ready for the vertiginous slog of the 2024 campaign?

My answer: only if we realize how profoundly the rules of the game have changed.

Lest we need reminding, this year’s election features a candidate who incited an insurrection, called for terminating sections of the Constitution, was found liable for what a federal judge says was “rape” as it is commonly understood, faces 88 felony charges, and—I’m tempted to add “etcetera” here, but that’s the problem, isn’t it? The volume and enormity of it all are impossible to take in.

The man is neither a riddle nor an enigma. He lays it all out there: his fawning over the world’s authoritarians, his threats to abandon our allies, his contempt for the rule of law, his intention to use the federal government as an instrument of retribution. Journalists must be careful not to give in to what Brian Klaas has called the “Banality of Crazy.” As I’ve written in the past, there have been so many outrages and so many assaults on decency that it’s easy to become numbed by the cascade of awfulness.

The former White House communications director Dan Pfeiffer points out a recent example in his newsletter: On a radio show earlier this month, Donald Trump bizarrely suggested that Joe Biden was high on cocaine when he delivered his energetic State of the Union address. It was a startling moment, yet several major national media outlets did not cover the story.

And when Trump called for the execution of General Mark Milley, it didn’t have nearly the explosive effect it should have. “I had expected every website and all the cable news shows to lead with a story about Trump demanding the execution of the highest military officer in the country,” this magazine’s editor in chief, Jeffrey Goldberg, told The Washington Post. “If Barack Obama or George W. Bush had done so, I’m sure [the news media] would have been all over it.” (Trump’s threats against Milley came after The Atlantic published a profile of Milley by Goldberg.)

In our digitally chaotic world, relying on the reporting strategies of the past is like bringing the rules of chess to the Thunderdome. There has, of course, been some progress. The major cable networks no longer carry Trump’s rallies live without context, but they still broadcast town-hall meetings and interviews with the former president, which boost ratings. NBC’s abortive decision to hire Ronna McDaniel, a former chair of the Republican National Committee, as a contributor, despite her role in spreading lies about the 2020 election, highlighted the disconnect between this moment and much of the national media.

And then there is the internet. It is certainly possible that richer, more insightful media will emerge from the digital revolution, but we’re obviously not there now. Back in 2016, we worried that social media had become a vector for disinformation and bigotry, but since then, we’ve seen Elon Musk’s extraordinary enshittification of X. In 2016, we worried (too late) about foreign interference and bots. In 2024, we are going to have to contend with deepfakes created by AI.

This year will see some of the best journalism of our lifetime. (You’ll find much of it here in The Atlantic.) But because both the media and their audiences are badly fractured, much of that reporting is siloed off from the voters who need it most. Because millions of Americans are locked in information bubbles, half of the country either won’t see important journalism about the dangers of a second Trump term or won’t believe it.

As Paul Farhi notes in The Atlantic, MAGA-friendly websites have experienced massive drops in traffic, but social media continues to thrive on negativity and providing dopamine hits of anger and fear. And of distraction—last week, the most-liked videos on TikTok about the presidential race included a video of a man singing to Biden and Trump’s visit to a Chick-fil-A.

To put it mildly, the arc of social media does not bend toward Edward R. Murrow–style journalism.

So what’s to be done? I don’t have any easy answers, because I don’t think they exist. Getting it right this time does not mean that journalists need to pull their punches in covering Biden or become slavish defenders of his administration’s policies. In fact, that would only make matters worse. But perhaps we could start with some modest proposals.

First, we should redefine newsworthy. Klaas argues that journalists need to emphasize the magnitude rather than simply the novelty of political events. Trump’s ongoing attacks on democracy may not be new, but they define the stakes of 2024. So although live coverage of Trump rallies without any accompanying analysis remains a spectacularly bad idea, it’s important to neither ignore nor mute the dark message that Trump delivers at every event. As a recent headline in The Guardian put it, “Trump’s Bizarre, Vindictive Incoherence Has to Be Heard in Full to Be Believed.”

Why not relentlessly emphasize the truth, and publish more fact-checked transcripts that highlight his wilder and more unhinged rants? (Emphasizing magnitude is, of course, a tremendous challenge for journalists when the amplification mechanisms of the modern web—that is, social-media algorithms—are set by companies that have proved to be hostile to the distribution of information from reputable news outlets.)

The media challenge will be to emphasize the abnormality of Donald Trump without succumbing to a reactionary ideological tribalism, which would simply drive audiences further into their silos. Put another way: Media outlets will need all the credibility they can muster when they try to sound the alarm that none of this is normal. And it is far more important to get it right than to get it fast, because every lapse will be weaponized.

The commitment to “fairness” should not, however, mean creating false equivalencies or fake balance. (An exaggerated report about Biden’s memory lapses, for example, should not be a bigger story than Trump’s invitation to Vladimir Putin to invade European countries.)

In the age of Trump, it is also important that members of the media not be distracted by theatrics generally. (This includes Trump’s trial drama, the party conventions, and even—as David Frum points out in The Atlantic—the debates.) Relatedly, the stakes are simply too high to wallow in vibes, memes, or an obsessive focus on within-the-margin-of-error polls. Democracy can indeed be crushed by authoritarianism. But it can also be suffocated by the sort of trivia that often dominates social media.

And, finally, the Prime Directive of 2024: Never, ever become numbed by the endless drumbeat of outrages.


Today’s News

  1. The Senate dismissed the articles of impeachment against Homeland Security Secretary Alejandro Mayorkas and ruled that they were unconstitutional, ending his trial before it got under way.
  2. House Speaker Mike Johnson will proceed with a plan, backed by President Joe Biden, to vote on separate bills to provide aid to Ukraine, Israel, and U.S. allies in the Indo-Pacific. The proposed move has drawn criticism from some conservative representatives.
  3. Four Columbia University officials, including the president, Nemat Shafik, testified in a congressional committee hearing about student safety, free speech, and anti-Semitism on campus.



Evening Read


Something Weird Is Happening With Caesar Salads

By Ellen Cushing

On a November evening in Brooklyn, in 2023, I was in trouble (hungry). I ordered a kale Caesar at a place I like. Instead, I got: a tangle of kale, pickled red onion, and “sweet and spicy almonds,” dressed in a thinnish, vaguely savory liquid and topped with a glob of crème fraîche roughly the size and vibe of a golf ball. It was a pretty weird food.

We are living through an age of unchecked Caesar-salad fraud. Putative Caesars are dressed with yogurt or miso or tequila or lemongrass; they are served with zucchini, orange zest, pig ear, kimchi, poached duck egg, roasted fennel, fried chickpeas, buffalo-cauliflower fritters, tōgarashi-dusted rice crackers. They are missing anchovies, or croutons, or even lettuce … Molly Baz is a chef, a cookbook author, and a bit of a Caesar obsessive—she owns a pair of sneakers with “CAE” on one tongue and “SAL” on the other—and she put it succinctly when she told me, “There’s been a lot of liberties taken, for better or for worse.”



Culture Break


Look. These photos, compiled by our photo editor, show the importance of bicycles in World War II.

Read. “The Vale of Cashmere,” a short story by Benjamin Nugent:

“What I liked about your father was that he helped me find my contact lens.”

Play our daily crossword.


Stephanie Bai contributed to this newsletter.



The Uncomfortable Truth About Child Abuse in Hollywood

During Nickelodeon’s golden era, the network captivated young viewers by introducing them to an impressive roster of comedic talent—who happened to be kids, just like them. Starting in the mid-1990s, actors such as Amanda Bynes, Kenan Thompson, and Ariana Grande became household names, as popular children’s shows including All That, Drake & Josh, and Zoey 101 helped propel Nickelodeon to astronomical ratings. For nearly two decades, the network dominated not just kids’ programming, but the entire cable-TV landscape.

A new docuseries argues that at least some of this success came at a great cost. Quiet on Set: The Dark Side of Kids TV explores troubling allegations of child abuse and other inappropriate on-set behavior during this run at Nickelodeon. The documentary builds on a 2022 Business Insider investigation into programs led by the prolific producer Dan Schneider, and on details from a memoir published earlier that year by the former child star Jennette McCurdy. (McCurdy, who doesn’t identify Schneider by name in her book but describes an abusive showrunner widely believed to be him, was not involved with the documentary.) Over its five episodes, the series offers an important record of how the adults working on these shows—and Hollywood as a whole—repeatedly failed to protect young actors. But Quiet on Set also, perhaps unintentionally, ends up creating a frustratingly tidy narrative that elides some crucial complexities of abuse.

The series spends its first two episodes painting a picture of the toxic environment that Schneider allegedly cultivated for adults and children alike. Two former Amanda Show writers say that Schneider harassed female employees; former All That actors recall their discomfort performing sketches full of racial stereotypes and sexual innuendo. Several interview subjects describe a culture of deference to Schneider, one in which they felt afraid to raise their concerns.

In a video response to the series, Schneider apologized for requesting massages from female staffers, said that he wished he could go back and change “how I treat people,” and conceded that he would be willing to cut any upsetting jokes from his shows that are streaming. (At the end of every Quiet on Set episode, a title card relays Nickelodeon’s response to the producers’ questions: The network said it “investigates all formal complaints as part of our commitment to fostering a safe and professional workplace … We have adopted numerous safeguards over the years to help ensure we are living up to our own high standards and the expectations of our audience.”)

[Read: What tween TV teaches kids]

Quiet on Set shows how the culture of silence created work environments that endangered young performers. The documentary covers multiple harrowing cases of child sexual abuse perpetrated by individuals who worked in close proximity to Nickelodeon’s underage actors. Jason Handy, a production assistant on All That and The Amanda Show, was arrested for lewd acts with children in 2003 and later pleaded no contest to two of the felony counts and one misdemeanor charge. He was sentenced to six years in prison and later arrested on new sex-abuse charges in 2014. In the documentary, the Business Insider journalist Kate Taylor reads stomach-churning quotes from Handy’s journal, before revealing that another Nickelodeon crew member was arrested just four months after him: Brian Peck, a dialogue coach and an occasional actor on All That, was charged with 11 counts of child sexual abuse. After pleading no contest, Peck was convicted of two of the counts against him and sentenced to 16 months in prison.

The documentary’s most shocking revelation is that the unnamed victim in Peck’s case is now an adult who wants to tell his story: The Drake & Josh star Drake Bell, speaking publicly about the abuse for the first time, explains how Peck insinuated himself into Bell’s life after the two met at an Amanda Show table read. “In hindsight, I should’ve been able to see,” Bell says. “But as a kid, you have no clue.” Bell’s chronicle of the abuse is wrenching, in no small part because it underscores how adults failed to keep him and the other children in Nickelodeon’s studios safe from predators.

Quiet on Set argues that Peck’s on-set behavior fits within a larger pattern on Schneider’s shows: boundary-crossing behind the scenes and inappropriate sexual innuendo on the air. In a clip from an old All That episode, a celebrity guest complains of hunger, and Peck’s recurring character, known as “Pickle Boy,” hands him a pickle to eat through a hole in the dressing-room door. The camera zooms in to capture that visual, which clearly evokes a pornographic trope. One former All That actor recalls that, during downtime, Peck would play video games with the children; another reads an old note in which Peck thanked her for walking on his back. The former child actors repeatedly emphasize that although other grown-ups were present on set for many questionable incidents, no one from Nickelodeon ever stepped in. (In his video statement, Schneider says that he didn’t hire Peck and was devastated to hear the allegations of abuse.)

In making many of these stories public for the first time, Quiet on Set is the latest project to expose the ways in which Hollywood enables child sexual abuse—and to call for industry reforms. The former actors speaking in the new series echo many of the sentiments expressed in Dear Hollywood, an incisive podcast by the former Disney Channel ingénue Alyson Stoner. Three years ago, Stoner wrote about a phenomenon they called the “toddler-to-trainwreck pipeline,” describing it as a profitable system that has continued apace since the 19th century by “censoring the harm happening behind the scenes, manicuring aspirational lifestyles and outcomes, and then watching young lives tragically implode.” In their writing and on their podcast, Stoner presents disturbing personal testimony and discusses issues that child stars face, such as the prevalence of eating disorders, fractured family dynamics, and the psychological toll of fame. Stoner also offers concrete steps the industry should take, such as requiring a qualified, third-party mental-health professional on every set.

Last week, Quiet on Set, which was originally billed as a four-part series, released a bonus fifth episode that explores tangible solutions. Shane Lyons, a former All That cast member, said that the first place to start would be updating the law “so that no individual who is a convicted child molester can ever get on a Hollywood set again.” That may sound like an obvious fix. But the California law that details protections for children in the entertainment industry, and which mandates background checks for many professionals who work with child actors, has a major loophole: It doesn’t apply if a parent or guardian is always present with their child on set.

[Read: Don’t judge I’m Glad My Mom Died by its title]

The series makes the limits of this provision—and the stakes of leaving it unchanged—incredibly clear. Even if the onus is on parents to protect their kids, abusers frequently conceal their predatory actions from other adults. What’s more, parents who try to advocate for their kids can end up ostracized, putting their children’s career (and self-esteem) on the line.

The docuseries creates a startling and horrifying picture of how Hollywood’s systemic flaws have long put children at risk. But Quiet on Set also has its shortcomings. The series isn’t always careful with its depictions of alleged victims or of former child stars, especially those who chose not to participate in the project. Amanda Bynes was a key part of Nickelodeon’s rise, but the documentary’s commentary about her closeness to Schneider and her later mental-health struggles sometimes registers as cursory speculation without Bynes there to speak for herself.

[Read: The hard lessons of Amanda Bynes’s comeback]

Parts of Bell’s story are similarly under-contextualized, despite the actor’s heavy involvement in the series: Quiet on Set publicizes the names of several industry figures who wrote letters of support for Peck after his conviction. (These letters were previously sealed, along with other court documents.) Excerpts from some of the 41 letters show just how much backing Peck had in Hollywood, but in its eagerness to implicate others, the series overlooks how Peck may have wielded authority over some of the signatories.

Throughout the series, Peck is described as a master manipulator, someone who infiltrated Bell’s life when the actor was a teenager partly by earning his mother’s trust. But the documentary never meaningfully addresses the fact that some of the performers who wrote letters of support for Peck had met the much older dialogue coach while they, too, were teens. This doesn’t necessarily absolve them of criticism. But the series could have examined how such unequal dynamics can influence young people’s behavior in an ecosystem as insular as children’s programming, and considered the possibility that Peck’s manipulation extended further. Even including the detail of the letter signers’ ages along with this commentary would have provided valuable information to viewers attempting to make sense of the case and how it was perceived at the time.

In the weeks since the documentary began airing, former Nickelodeon fans have criticized many Hollywood figures, including former child actors, for having shown support for Peck. And some of the network’s former actors have faced backlash for simply not speaking up—whether in solidarity with Bell or to publicly share their own negative experiences. In last week’s bonus fifth episode of Quiet on Set, Bell asked that fans be more compassionate toward his mom and reiterated an earlier request for fans to “take it a little easy” on his former co-star Josh Peck (who is no relation to Brian Peck).

In another unfortunate misstep, Quiet on Set avoids wrestling with the full reality of Bell’s life after Peck’s abuse. In 2021, Bell himself pleaded guilty to felony attempted child endangerment and a misdemeanor charge of disseminating matter harmful to juveniles in a case involving a 15-year-old girl, when Bell was 31. The documentary largely brushes past this, allowing Bell to obfuscate the details of these allegations by conflating the case with his other “self-destructive behavior” and suggesting that the media have spread “misinformation” about him.

These oversights undermine the docuseries’ attempts to rigorously confront the pernicious nature of abuse, and instead present viewers with clearly delineated camps of good and evil, perpetrator and victim. This flawed framing has also left Bell’s accuser vulnerable to heightened public scrutiny: After the series premiered, fans began creating TikTok videos discussing the 2021 case. There, and on other social-media platforms, some people shared the accuser’s real name or suggested that she had been lying. People also harassed Bell’s former girlfriend, who in 2020 accused the actor of physical and emotional abuse during their relationship—allegations that Bell has flatly denied as “offensive and defamatory.” Just last week, Bell insisted that he was innocent in the 2021 case (despite already having pleaded guilty) while speaking about Quiet on Set on a podcast, which further emboldened these fans.

Many of these more recent updates couldn’t possibly have been accounted for in a documentary that had already finished filming. But the bonus episode—a coda of sorts—offered a chance for Quiet on Set to reckon with the sad fact that it’s not uncommon for abuse victims to become offenders in adulthood. True intervention requires understanding abuse in ways that aren’t binary, and the series would have benefited tremendously from asking a mental-health expert to talk about these cycles. Protecting children in Hollywood and beyond is a collective effort, one that demands seriously engaging with even the most uncomfortable truths. Quiet on Set marks one important step in that direction, but there’s so much more left to do.


Your Fast Food Is Already Automated

Moments after receiving my lunch order, the robots whirred to life. A clawlike contraption lurched forward, like a bird pecking at feed, to snatch dishes holding a faux-chicken cutlet and potatoes, then inserted them onto a metal track that snakes through a 650-degree-Fahrenheit oven. Seven minutes, some automatic food dispensers, and two conveyor belts later (with a healthy assist from human hands), my meal was sitting on a shelf of mint-green cubbies. It was a vegan fried-chicken sandwich, a cucumber salad, crispy potatoes, and a smattering of other sides.

This is Kernel, a fast-casual venture that opened its first store, in Manhattan, this February. Its founder, Steve Ells, kicked off the lunch-bowl boom when he started Chipotle in 1993. Now, he told me during my visit, he is betting that machines will trigger a “reinvention of how a fast-food or fast-casual restaurant can run.” Robots, he prophesied, will bring faster and more accurate service at lower overhead costs. Plenty of chains have tested out semi-automated cooking, with mixed success—including deep-frying robots at Jack in the Box and robotic bowl assembly at Sweetgreen and Chipotle. But Kernel has been built from the ground up for robots. Just three employees are needed in the restaurant at any time, compared with the dozen required for a typical fast-casual restaurant. Soon many more people may be eating robot-prepared vegan chicken: Ells has raised $36 million and hopes to expand quickly, starting with several more locations throughout New York City this year.

But robots may represent less of a fast-food revolution than the obvious next step in its evolution. For more than a century, technology has made fast food more efficient—and, in particular, more automated. That’s what turned McDonald’s into a giant 60 years ago. Such restaurants can be considered “sort of mini-factories,” Dave Henkes, a food-industry analyst at Technomic, told me, and have always used “automation to drive speed and convenience.” And, like the simpler cooking technology before them, today’s robots are speeding up humans’ work without fully replacing them. For now, Kernel is no different.

[Read: A robot’s nightmare is a burrito full of guac]

Kernel’s entirely vegan menu is limited (Ells prefers “focused”), but everything looked and tasted like it came from fine dining. That is no coincidence: Kernel’s chief culinary officer, Andrew Black, was a sous-chef at Eleven Madison Park, a three-Michelin-star restaurant with a $365 tasting menu, located a block away from Kernel. While I ate, he and Ells gave passionate spiels about each item: The marinated beets, a surprise best seller, are topped with quinoa, green hummus, and a seed crunch to make the dish nutritionally complete. For the crispy potatoes, Black specially selected a spud variety for its sugar, starch, and water content, and they’re then cooked three times—steamed, fried, baked—to achieve a shattering crunch and pillowy interior. Black and his staff dredge and fry every piece of “chicken” by hand; as I bit into my sandwich, Ells mused that they should try swapping imitation meat for a block of tofu.

Simply put, Kernel is a group of excellent chefs equipped with the world’s most high-tech toaster oven. All the food is cooked by chefs at a central kitchen about 10 minutes away, delivered hourly by a bicycle courier, and heated by a robot. That off-site preparation, Ells told me, provides at least 80 percent of the menu’s quality. The food then still has to be assembled by three other people. Human one, the “replenisher,” loads the hourly delivery of prepared food onto a shelf that the robotic arm can reach. The “assembler” puts together every sandwich and side, and a third person, the “bundler,” bags each order and places it in a cubby.


The setup is “extraordinarily fast, accurate, and predictable,” Ells told me, nothing less than a “paradigm shift.” Employees barely have to move their feet. But a robot that heats and moves around your food is just the next iteration in the pursuit of speed and standardization. The restaurant with the strongest claim to inventing fast food may be White Castle, which, in 1921, “did something that was unusual for the time—they tried to standardize their operations from restaurant to restaurant,” David Hogan, a fast-food historian at Heidelberg University, in Ohio, and the author of Selling ’em by the Sack: White Castle and the Creation of American Food, told me. Cooking procedures were precise and uniform; cooking implements were manufactured in a single location; even the physical buildings came out of a central factory.

The playbook hasn’t substantively changed since. Before buying McDonald’s and launching its global success, Ray Kroc sold the chain automatic milkshake mixers. What first captivated him about the restaurant, he wrote in his 1977 memoir, was how “each step in producing the limited menu was stripped down to its essence and accomplished with a minimum of effort.” That year, the Bureau of Labor Statistics published a study noting that fast-food chains had “introduced principles of industrial engineering” to restaurants. In particular, “the off-premise preparation of foods” and improved “cooking devices,” such as microwaves and convection ovens, reduced preparation time and added uniformity. Restaurants today use specialized equipment, extensive training manuals, and various trackers to ensure speed and consistency. Sweetgreen has an app that instructs employees exactly how to heat and prepare food, and McDonald’s cooks beef patties for precisely 42 seconds. If anything, Kernel’s off-site kitchen is conceptually closer to the centrally prepared, frozen patties and fries served by fast-food burger joints of old than the chicken grilled on-site at a Chipotle.

[Read: Too many Americans are missing out on the best kitchen gadget]

To the extent that Kernel is a reinvention, Ells hasn’t invented a new paradigm so much as found another. Sweetgreen already acts like a tech company, and Domino’s has touted itself as one. Now Ells talks about his robot-assisted process as an “operating system.” What may one day distinguish Kernel’s automation is that the space is designed for robots from inception. So far, other chains have retrofitted human kitchens with robots, which creates confusion and disaster, Stanislav Ivanov, who studies robotics and restaurants at Bulgaria’s Varna University of Management, told me. Robots malfunction, and even when they don’t, bulky machines interfere with equipment, stations, and a floor plan designed for human movement. In 2018, an early burger-flipping robot that was tested at a CaliBurger in Pasadena was temporarily decommissioned because it couldn’t be incorporated into the human workflow.

Kernel is, at least in theory, built for “the technology that we know is coming,” Ells said. The equipment is all mobile and can be swapped or calibrated for newer gadgets (permanent counters, ovens, and stovetops, for instance, are unnecessary because robots don’t care if workstations are waist-height). Drones could bring prepared food from the central kitchen to restaurants, and robots might assemble burgers in their entirety. Efficient robots and a vegan menu, he said, will continue to reduce the restaurant’s carbon footprint. Gesturing to the “bundler” who bagged all the food, Ells said, “Instead of Carlos, imagine a robot arm.” (Carlos kept bundling without so much as a flinch.)

With automation, of course, comes the risk of disappearing jobs. Kernel and other restaurants are experimenting with robots not only in pursuit of efficiency, but because the industry is facing a chronic labor shortage. The low pay doesn’t help, and the jobs are exhausting and, at times, hazardous. Deep-frying, for instance, is extremely dangerous, which is why one of the most popular cooking robots in the industry simply runs the fry station. Fast-food chains pursuing automation are trying to reduce head count, especially as some states raise their minimum wages. But for now, Henkes said, robots have typically led restaurants to redeploy people to different positions. Ells claimed that Kernel’s existing employees, who are currently paid $25 an hour, will eventually be moved to more front-of-house jobs, helping guests and monitoring the robots.

But a burger prepared, cooked, and served entirely without a human touch is still speculation. Faster and more automated cooking technology may well be imminent, but humans will still be involved for years to come. Automated pizzaiolos, line cooks, and salad tossers have failed; successful robots typically target a specific task, such as plunging fries into boiling oil.

Just as the quality of Kernel’s food depends on human chefs, the quality of its automation will depend less on technology than on human vision and feedback. Each day, employees meet to discuss what worked and what didn’t, which will help iterate the technology: Dozens of bugs, including stalled production and locked cubbies, have been smoothed out. On the first day, the cubbies didn’t open; last month, a stray potato shut down Kernel’s production line. Kernel is building new tools, but relying on the same human logic that made White Castle, McDonald’s, and Chipotle successful. I came to the restaurant to witness fancy robots, but I would return simply for the faux-chicken sandwich and the cucumbers topped with cashews and chili jam. Kernel the restaurant is far more impressive than Kernel the tech company.


The Columbine-Killers Fan Club

Mass shootings didn’t start at Columbine High, but the mass-shooter era did. Eric Harris and Dylan Klebold’s audacious plan and misread motives multiplied the stakes and inspired wave after wave of emulation. How could we know we were witnessing an origin story?

The legend of Columbine is fiction. There are two versions of the attack: what actually happened on April 20, 1999, and the story we all accepted back then. The mythical version explained it all so cleanly. A pair of outcast loners dubbed the “Trench Coat Mafia” targeted the jocks to avenge years of bullying. Dwayne Fuselier, the supervisory special agent who led the FBI’s Columbine investigation, is fond of quoting H. L. Mencken in response to the mythmaking: “There is always a well-known solution to every human problem—neat, plausible, and wrong.”

The legend hinges on bullying, but the killers never mentioned it in the huge trove of journals, online posts, and videos they left to explain themselves. The myth was so insidious because it cast the ruthless killers as heroes of misfits everywhere. Fuselier warned how appealing that myth would sound to anyone who felt ostracized. Within a few years, the fledgling fandom would find one another on social media, where they have operated ever since.

Around the world, Eric and Dylan are idolized as champions of “the nobodies.” Eric hated the nobodies. He mocked them mercilessly on his website and in his journal. He wasn’t a loner or an outcast, and neither was Dylan. Eric and Dylan made clear in their writings that they were planning the attack for their own selfish motives—certainly not to help the kids they ridiculed at the bottom of the social food chain.

[Read: The Columbine blueprint]

They were not in the Trench Coat Mafia. They were not Nazis or white supremacists, and they did not plan the attack for Hitler’s birthday. They did not target jocks, Christians, or Black people. They targeted no one specifically. They shot randomly and designed their bombs to kill indiscriminately. That’s where “they” ends: Their polar-opposite personalities drove opposite motives. Psychopaths are devoid of empathy; Eric was a sadistic psychopath who killed for his own aggrandizement and enjoyment. Dylan was suicidally depressed and self-loathing. Eric lured him into punishing the world for the pain it inflicted on him, instead of punishing himself. Columbine was a suicide plan, but on “Judgment Day,” as they called it, Dylan would show the world the “somebody” we’d never seen.

The Columbine killers have fans. Eric and Dylan’s adoring online following spreads across nearly every continent, and it’s growing across multiple platforms. The Russian government, which has been plagued by an explosion of both Columbine fandom and mass shootings, estimates that more than 70,000 members exist worldwide. They call themselves the TCC, for “True Crime Community,” and I’ve spent much of the past 15 years inside their online world. My book Columbine made me enemy No. 1 for portraying Eric and Dylan as ruthless murderers.

In 2016, a young fan tweeted: “hey @DaveCullen block me or else i shoot my school.” She’d been ranting for hours, posting pictures of school shooters, and tweets such as: “It’s also something a lot of people need, To die….I wish i was dead…I LIKE VIOLENCE…I want to be killed in front of an audience. … I think someone failed to abort me (:”

These teens are ensnared in an American tragedy that just keeps growing worse.

[Diagram: shootings inspired by Columbine]

I’ve tried to leave this story so many times, but this diagram haunts me, ruthlessly expanding like an unstoppable spider web, devouring all the lives and futures in its path. It demands that we address the cause—25 years too late. That web is made up of 54 mass shootings that have killed nearly 300 people and wounded more than 500. And every gunman left evidence that they were inspired or influenced by the murderers at Columbine. The Columbine effect.

Eric and Dylan’s bombs failed. Yet the legend made them heroic to their progeny and gave birth to their fandom. By the tenth anniversary, a small band of “Columbiners” had formed online. They gravitated to the TCC, to Ted Bundy, to the younger Tsarnaev brother, to Dylann Roof, and to others—but Eric and Dylan are the megastars. The groupies multiply, as fresh crops of teens join their ranks each season.

Most gunmen die in the act, so the 54 attacks itemized in the diagram are just the ones that we know of, and that were carried out. A 2015 Mother Jones investigation of Columbine copycats found more than two thwarted attacks for each one that succeeded. It identified 14 plotters targeting Columbine’s anniversary and 13 striving to top its body count. Surviving mass shooters have admitted that they were competing with one another.

[From the March 2024 issue: To stop a shooter]

All roads lead back to Columbine. The Virginia Tech shooter, Seung-Hui Cho, wrote in a school assignment that he wanted to “repeat Columbine” and that he idolized its “martyrs.” The Northern Illinois University killer marked a third generation, explicitly inspired by both Virginia Tech and Columbine. Sandy Hook was the fourth generation; Adam Lanza had studied all three. Six more school shooters later referenced Sandy Hook and Columbine. Five generations of fallout, all reenacting the original legend.

Most early Columbiners were just curious teenagers interested in the criminal mind or in analyzing Columbine. Many still are, and their analyses are often useful. Many are angry about being tarred with the group’s reputation, but they have been outnumbered by new arrivals unabashedly calling themselves fans. Many use the killers’ faces as avatars, extoll their virtues, and compose love poems, fan fiction, and gory memes about them. Sue Klebold said she was shocked by the volume of letters she received calling Dylan “heroic” and by the number of girls saying, “I wish I could have his baby.”

How little these groupies know about the murderers they obsess over is ironic. They keep repeating the misreporting that was debunked decades ago, convinced it’s true because it has metastasized into TCC dogma. The TCC twists the story to recast the murderers as victims; and the dead, wounded, and traumatized as villains. The groupies didn’t start these myths; we in the media bear that shame. But the groupies are now the carriers, spreading the legend of Dylan and Eric to remote reaches of the globe.

Seventy thousand is a tiny fraction of the adolescent population, but a magnet for a dangerous cohort of marginalized, disaffected, and hopeless teens—a major pool of aspiring shooters. Most TCC members outright say that they condone the Columbine murders, often in their profiles. They have turned Eric and Dylan into folk heroes, and they celebrate them as avenging angels. Adam Lanza obsessed over the Columbine killers and spent years immersed in these groups online. Then he murdered 20 little kids and six adults at Sandy Hook Elementary School.

Here’s the twist: Most of the TCC members I’ve engaged with describe themselves as awkward outcasts desperate to fit in. The TCC embraces them. The TCC feels cool—Eric and Dylan are super cool—and so they finally feel cool. I find it heartbreaking to hear them describe the pain they endure at school and the affinity they feel for “Dylan” and “Eric,” the fictional characters they’ve constructed. These kids are shocked when I tell them that other members of the TCC have told me the same—that they are putting on the same show, sure that all the others really mean it. Did Adam Lanza believe the posers? We’ll never know, but we can be certain that as you read this, a distraught, lonely kid somewhere is contemplating an attack—and the one community they trust is screaming, Do it!

[Elaine Godfrey: The club that no one wants to join]

Lots of kids fantasize about killing. Two days after Columbine, Salon ran “Misfits Who Don’t Kill,” in which three people came clean about their youthful fantasies of enacting mass murder. The phenomenon was widely reported that week. But none of those people did anything, because they knew how horribly wrong acting out the fantasy would be. Inside the TCC bubble, the constant message is that if your classmates are tormenting you, killing them is not just moral—it’s heroic and noble.

The TCC has a tell: Actual shootings unnerve them. Their posts grow quiet, respectful, and even mournful after some troubled young person heeds their call. I can gauge the change instantly, because the incessant harassment I get from them stops cold—for a week or two. Parkland was different: Six months went by before the taunts began trickling back in, and I haven’t gotten a death threat in the six years since. Why? I have no way to be certain about this, but my educated guess is that David Hogg, X González, and the rest of the March for Our Lives kids were suddenly cooler than the young shooters. And so much more powerful.

Eric and Dylan weren’t powerful—their plan failed. They’d planned Columbine as a bombing, the primary terrorist tactic. They thought they were launching a three-act drama: The cafeteria bombs would kill nearly 600 people instantly; what they called the “fun” part would be shooting up hundreds of survivors; and the massive car bombs set in the parking lot outside were to be the coup de grâce. Those bombs were timed to detonate 45 minutes after the initial blast, wiping out countless more survivors and first responders, live on national TV. The Columbine killers’ performance was staged as the most apocalyptic made-for-TV horror film in American history. Eric complained in his journal that his “audience” would fail to understand. He got that right. He got everything else wrong.

Every element fizzled. All of the big bombs failed. Eric and Dylan went down to the cafeteria in a last desperate move to ignite the bombs with gunfire and a Molotov cocktail. Failed. Experts on psychopaths say they get bored after their initial kills, and Eric had likely lost interest. His gun’s recoil had broken his nose, so he spent that time in acute pain. The cops refused to kill them in the blaze of glory that they’d described as their final curtain. The smell of all the blood and already decomposing bodies was overpowering. Out of options, each shot himself in the head.

A more obscene and pathetic way to die is hard to imagine. Yet their fans have never confronted that ugly reality, because the opposite story took hold, making Eric and Dylan masterminds of the “worst school shooting in American history.”

The Columbine effect has gone global. It has inspired mass shootings in Finland, Sweden, Brazil, Mexico, Canada, France, Germany, the Netherlands, Ukraine, and Russia—as well as knife and axe attacks in places as remote as Siberia. In 2022, Russia designated the online “Columbine movement” a terrorist group. To comply with the ruling, my publisher required me to disavow the group in the Russian translation of Columbine. Mass murder inspired by those inept perpetrators is America’s most revolting cultural export.

I know when the TCC colonizes a new region, because I start getting a barrage of taunts in a different language. It’s a social contagion. Researchers have described school shootings as the American equivalent of suicide bombings—an ideology joined with a tactic. The phenomenon is escalating and self-perpetuating.

The Columbine groupies have no idea that they’re exporting a fraud. The media set this whole thing in motion 25 years ago. To untell a legend is a formidable task. It will be possible only when the media finally begin to convey how pathetic and gruesome the killers’ final moments were. The fans need to hear the ugly truth. Eric and Dylan viciously murdered innocent kids for their own selfish and petty agendas, and they died miserable failures.


This essay is adapted by the author from the new preface to a 25th-anniversary edition of Columbine.


The Illogical Relationship Americans Have With Animals


American society has a confused, contradictory relationship with animals. Many dog owners have no compunction about eating feedlot-raised pigs, animals whose intelligence, sociality, and sentience compare favorably with their shih tzus and beagles. Some cat lovers let their outdoor felines contribute to mass bird murder. A pescatarian might claim that a cod is less capable of suffering than a chicken. Why do some species reside comfortably within our circles of concern, while others squat shivering beyond the firelight, waiting for us to welcome them in?  

In Our Kindred Creatures, their meticulously researched history of the dawn of the animal-rights movement, Bill Wasik and Monica Murphy argue that America’s animal attitudes were largely shaped over a period spanning the mid-1860s to the mid-1890s. It was during those decades, Wasik and Murphy write, that many Americans came to realize that animals weren’t mere “objects” but “creatures whose joys and sufferings had to be taken into consideration.”

This moral awakening, described by one contemporaneous journalist as a “new type of goodness,” still influences Americans’ love of certain animals today, and our indifference toward many others. These disparate feelings, Wasik and Murphy suggest, are an inheritance from that late-1800s era. They are also influenced by spatial and psychic proximity: Most people are more likely to care about the well-being of a pet with whom they cohabit than a pig that resides in a slaughterhouse. The future of animal welfare in the United States may depend on whether Americans can expand their concern beyond the boundaries drawn by 19th-century reformers—whether, as Wasik and Murphy put it, we can apply our “reservoirs of pet love” to other, more distant creatures.

[Read: The meat paradox]

Wasik and Murphy’s book often makes for disturbing reading, so unflinchingly does it document humankind’s capacity for cruelty. In the 19th century, horses, ubiquitous beasts of burden in the pre-automotive age, were whipped mercilessly and forced to haul impossibly heavy loads. Medical-school instructors vivisected rabbits in anatomy lessons. High-society women sported fanciful hats adorned with the plumes of egrets, terns, and other birds “slaughtered wholesale for the cause of fashion”; offshore bobbed ships full of live sea turtles flipped on their shells, slowly dying as they waited to become soup. Every day in New York City, stray dogs were rounded up and “killed by drowning in a giant metal box … used to dispatch some sixty to eighty dogs at a time.”

Although Wasik and Murphy make the case that women eventually became central to the animal-rights movement, their account focuses principally on two men who were among its most forceful leaders. One is Henry Bergh, the dyspeptic heir to a shipbuilding fortune who embraced animal welfare after watching a bullfighting exhibition in Spain. Bergh’s approach was a punitive one: Beginning in the 1860s, he cajoled New York’s legislators into passing welfare laws, then, under the auspices of a new organization called the American Society for the Prevention of Cruelty to Animals, delegated agents to enforce those laws in cooperation with local police. His counterpart was George Angell, the president of the Massachusetts SPCA and the son of a Baptist preacher, who founded a newsletter called Our Dumb Animals and packed its pages with treacly poetry and stories written from the perspective of horses. Angell was a skilled rhetorician and salesman: When a compassionate “autobiography of a horse” called Black Beauty was published in the United Kingdom, Angell reprinted it in the U.S. (ignoring its original publisher’s copyright) and marketed it so ardently that one reporter speculated it would outsell the Bible.

Through legal and moral suasion, Bergh, Angell, and their conspirators made rapid progress. They passed laws preventing horse abuse, broke up dog-fighting rings, and nudged the meat industry to adopt less crowded train cars for cattle. In Philadelphia, a reformer named Caroline White opened a humane dog shelter at which strays were “fed a healthy diet of horsemeat, cornmeal, and crisped pork skin.” Those who weren’t adopted were euthanized in a carbon-dioxide chamber, which was thought to be less painful than drowning. Some species, then as now, were easier to promote than others: Bergh’s prosecution of a ship captain for mistreating sea turtles failed when a judge absurdly ruled that turtles were fish, and thus not subject to new welfare laws. Such setbacks notwithstanding, near the end of the 19th century, 39 of the country’s 44 states had adopted laws proscribing animal cruelty.


Although Wasik and Murphy share their subjects’ sympathies, they are admirably clear-eyed about their deficiencies, including some lamentable anti-science sentiments. Wasik and Murphy’s previous book, Rabid, tackled the history of rabies, and Our Kindred Creatures, too, spends time on that dread disease. Rabies, a common and deadly scourge in the 19th century, posed a contradiction to animal advocates. On the one hand, the development of a human rabies vaccine in 1885 was good for dogs: Once pooches were no longer terrifying disease vectors, people could welcome them into their home without reservation. On the other hand, the vaccine’s creation entailed copious animal experimentation, including “cerebral inoculation,” whereby researchers drilled holes in anesthetized animals’ skulls to infect them. Bergh and his allies deemed the rabies vaccine a “hideous monstrosity” and campaigned against its “evils,” seeming to recognize only the cruelties associated with the vaccine, and not its ultimate benefits.

Early welfarists had another blind spot: agriculture. Although Bergh and his allies occasionally waded into livestock advocacy, they railed primarily against abuses they could see: the horse whipped by his rider, the dog kicked by her owner. To Bergh’s mind, such public displays inculcated a culture of cruelty—the notion, as Wasik and Murphy put it, that witnessing meanness had a “coarsening influence on human minds … priming them for further acceptance of cruelty against man and beast alike.”

But a worldview focused on the prevention of visible cruelty proved a poor match for the meat industry. The slaughterhouses and packing plants that sprang up in Chicago in the late 1800s, for instance, concealed the brutality of their slaying methods—cows battered in the head, the occasional still-living pig dunked in boiling water—behind closed factory doors. Humane groups mostly ignored meatpacking’s horrors. The Illinois Humane Society even appointed the meat magnate Philip Armour to its board of directors and wrote him a praiseful obituary that, as Wasik and Murphy write, washed “away the blood of the countless millions of animals so cruelly disassembled in his slaughter factories.”

That cognitive dissonance—“the selective care for certain species and not others”—still afflicts American society. In their afterword, Wasik and Murphy argue that modern Americans, like their 19th-century forebears, need to adopt their own new “goodness,” one that emphasizes a “systems-driven moral thinking.” The misery of sows held captive in feedlots, or the suffering of wild creatures evicted by habitat loss, must become as real and urgent as the pain of chained dogs and starved cats. Meat-loving Americans would do well, Wasik and Murphy write, to reconsider the “patterns of consumption” that have led to the confinement of about 99 million cows and 74 million pigs. They might use their concern for pets as “well-springs from which to love, and to aid, all those distant, unseen animals we know only as abstractions.”

It’s a welcome proposal. Aside from that brief afterword, though, Wasik and Murphy’s book is almost entirely a study of the past. Our Kindred Creatures would have benefited from a more thorough examination of how early animal-welfare campaigns still reverberate—or don’t—today. Does P. T. Barnum’s deplorable treatment of captive beluga whales in the 19th century inform the campaign to free orcas and other cetaceans housed in modern aquariums? How have Indigenous-led efforts to restore bison to North America’s prairies managed to grow from the poisoned soil of 19th-century buffalo massacres? Lingering in the present would have made for a different—and longer—book, but also, perhaps, a more resonant one.

[Read: How P. T. Barnum helped the early days of animal rights]

Our Kindred Creatures also could have spent more time on the evolution of wildlife conservation. At the animal-welfare movement’s outset, some of the same people and groups who inveighed against horse beatings and dog drownings also fought the annihilation of bison and birds. But those causes soon diverged, as scientists and upper-crust sportsmen came to dominate conservation and largely squeezed out the lay crusaders who had launched welfarism. Today, many animal-welfare groups focus on pets and livestock, while organizations such as the National Wildlife Federation and the World Wildlife Fund advocate for their free-roaming brethren. Some scientists seek to reunify conservation and animal rights via the wild-animal-welfare movement, which works to both protect creatures and make their daily lives more pleasant—for example, by studying the effects of light pollution on owls, and by sponsoring research that provides birth control to overpopulated and starving pigeons in urban areas. After more than a century of divergence, animal welfarism and conservation may once more align, potentially to the benefit of the wild creatures whose lives have been immiserated by human activity.

Ultimately, in spite of its accomplishments, the crusade launched by Bergh, Angell, and their peers remains unfinished. As Wasik and Murphy point out, early welfarists were fond of analogies as a rhetorical tool. Some activists even extended the logic of animal rights to protect children from domestic abuse; in one instance the authors write about, Bergh dispatched ASPCA agents to rescue a mistreated child and prosecuted one of the first child-welfare cases on her behalf. If the modern animal-rights movement is to continue racking up victories, more Americans should perhaps think in analogy. If dogs and cats deserve good lives, why not cows, pigs, and chickens? If elephants, tigers, and other large, charismatic mammals are worthy of protection, why not bats, reptiles, insects, and other smaller, less endearing critters? Animals have long been beset by not only human cruelty but also human hypocrisy. What they need now, perhaps, is moral consistency.


How Being Busy Became a Status Symbol

This is an edition of The Wonder Reader, a newsletter in which our editors recommend a set of stories to spark your curiosity and fill you with delight. Sign up here to get it every Saturday morning.

On our How to Keep Time podcast last year, co-host Becca Rashid shared an anecdote that has long stuck with me. “I was having lunch with a friend last weekend who was trying to organize a birthday party for her colleague,” she began. “And, typical story, she said she was having trouble gathering everyone because everyone was too busy and it was impossible to get them to commit.”

The unforgettable part is this: One person in the group apparently said that she couldn’t make it because “she had to go to Crate & Barrel at 7 p.m. on a Friday.”  (Co-host Ian Bogost’s response—“She had a flatware appointment?”—never fails to make me chuckle.) The anecdote is equal parts amusing and concerning: What has modern life come to if shopping for dishes must be scheduled in the same way that work meetings are? Today’s newsletter explores the many different meanings of “I’m so busy,” and what we miss when our focus is on being busy above all else.


On Being Busy

How to Be Less Busy and More Happy

By Arthur C. Brooks

If you feel too rushed even to read this, then your life could use a change.

Read the article.

‘Ugh, I’m So Busy’: A Status Symbol for Our Time

By Joe Pinsker

Once, long ago, being richer meant working less. (From 2017)

Read the article.

Why Americans Suddenly Stopped Hanging Out

By Derek Thompson

Too much aloneness is creating a crisis of social fitness.

Read the article.



P.S.

Feeling too busy to read? Here are five books that’ll easily fit into your schedule.

Before you go: Do you encounter wonder, in forms big or small, throughout your days? I’d love to know what sparks your sense of awe in the world: nature, your pet, a cookie, whatever “wondrous” means to you. Send a photo so we can share your wonder with fellow readers in a future edition of this newsletter or on our website.

Please include your name (initials are okay), age, and location. By doing so, you agree that The Atlantic has permission to publish your photo and publicly attribute the response to you, including your first name and last initial, age, and/or location that you share with your submission.

— Isabel
