The Future of Monkeypox

The World Health Organization has recommended a new name for monkeypox, asking countries to forget the original term in favor of a new one, “mpox,” that scientists hope will help destigmatize the disease. But in the United States, the request seems to be arriving late. The outbreak here has been in slow retreat for months—and has already left many Americans’ minds.

About 15 cases are now being recorded among Americans each day, less than 4 percent of the tally when the surge was at its worst. After a sluggish and bungled early rollout, tests and treatments for the virus are more available; more than a million doses of the two-shot Jynneos smallpox vaccine have found their way into arms. San Francisco and New York—two of the nation’s first cities to declare mpox a public-health emergency this past summer—have since allowed those orders to expire; so have the states of New York and Illinois. “I think this is the endgame,” says Caitlin Rivers, an infectious-disease epidemiologist at the Johns Hopkins Center for Health Security.

But “endgame” doesn’t mean “over”—and mpox will be with us for the foreseeable future. The U.S. outbreak is only now showing us its long and ugly tail: 15 daily cases is not zero daily cases; even as the number of new infections declines, inequities are growing. Black and Latino people make up a majority of new mpox cases and are contracting the disease at three to five times the rate of white Americans, but they have received proportionately fewer vaccines. “Now it’s truly the folks who are the most marginalized that we’re seeing,” says Ofole Mgbako, a physician and population-health researcher at New York University. “Which is also why, of course, it’s fallen out of the news.” If the virus sticks around (as it very likely could), and if the disparities persist (as they almost certainly will), then mpox could end up saddling thousands of vulnerable Americans each year with yet another debilitating, stigmatized, and neglected disease.

[Read: America should have been able to handle monkeypox]

At this point, there’s not even any guarantee that this case downturn will persist. “I’m not convinced that we’re out of the woods,” says Sara Bares, an infectious-disease physician at the University of Nebraska Medical Center, in Omaha. Immunity, acquired through infection or vaccines, is now concentrated among those at highest risk, says Jay Varma, a physician and epidemiologist at Weill Cornell Medicine. But researchers still don’t know how well those defenses can stave off another infection, or how long they might last—gaps in knowledge that may be tough to fill, now that incidence is so low. And although months of advocacy and outreach from the LGBTQ community have cut down risky sexual activities, many cautionary trends will eventually reset to their pre-outbreak norm. “We know extensively from other sexually transmissible infections that behavior change is not usually the most sustained response,” says Boghuma Kabisen Titanji, an infectious-disease physician at Emory University.

At the same time, this year’s mpox outbreaks are stranger and more unwieldy than those that came before. A ballooning body of evidence suggests that people can become infectious before they develop symptoms, contrary to prior understanding; some physicians are concerned that patients, especially those who are immunocompromised, might remain infectious after the brunt of visible illness resolves, says Philip Ponce, an infectious-disease physician at the University of Texas Health Science Center at San Antonio and the medical director of San Antonio’s Kind Clinic. (Some 40 percent of Americans who have been diagnosed with mpox are living with HIV.) Researchers still don’t have a good grip on which bodily fluids and types of contact may be riskiest over the trajectory of a sickness. Cases are still being missed by primary-care providers who remain unfamiliar with the ins and outs of diagnosis and testing, especially in people with darker skin. And although this epidemic has, for the most part, continued to affect men who have sex with men, women and nonbinary people are getting sick as well, to an underappreciated degree.

[Read: What should worry most Americans about our monkeypox response]

Intel on the only mpox-fighting antiviral on the shelf, a smallpox drug called tecovirimat, also remains concerningly scant, even as experts worry that the virus could develop resistance. The treatment has been given a conditional green light for use in people who are currently, or at risk of becoming, severely sick. Anecdotally, it seems to work wonders, shaving days or weeks off the painful, debilitating course of symptoms that can send infected people into long-term isolation. But experts still lack rigorous data in humans to confirm just how well it works, Bares, who’s among the scientists involved in a nationwide study of the antiviral, told me. And although clinical trials for tecovirimat are under way, she added, in the U.S., they’re “struggling to enroll patients” now that infections have plummeted to such a sustained low. It’s a numerical problem as well as a sociocultural one. “The urgency with which people answer questions declines as case counts go down,” Varma told me.

Recent CDC reports show that a growing proportion of new infections aren’t being reported with a known sexual-contact history, stymieing efforts at contact tracing. That might in part be a product of the outbreak’s gradual migration from liberal, well-off urban centers, hit early on in the epidemic, to more communities in the South and Southwest. “In small towns, the risk of disclosure is high,” Bares told me. In seeking care or vaccination, “you’re outing yourself.” When mpox cases in Nebraska took an unexpected nosedive earlier this fall, “a colleague and I asked one another, ‘Do you think patients are afraid to come in?’” Those concerns can be especially high in certain communities of color, Ponce told me. San Antonio’s Latino population, for instance, “tends to be much more conservative; there’s much more stigma associated with one being LGBT at all, let alone being LGBT and trying to access biomedical interventions.”

Hidden infections can become fast-spreading ones. Monitoring an infectious disease is far easier when the people most at risk have insurance coverage and access to savvy clinicians, and when they are inclined to trust public-health institutions. “That’s predominantly white people,” says Ace Robinson, the CEO of the Pierce County AIDS Foundation, in Washington. Now that the mpox outbreak is moving out of that population into less privileged ones, Robinson fears “a massive undercount” of cases.

Americans who are catching the virus during the outbreak’s denouement are paying a price. The means to fight mpox are likely to dwindle, even as the virus entrenches itself in the population most in need of those tools. One concern remains the country’s vaccination strategy, which underwent a mid-outbreak shift: To address limited shot supply, the FDA authorized a new dosing method with limited evidence behind it—a decision that primarily affected people near the back of the inoculation line. The method is safe but tricky to administer, and it can have tough side effects: Some of Titanji’s patients have experienced swelling near their injection site that lasted for weeks after their first dose, and now “they just don’t want to get another shot.”

The continued shift of mpox into minority populations, Robinson told me, is also further sapping public attention: “As long as this is centered in BIPOC communities, there’s going to be less of a push.” Public interest in this crisis was modest even at its highest point, says Steven Klemow, an infectious-disease physician at Methodist Dallas Medical Center and the medical director of Dallas’s Kind Clinic. Now experts are watching that cycle of neglect reinforce itself as the outbreak concentrates further in marginalized communities, including those that have for decades borne a disproportionate share of the burden of sexually associated infections such as syphilis, gonorrhea, and HIV. “These are not the groups that necessarily get people jumping on their feet,” Titanji told me.

Some of the people most at risk are moving on as well, Robinson told me. In his community in Washington, he was disappointed to see high rates of vaccine refusal at two recent outreach events serving the region’s Black and American Indian populations. “They had no knowledge of the virus,” he told me. Titanji has seen similar trends in her community in Georgia. “There’s some sense of complacency, like, ‘It’s no longer an issue, so why do I need to get vaccinated?’” she said.

[Read: America is zooming through the pandemic panic-neglect cycle]

The tide seems unlikely to shift. Even tens of thousands of cases deep into the American outbreak, sexual-health clinics—which have been on the front lines of the mpox response—remain short on funds and staff. Although the influx of cases has slowed, Ponce and Klemow are still treating multiple mpox patients a week while trying to keep up the services they typically offer—at a time when STI rates are on a years-long rise. “We’re really assuming that this is going to become another sexually associated disease that is going to be a part of our wheelhouse that we’ll have to manage for the indefinite future,” Klemow told me. “We’ve had to pull resources away from our other services that we provide.” The problem could yet worsen if the national emergency declared in August is allowed to expire, which would likely curb the availability of antivirals and vaccines.

Rivers still holds out hope for eliminating mpox in the U.S. But getting from low to zero isn’t as easy as it might seem. This current stretch of decline could unspool for years, even decades, especially if the virus finds a new animal host. “We’ve seen this story play out so many times before,” Varma told me. Efforts to eliminate syphilis from the U.S. in the late ’90s and early 2000s, for instance, gained traction for a while—then petered out during what could have been their final stretch. It’s the classic boom-bust cycle to which the country is so prone: As case rates fall, so does interest in pushing them further down.

Our memories of public-health crises never seem to linger for long. At the start of this mpox outbreak, Titanji told me, there was an opportunity to shore up our systems and buffer ourselves against future epidemics, both imported and homegrown. The country squandered it and failed to send aid abroad. If another surge of mpox cases arrives, as it very likely could, she said, “we will again be going back to the drawing board.”


Seven Books That Will Make You Smarter

The cover of a nonfiction book is like the hood of an automobile: Nudge it open, and you’ll find sentences like cylinders and pistons folded and coiled together, an engine ready to propel us toward answers to daunting questions. How did life begin? What is art for? What transpires inside our cells? How do our nation’s values hold up in an era of accelerating change? The best nonfiction does more than just assemble information. It takes a reader through curious landscapes, offering a deeper grasp of how the world moves and, most important, what moves it.

The seven nonfiction titles below are not textbooks; they’re accessible to lay readers, give an overview of crucial topics, and can serve as a jumping-off point for further research. They investigate what our society values and what it’s built on, driving us to the monumental, the sublime, the quintessentially human.



Transformer: The Deep Chemistry of Life and Death, by Nick Lane

Lane, an evolutionary biochemist, is eminently qualified to investigate how we define life. In an earlier book, The Vital Question, he put forward a provocative hypothesis on how cells formed from the jostle of atoms. For almost 2 billion years, he proposed, bacteria and archaea, two of the three kinds of organisms that exist on our planet, churned through ancient oceans until a single archaeon swallowed a bacterium and emerged as the mother of all multicellular descendants—including us. With Transformer, he continues his indefatigable exploration of the genesis of biology. Lane focuses on millions of years of evolution and the planet’s twists and turns—the Great Oxidation Event, the Cambrian explosion—while profiling visionary scientists. He beautifully lays out the sheer improbability of our biosphere, explains why life may be exceedingly rare in our universe, and considers death as a process, not simply as an instantaneous end.



David Smith: The Art and Life of a Transformational Sculptor, by Michael Brenson

Throughout his career, the American sculptor David Smith was immensely versatile: Although his work was rooted in post–World War II abstract expressionism, he remained committed to industrial materials, reflecting his apprenticeship at a Studebaker factory during his youth. Smith’s aesthetic, which steamrolled over European conventions, owed a debt to Pablo Picasso’s (like the Cubists before him, Smith saw his pieces as removed from, even opposed to, how the real world looked), and he defiantly claimed abstraction as the American artistic idiom. Brenson’s rich, authoritative biography conjures not only the man and his myth, but also the ruptures of modernity and the tensions between abstraction and representation, set against a backdrop of global change. Smith’s pervasive influence shaped artists as diverse as Louise Bourgeois, Frank Stella, and Richard Hunt. “Smith’s breakthrough works of 1951–1952 did not so much transition out of the 1930s and ’40s as erupt into the second half of the century,” Brenson writes, “projecting an entirely different speed and flow, in their inventiveness … contributing to the revolt against ‘the fixed form, the changeless and self-contained.’” For Smith, sculpture was a declaration of independence.

[Read: Picasso as sculptor]



Capital in the Twenty-First Century, by Thomas Piketty, translated by Arthur Goldhammer

Published nearly a decade ago, and a surprise best seller, Capital in the Twenty-First Century looks back in order to look forward, plumbing economic patterns from the 18th century onward and homing in on the staggering inequities that dominate our age. Piketty, an internationally acclaimed French economist and polymath, draws on social history and literary classics—European revolutions, novelists such as Jane Austen and Honoré de Balzac, the sectarian divide in the United States. He devotes special attention to the economic hierarchies that have taken root over the past 40 years. Piketty pinpoints 1980 as a pivotal year: The rise of the free-market ideologues Ronald Reagan and Margaret Thatcher cemented the power of elites and dealt a blow to the fairy tale that integrity and work would always pay off. Like Binyamin Appelbaum in his book The Economists’ Hour, Piketty credits capitalism with improving efficiency while creating economic divides that push the bounds of morality.



The Hemingses of Monticello: An American Family, by Annette Gordon-Reed

Our most erudite founder enslaved and exploited hundreds of people, but he was especially enmeshed, both publicly and privately, with one family in particular. Gordon-Reed, a law professor and historian, unspools this saga in her magisterial Pulitzer Prize–winning account of the relationships between the Hemingses and Thomas Jefferson, virtuoso of the revolutionary generation. In Gordon-Reed’s vibrant, judicious telling, Elizabeth, an enslaved person and the Hemings matriarch, was the fulcrum of her children’s fates and fortunes, ensuring their survival from the moment she was brought to Monticello after being inherited by Jefferson’s wife, Martha Wayles Jefferson. Elizabeth’s daughter Sally, Martha’s half-sister, accompanied the future president to Paris in the 1780s, where she became pregnant. In France, Sally was legally free; she agreed to return to bondage in Virginia after Jefferson promised to liberate their future children when each turned 21, an oath he only partially honored. “Hemings seized her moment and used the knowledge of her rights to make a decision based upon what she thought was best for her as a woman, family member, and a potential mother in her specific circumstances,” the author writes. Gordon-Reed layers her book with meticulous research and revelatory anecdotes, exposing how Jefferson’s life is inextricable from the Hemingses’ just as America’s history is inextricable from slavery.

[Read: Black America’s neglected origin stories]



Democracy’s Data: The Hidden Stories in the U.S. Census and How to Read Them, by Dan Bouk

A data analyst by training, Bouk burrows into the 1940 census, taken as the U.S. emerged from the Great Depression and the world teetered on the cusp of conflagration. His investigation starts with a casual anecdote from that year: A census taker named Selena Catalano makes a house call in Rochester, New York, to interview the matronly Nellie Oakden, Bouk’s great-grandmother. Bouk then widens his aperture by combing through archives and transcripts in other locales, looking at other lives and crafting a cultural history about how information is collected and processed. As a whole, the censuses’ data show that the explosion of cities and suburbs poses challenges to our frail—and, in some cases, outdated—political institutions. As sweeping demographic change has escalated in recent decades—a surge in immigration and a sorting of the parties between cities and suburbs for Democrats and rural counties for Republicans—so too have disputed elections and the threat of gridlock. This searching, textured inquiry illuminates how much simple population figures can teach us.



Apollo’s Angels: A History of Ballet, by Jennifer Homans

Homans’s classic, published in 2010, charts the arc of ballet from its origins in Renaissance Italian and French courts to the dawn of the 21st century, when the grace and vigor of George Balanchine and other modern masters bloomed. She examines the way ballet as an art form has intersected with political ideas over the past 500 years, encompassing the divine right of kings and the twilight of empires. She’s particularly strong on seminal figures such as Louis XIV, the “Sun King” (himself a dancer); Pyotr Tchaikovsky; Sergei Diaghilev; and Jerome Robbins, as New York rose to challenge a wounded Europe as the epicenter of dance after World War II. Twists and torques, leaps and lunges, relevé and glissade—the body is the canvas upon which the choreographer paints murals of social flux and personal epiphany. Apollo’s Angels is not just a cultural history of a single art form; it’s a prism through which to contemplate the human physique through time and space.

[Read: The death of the American dance critic]


The Song of the Cell: An Exploration of Medicine and the New Human, by Siddhartha Mukherjee

Mukherjee is an eminent oncologist and the Pulitzer Prize–winning author of the best-selling The Emperor of All Maladies and The Gene. His newest book is an expansive study of the cell—the common denominator of all life—and its dizzying range of types and functions. He studies neurons, the cells involved in reproduction, and rampant cancers, and gestures toward a future in which cell engineering could eradicate diseases and transform medicine. Cells are anything but simple structures; rather, they’re sinuous ecosystems, and they come together at a dazzling scale in the body. Blood, for example, is “a cosmos of cells. The restless ones: red blood cells … The healers: tiny platelets … The defenders, the discerners: B cells that make antibody missiles; T cells, door-to-door wanderers that can detect even the whiff of an invader.” Using that twirling prose, he braids history with science; we meet pivotal figures such as the quirky Dutch autodidact Antonie van Leeuwenhoek (who first glimpsed what he called “animalcules” through his microscope) and contemporary Nobel laureates ensconced in their labs, testing gene-editing technologies. Understanding the cell is the key to an age of personalized medicine, Mukherjee argues: Are we ready to embrace it?




The Kingdom Is the Strangest Medical Drama You’ll Ever See

The Kingdom, Lars von Trier’s 1990s miniseries about the supernatural goings-on in a Copenhagen hospital, isn’t your average medical drama. The focus isn’t on mass emergencies, rushed consultations, or code blues. Instead, the first episode features a tedious argument about who has administrative clearance to order CT scans. Also: A medical student steals a cadaver’s head, an occultist masquerading as a patient goes ghost-hunting, and two seemingly clairvoyant dishwashers offer cryptic commentary. The Kingdom is less ER than it is Scrubs by way of Twin Peaks—a hospital farce with paranormal tendencies.

By the end of the pilot, however, it’s obvious that von Trier is making more than a spooky workplace comedy. The most haunting scene is not one in which we glimpse ghosts, but one in which we learn that the hospital’s celebrated neurosurgeon botched an operation that left a young girl with catastrophic brain damage. Many such acts of malpractice and medical machismo dominate the original eight episodes. Some events are played for laughs—for instance, the chaos that ensues when one doctor tries to turn another into a zombie—but the critique is clear. Even as the show focuses on unearthly forces, it insists that the true enemies are closer at hand, walking the corridors in white coats, and occasionally gathering for frat-like “lodge” rituals. In The Kingdom, cronyism, bureaucratic inertia, and medical hubris do more harm than any ghoul.

[Read: How Twin Peaks invented modern television]

Put another way, the show offers a crash course in what the philosopher Hannah Arendt called the “banality of evil.” Von Trier might seem to be an unlikely messenger for this critique, given his previous pro-Nazi musings, but here he aims his impudence at the arrogance of a blinkered elite more concerned with, say, securing a private parking space than with their patients’ lives. Or, as one of the dishwashing prophets puts it, offering his own appraisal of the growing chaos, “It may start as stupidity but it will end as evil.” That line could very well serve as a motto for the series, which is returning after 25 years for a third and final season, titled The Kingdom Exodus. The first new episode aired on Mubi yesterday, following the rerelease of the first two seasons earlier this month; the remaining four episodes will air weekly.

The timing is auspicious, given that the series’ satire of institutional incompetence seems custom-made for our current moment. Consider the eerie prologue that precedes the original episodes: As a camera pans across a desaturated, fog-covered expanse, a voice-over solemnly informs us that these are the boggy grounds on which the hospital was built to house “the best brains in the nation and the most perfect technology.” But, we’re told, “tiny signs of fatigue are appearing in the solid, modern edifice.” The idea of a public institution built on a compromised foundation will have new resonance for American viewers, who, in the past decade, have witnessed a series of staggering systemic failures—of democracy, public health, and civic infrastructure. When, later in the first episode, an orderly announces that “the water has undermined everything,” it’s hard not to see the show’s portrayal of a sinking behemoth as a metaphor for our own national predicament.

As the series progresses, von Trier seems to relish imagining the endgame of such systemic failure. Over the second season, the hapless neurosurgery chief, Einar Moesgaard—Denmark’s answer to Leslie Nielsen—abandons his duties to practice regression therapy in the basement. The hospital’s resident mad scientist transplants a diseased organ into his own body for research purposes. Some medical staff are spending much of their time betting on ambulance drag races. Oh, and a demonic former doctor played by Udo Kier fathers a child who emerges from the womb a monstrously big and long-limbed baby, also played by Udo Kier.

The Kingdom’s documentary style—the show is shot on 16mm film, using handheld cameras—lends this Grand Guignol a certain sepia-toned realism. It’s one of the features that distinguish this series from better-known supernatural shows of the 1990s, including the original Twin Peaks (1990–91), an avowed inspiration for von Trier, and long-running hits such as Buffy the Vampire Slayer (1997–2003) and The X-Files (1993–2002, 2016–18), which leaned more heavily on an episodic, monster-of-the-week structure. By contrast, The Kingdom, as if anticipating the Dogme 95 movement von Trier would help found, locates much of its horror in the everyday. Despite its undeniable creepiness, the show is ultimately, to paraphrase Buffy’s Rupert Giles, “more normal and less … para.”  

One doesn’t need to be a fan of von Trier, a long-standing agent provocateur, or of his equally provocative cinema, to appreciate the show’s sharp assessment of incompetence carried out under the cover of occupational prestige. Even those who actively dislike his films may appreciate the show’s cheeky flourishes—the in-person cameos von Trier puts in at the episodes’ end, or the title card that gives way, like the elevator doors in The Shining, to a geyser of blood. Such gestures provide a respite from the veneer of respectability that has settled over a certain subset of “quality” TV in the years since the series’ original run.

Amid COVID-era assaults on science and public-health measures, a withering send-up of the medical profession might seem to be the last thing we need. In fact, The Kingdom is just the provocation we deserve, one that asks viewers to question not medical science, per se, but the reflexive worship of science, and the abuse of professional cachet for personal ends. The hospital’s doctors may have fancy STEM degrees, after all, but they’re also dangerously, if hilariously, oblivious, both to the human damage they cause and to the dark forces that are rising. In this sense, von Trier’s supernatural soap opera is a cautionary tale. In the world of The Kingdom, those endowed with the most power are, unfortunately, most likely to exploit it.


A Pocket-Size Time Machine

Between the two of us, my father and I have more than 50 diaries. Mine are a wealth of embarrassments: elementary-school poems that rhyme first base with corn flakes, a photo of an ex–best friend with the edges burned in some teenage rage, gushing during college about first love and infidelity, and more recently, a list of baby names that I’m relieved were never chosen. (Was I really considering Amapola?) My father’s diaries, which date back to the 1960s, are a mash-up of half-finished watercolors, to-do lists, and reflections on addiction. As humiliating and incoherent as most of these diaries are, I cannot part with them. And so they sit there, stacked in banker’s boxes in my childhood attic, collecting dust and rat poop.

My diary collection is dwarfed by Sally MacNamara Ivey’s. She has read more than 10,000 unpublished diaries and spent 35 years collecting them. She keeps nearly 1,000 in her Washington State home. With her blue-rimmed librarian glasses and wavy golden hair, she’s part archivist and part romantic, on a mission to sort, catalog, and find a forever home for all of her diaries. They’re tucked away in plastic bins in each of her closets, stacked on nightstands, and stored securely in six-foot-tall, 1,000-pound safes in her garage. “If someone robs me,” she told me, “they’re going to be very disappointed.”

Whenever MacNamara Ivey has had pocket change, it’s gone to purchasing diaries. Back in the late ’90s, when she and her husband were raising four kids with the money she brought in waiting tables and he made working at the local mill, she bought a diary on layaway for $500 (about $900 today). It was written by a woman who attended the 1893 Chicago World’s Fair. In exquisite detail, the author described the exotic events, including the devastating fire that erupted in a cold-storage building nicknamed the “Greatest Refrigerator on Earth” and the wonders of this new thing called electricity.

June 8, 1893: As dusk came on we obtained seats in front of the Administration Building and watched the surrounding buildings being lit with rows of incandescent lamps, and the dome of the Administration with its beautiful crown of electric lights. The fountains were illuminated by changing colored lights, and the Search Lights sending out their great arms of light, crossing and recrossing each other, were truly wonderful. These great rays seemed more supernatural than anything else at the Fair.

I’d always assumed that, because I was an ordinary, non–World’s Fair attending person, my journals were destined for the trash when I died or couldn’t store them anymore. But to some people, all diaries are treasures, especially the ordinary ones—not just because they are physical pieces of history, but for the same intangible reasons that I’ve been hanging on to my dad’s.

Until the 1970s, the value of ordinary people’s journals was often overlooked in Europe and North America. “They were seen as sentimental, parochial, and female,” Polly North, a diary scholar, told me. Though writing and reading diaries for pleasure were common throughout the 20th century, academics had traditionally focused on studying material from the powerful or famous. Starting in the ’60s, historians began to push for an understanding of history “from below.” Letters, oral histories, and diaries all became more common and attractive sources. As counter-histories have become fashionable, North said, the very things that used to be considered flaws of journal writing are now seen as its virtues.

To scholars, diaries are visceral firsthand accounts of history. Sure, you can gain some understanding of World War II through a textbook, but such descriptions will always be abstract compared with the in situ writing of, say, Mary Rowlands, a British teenager who recorded her time inside a bomb shelter. North has a copy of her unpublished diaries and letters, in which Rowlands describes her parents giving her cigarettes to keep her calm.

People will pay good money for some of these reflections. Starting in the late ’90s, MacNamara Ivey began selling diaries on eBay to collectors, institutions, and museums. Her most lucrative sales: a diary from a 1912 Machu Picchu visitor and one by an 1868 Missouri River traveler, which went for about $9,000 each.

Most diaries for sale online or in auction houses in America were written by a small segment of society; after all, writing and preserving a journal requires education, time, and money. When Angela Hooks began studying the diary as literature a decade ago, she did not see herself in any public collection. “Pretty much every diary that was referenced was white and British,” Hooks, who is Black, told me. Historically, she said, Black people’s diaries have been disregarded, neglected, and occasionally destroyed out of malice. This is especially true for Black women’s diaries, of which only about a dozen in the public record were written before 1900. (One of them was published in The Atlantic in 1864.)

[From the May 1864 issue: Life on the Sea Islands]

Still, Hooks has found unpublished diaries by Black writers dating back centuries, as well as one her own grandmother wrote in the 1880s. “It was amazing to see my grandmother’s handwriting,” Hooks said. Although Hooks wasn’t able to preserve that diary, she has stored and labeled each of her own—more than 80 in total. She told me that her writing “narrates the pain and sorrow, the joys and victories, the hopes and dreams of a Black woman’s life.” Hooks, like many diarists, uses writing to vent, understand herself, and grow. And research backs journaling as a tool for alleviating anxiety and handling stress.

Whereas the emotional benefits of writing in a diary have long been established, what we might gain by reading other people’s diaries is just now being discovered. In 2019, the cultural psychologist Joshua Conrad Jackson ran an experiment in which he shared journal entries between Pakistanis and Americans. After merely a week of the exchange, Americans who read Pakistani entries and Pakistanis who read American ones felt closer to one another, and held fewer negative stereotypes about the other group. The power, Jackson told me, comes from how quotidian diaries are. Unlike posts on social media, which people are almost always writing to get attention, journal entries are usually private affairs. They’re not written to provoke outrage or elicit a response. “They’re such an invaluable research tool because they give us access to the naturalistic thoughts of so many people,” Jackson said.

That power to connect is exactly what keeps diary collectors hooked. There’s an intimacy that comes with studying the distinct penmanship of another person, seeing the exact date or time they jotted down their words, and discovering what was going through their mind at that very moment. When I read my own father’s diaries, the rawness makes me cringe, cry, and laugh. It peels back the layers to reveal just how similar we really are.

[From the May 1897 issue: The deathless diary]

In 2008, MacNamara Ivey’s husband was killed in a construction accident. In her grief, she decided to escape by reading a random diary. But instead of getting lost in some exotic land, she found herself reading words that seemed to have spilled from her own mouth. The diary she’d picked up was written in 1927 by a Dallas man named John. The brown cover was falling off and the binding was loose. “Almost all the pages have a crease at the very top right-hand corner, as if John turned them down to mark where he left off,” MacNamara Ivey said.

When the diary begins, John is in the intense grip of grief after his wife and baby have both died following childbirth.

March 31st, 1927: Perhaps I should be in bed—I guess I need some rest. I’ve been lost in the sea of memory again this evening and the loneliness of life; the emptiness within my heart has been urging me to search and search for that which might heal it again. Happiness; It is indeed a word of mystery.

MacNamara Ivey saw the darkness and pain she felt reflected so perfectly in John’s writing that it was as if he’d reached across space and time to hold her hand. “It’s one of the things that saved me,” MacNamara Ivey said.

What made that diary so special was its very ordinariness—perhaps also the thing that makes it junk in anyone else’s eyes. It’s easy to find homes for the journals of famous people; the dilemma is what to do with diaries like John’s, or mine, or my father’s: the musings of everyday people, interspersed with doodles and to-do lists. Such diaries may not always change our understanding of history, but they can transport and unite us, like bridges across time.

[Read: When diary-keeping gets in the way of living]

“So many people ask me, ‘Will you take my diaries?’” MacNamara Ivey said. She has trouble saying no. “It’s like my heart just wraps around them, and I can’t let go.” But as she’s gotten older, MacNamara Ivey has started to worry about what will happen to all these “ordinary people’s” diaries when she’s not around to rescue them—or when her closets fill up. “I want them to be placed somewhere safe, with someone that appreciates them and loves them as much as I do.”

One evening this fall, I joined MacNamara Ivey and several other diary collectors from across the country on a video call. They were gathered to hash out their idea for an archive that would house not just their own diaries, but those of anyone in the U.S. The meeting began like diary-collectors anonymous, each of them listing the number of years they’ve been held hostage by their attraction to ephemera. Their collections consist of paper products that have been culled from a combination of flea markets, eBay, estate sales, and even Dumpsters.

As tightly as they’ve held on to these relics, they’re willing to give them all away to the right partner. “We would like our donations to be the seed for a place where other people can donate diaries and letters,” the group’s front man, Robert K. Elder, said in the meeting. Elder, the CEO of a nonprofit, said they’ve considered working with local libraries and historical organizations, but those places have their own collecting priorities and limited space. This group wants an open submission policy, fewer gatekeepers, and more voices than a local project could accommodate.

Europe has several such models of diary archives. The largest, The Great Diary Project, in London, is run by Polly North. North co-founded the archive in 2007, and since then has received more than 17,000 journals. “They call me the diary mama,” North told me. Though donors have the option to request that their diaries remain private until long after they’re dead, most don’t. “Their diaries cover adoption, abortion, family brawls, and still the majority allow them to be viewed,” North said. Historians and nosy people alike are thrilled.

[Read: Keeping a diary at the end of the world]

An archive specifically devoted to the diaries of everyday people is a modern idea, and one that has found support in countries such as Italy, Portugal, and Germany. That’s not to say that diarizing is a uniquely European tradition. In fact, the type of introspective journal writing we are familiar with today can be traced back more than 1,000 years to East Asia. One of the best-known diaries from that time was written by a 10th-century Japanese court lady named Sei Shonagon. Inside her “pillow-book,” she reflects on life inside the emperor’s palace, shares gossip, and records lists (under one titled “Adorable Things”: “The face of a child drawn on a melon”; under “Depressing Things”: “A dog howling in the daytime”).

These days, diaries with lists and secrets are ubiquitous; there are far too many to properly sort and preserve in just a few archives. That’s why Sally MacNamara Ivey and her team of collectors are anxious to secure a space in the United States. And it will have to be big, because they’re considering accepting other forms of autobiographical material as well, such as scrapbooks, audio recordings, and family cookbooks—all those rare, ordinary, transcendent glimpses into another person’s life.

I recently asked my father whether he would consider donating his diaries to such an archive after he’s no longer alive. Ever the pragmatist, he replied, “Sure, it would be nice if they were useful.” The likelihood that any scholar will cite my father’s entries about learning to bake cornbread or recovering from knee surgery is low. But there are a million humbler ways that his—and my—most banal thoughts might be useful. As embarrassing as my own journal entries may be, I’ve come to recognize their flaws as their power. Their unfiltered mundanity is precisely what makes them so valuable. Will my diaries alter the course of history? Unlikely. Would I still donate them? I would. Besides, we’re running out of space in our attic.


An Unequal Liberty

The conservative justices have a selective and destructive notion of “liberty.” In overruling Roe v. Wade last term, the Supreme Court found that the “liberty” explicitly protected by the Fourteenth Amendment’s due-process clause does not include freedom against forced childbearing. In contrast, in cases that concern the Constitution’s structural provisions creating and empowering the institutions of the federal government—provisions that do not mention liberty—the Court has wrapped itself in liberty to explain why it has acted to protect the liberty of a powerful few. Yet the liberty of the vulnerable many, which grows rather than recedes with the kinds of federal protections the Court has rejected, goes unremarked.

In the Supreme Court’s current term, it will hear several cases about the powers and structure of the federal government in which these competing perspectives on liberty become apparent. In one, the justices are being asked to drastically cut back on the EPA’s authority to control water pollution under the Clean Water Act. In two others, the underlying legal claims seek to end the political independence of administrative-law judges and to make the very notion of an independent federal agency, whose leading officials are protected from being fired by the president for personal or political reasons, unconstitutional.

[Linda Greenhouse: What in the world happened to the Supreme Court?]

These cases follow a line of recent decisions in which the conservative justices have deployed their understanding of the constitutional separation of powers in the service of a project to weaken and restructure the government. Last term, for example, the Court cited “separation of powers principles” in rejecting an EPA rule governing greenhouse-gas emissions from fossil-fuel-fired power plants. In taking a no-holds-barred view of the separation of powers, the conservative justices explained that their aim is to protect human liberty.

What is the nature of the liberty the conservative justices are trying to protect? It is, quite simply, the liberty of a powerful few to keep the government out of their business while the rest of the country suffers at their hands. Agency actions in the bull’s-eye of the separation-of-powers cases making their way through the federal courts—cases spurred on by the Supreme Court’s activism in this area—aim to address serious harm inflicted by entities such as payday lenders engaging in unfair lending practices, property developers destroying wetlands, and hedge-fund managers engaging in fraud. The freedom at stake in such cases includes more than the self-interested freedom to take liberties with the public interest. It includes the opportunities and capacities—the life—that can emerge when people are protected from wrongful harms inflicted by other people. Yet the conservative justices see only the former, stingier kind of liberty.

One can discern this narrow and skewed view of liberty in the Court’s recent cases embracing a requirement—which it has called the “major questions doctrine”—that Congress must speak super clearly if it wants to empower an administrative agency to address an issue of major policy importance. According to this doctrine, it is not enough for Congress to identify the kind of problem it wants an agency to address, to identify the agency charged with addressing it, and to provide general parameters for the agency’s decisions. It is not enough, the Court has said, for Congress to write a statute that provides a “plausible textual basis” for an agency’s authority to tackle a problem. Instead, Congress must speak with precision even on matters that are difficult or impossible to predict in advance but that require decisive action when they emerge. In making application of this requirement of extreme clarity turn on their subjective assessment of which rules are important enough, in the right way, the conservative justices are empowering themselves to pick and choose between the regulations they like and those they think run afoul of this arbitrary standard.

West Virginia v. EPA, the climate case from last term, presents a perfect example of this dynamic. Due to the Supreme Court’s surprising willingness to pass judgment on a rule that no longer had practical effect, the Court actually had in front of it two different EPA rules, issued several years apart, taking very different approaches to the massive and dangerous carbon output of fossil-fuel-fired power plants. The contrast in the Court’s reactions to these two rules exposes the Court’s biased perspective on government regulation and the kind of liberty that the Court is willing to protect.

The first rule was the Obama administration’s Clean Power Plan, which set emission limits based in part on shifts in electricity generation from coal-fired power plants to gas-fired plants and renewable energy sources. The second, the Affordable Clean Energy (ACE) rule issued by the Trump administration, identified but did not mandate several measures to reduce emissions by increasing the efficiency of individual coal-fired power plants. When it issued the ACE rule, the EPA also repealed the Clean Power Plan. The EPA predicted only “modest” (less than 1 percent by 2030) reductions in carbon emissions from the ACE rule; it had projected much greater reductions of 32 percent from the Clean Power Plan.

In the lower-court ruling reviewed in West Virginia, the D.C. Circuit invalidated both the ACE rule and the EPA’s repeal of the Clean Power Plan. The Supreme Court reversed the D.C. Circuit’s judgment on both counts. The Supreme Court also thought that the D.C. Circuit’s decision invalidating the repeal of the Clean Power Plan had the effect of reviving the Clean Power Plan, and thus justified the Court’s review of the Clean Power Plan as well as the ACE rule. By its own logic (according to which the invalidation of a rule’s invalidation brings the rule back into effect), the Supreme Court’s decision simultaneously spared the ACE rule and sank the Clean Power Plan.

[Kimberly Wehle: The Supreme Court’s extreme power grab]

In issuing each of these rules, the EPA considered the same legal question: Does the Clean Air Act empower the agency to require shifts in energy generation when setting emission limits for existing power plants? In the Clean Power Plan, the EPA answered yes. In the ACE rule, the EPA answered no. The statutory language governing each rule was exactly the same. The economic and environmental consequences and the political fallout of the rules—central factors in identifying a “major question”—were mirror images of each other. Under a judicial test that denies agencies’ authority to make major policy decisions in the absence of clear language from Congress, shouldn’t the challenges to each of these rules have come out the same?

The Supreme Court, however, apparently believes that a major policy question can become a minor policy question when the agency gives an answer that the conservative justices approve of. As shown in West Virginia v. EPA, ineffective regulation is an answer the conservative justices approve of.

The same goes for no regulation at all. Two past climate cases illustrate this point. In Massachusetts v. EPA, the Court rejected the EPA’s claim that it had no authority whatsoever to regulate greenhouse gases under the Clean Air Act. Three current justices (Roberts, Alito, and Thomas) joined Justice Scalia in dissent, arguing that the statutory language was unclear and that the Court should have deferred to the EPA’s denial of authority. In a later case, Utility Air Regulatory Group v. EPA, the Court rejected the EPA’s claim of authority to control greenhouse gases under a specific permitting program of the Clean Air Act. In a precursor to the major-questions doctrine as articulated in West Virginia v. EPA, the Court explained that it was looking for clear statutory language because the underlying policy question was so important.

What was the difference between these two cases? In Massachusetts, the EPA did not want to do anything about climate change. In UARG, it did. If anything, the question the EPA answered in Massachusetts—whether it had any power at all to regulate greenhouse gases under the Clean Air Act—was far more consequential than the comparatively narrow permitting question it addressed in UARG. Comparing the conservative justices’ differential treatment of these cases, it becomes plain that their unspoken premise is that an administrative agency may indeed speak authoritatively on a major policy question, and may even get judicial deference for its decision, but only if it supplies an ideologically appropriate answer—which, for the conservative dissenters in Massachusetts, meant refusing to regulate at all.

The anti-regulatory bias of the major-questions doctrine means that the liberty the justices say they are protecting reflects only one side of the debate over regulation: the side that wants less or no regulation, not the side that needs more in order to avoid harm at the hands of other people. The Supreme Court’s liberty is mostly the liberty of the elite few—the few who have enough power and resources to inflict the kind of widespread harm that the most important regulations try to address. In this vision of liberty, there is no room to consider the freedom lost when others’ freedom to harm is unfettered.

[Robinson Meyer: The Supreme Court’s EPA ruling is going to be very, very expensive]

The Court’s blindness to the kind of freedom preserved by government rules was obvious in its decision earlier this year striking down the Occupational Safety and Health Administration’s shot-or-test rule for large employers. The conservative justices shed a tear for “the lives—and health” of the “vast number of employees” who would be required to choose between vaccination against COVID-19 and testing, but declined to engage with the government’s prediction that the rule would save more than 6,500 lives and prevent hundreds of thousands of hospitalizations.

This anti-regulatory bias aligns beautifully with the deregulatory agenda of the Republican Party. However, rather than owning up to this partisan slant, the justices tell us—and maybe themselves—that the major-questions doctrine is compelled by their understanding of the constitutional separation of powers. As then-Judge Kavanaugh put it in 2017, in United States Telecom Association v. FCC, the major-questions doctrine is “a separation of powers–based presumption against the delegation of major lawmaking authority from Congress to the Executive Branch,” and the separation of powers “is the great safeguard of liberty” in our constitutional framework. The justices who have most fervently embraced the major-questions doctrine also are justices who claim to interpret the Constitution in line with the original meaning of the text at the time of ratification. A wag might ask: If, as originalists posit, the meaning of the Constitution’s structural provisions was fixed in 1789, why did it take the Court until 2022 to fully articulate, or even name, the major-questions doctrine?

The text of the structural provisions of the Constitution does not tell us that Congress may not delegate major issues to administrative agencies unless it speaks super clearly. The conservative justices have taken the spare and enigmatic structural terms of the Constitution, identifying “Legislative” and “Executive” power, and projected into them 21st-century conservative anxieties about the regulatory state. Perhaps it is not surprising that these justices have apparently convinced themselves that the white, male, elite, educated, and propertied members of the founding generation, whom they frequently channel in pondering the Constitution’s mysteries, thought like they do about the government’s role in protecting us from harm at other people’s hands. The people left out of the constitutional process—women, people of color, people without property, people who were then property in the eyes of the law—probably had quite a different view about the need for an active government and the warrant for a capacious view of liberty. The conservative justices’ radical approach to the separation of powers effectively reenacts the original exclusion of these groups from our constitutional framework.


The Pandemic Exposed the Inequality of American Motherhood

In the early days of the pandemic, the outlook for women seemed bleak. Experts predicted that, faced with an uncertain economy in the midst of a public-health crisis, women would have fewer kids, accelerating America’s long-running drop in fertility. For those who already had children, researchers foresaw plunging employment. Schools and day cares were closing. Family members couldn’t come help with child care. It seemed clear that mothers would take on the majority of this additional labor, forcing many to scale back on or opt out of paid work entirely. American family life would be sent back to the 1950s.

Fortunately, these dire predictions never quite materialized. Women did shoulder most child care but, broadly speaking, they weren’t pushed out of the labor market at higher rates than men. Likewise, births did not crater. Instead they rose slightly in 2021. U.S.-born women had about 46,000 more babies than they would have if COVID had never hit, according to one recent analysis.

But these broad figures, just like the sweeping prognostications that preceded them, flatten a more complex narrative about what the pandemic was like for women. The virus and its economic fallout affected women in the top tiers of society in very different ways than it did women at the bottom. Those without advanced degrees and high-paying jobs were pressed out of the labor force in far greater numbers, and experienced a slower recovery, than their more privileged counterparts. Similarly, a baby bump among highly educated women overshadowed a baby bust among their peers without a degree. Reducing all of these experiences to a single thread is impossible. There isn’t just one story about women during the pandemic; there are many.

[Read: The great pandemic baby bump]

For college-educated women, the tale is framed by the safeguards of privilege. Like their male peers, they were far less likely to lose their job than Americans without a degree. Though some stopped working in the spring of 2020, the group’s employment levels recovered pretty swiftly, according to a recent paper by Claudia Goldin, a Harvard economics professor. And although having children to care for did drive women out of the labor force, the effect seemed less pronounced among those who’d gone to college, Goldin found. In the spring of 2021, the percentage of highly educated mothers with at least one kid under 5 who were working was actually higher than it had been two years prior. And these women kept having kids. Their birth rate remained steady at the start of the pandemic, before increasing substantially in January of 2021 and jumping about 6 percent relative to their pre-COVID trend by year’s end.

The reasons for this resilience are fairly intuitive. Educated women—and, very likely, their partners—worked from home at higher rates, which not only kept them safer from infection and minimized the chances that they’d lose their job but also allowed them to better juggle care and work. In Goldin’s words, advanced schooling “inoculated” workers from the economic impact of the coronavirus. Thanks to generous government-aid programs and a booming stock market, savings and net wealth improved during the pandemic—in all income groups, but especially among more affluent households. This extra money may help explain the baby boom in the laptop class, Hannes Schwandt, one of the authors of the fertility paper, told me. For them, the coronavirus actually offered a good opportunity to start a family.

The narrative followed a much different arc for low-income women and those who hadn’t gone to college. The COVID-related drops in employment were far steeper and more persistent among those with less education, according to Goldin’s analysis. Mothers without a degree, particularly those with very young kids, had an especially sluggish recovery; they continued to lose jobs in 2021, even as employment exceeded pre-pandemic levels for highly skilled mothers of toddlers. Black women were also hit particularly hard. Other research has found similar disparities. For example, a preliminary version of a new study showed that school closures did push some women out of the labor force, but mostly those working low-paying jobs. And though some couples fell into a more traditional male-breadwinner/female-caretaker pattern in 2020, the share of dual earners remains below its pre-COVID rate only among families in which neither partner has a four-year degree. Fertility was also marked by such inequality. Births among women without a college degree declined during the pandemic and never fully recovered. Fertility among Black women also dipped in early 2021 and never entirely bounced back. For those women, there was actually a COVID baby bust.

Poor and less-educated women were not as well insulated from the pandemic’s blows. They left the labor force in greater numbers because they were much more likely to work in service jobs highly susceptible to layoffs, and more likely to face challenges with child care: Most low-income mothers couldn’t work remotely and care for their kids during the workday, and they had less saved to help them afford the ballooning prices of babysitting or day care, an industry that is still in shambles. Under circumstances like these, it’s not surprising that fewer poor women had kids.

[Read: America’s child-care equilibrium has shattered]

Though the fates of privileged and less-advantaged women are starkly different, neither of these stories is simple. Many women with relatively high incomes and higher education still faced the immense stress of working from home with kids underfoot. And although they did not leave the labor market in droves as expected, their careers may have suffered in less obvious ways. The strains of homeschooling while working even seem to show up in fertility data: The baby boom was driven largely by first births. Among women who already had two or more kids, births fell, though they largely bounced back by the end of 2021. By the same token, though outcomes were worse for poor and less-educated women, they still fell short of the catastrophic predictions of 2020. The fact that the baby bust wasn’t more pronounced among women who hadn’t gone to college may be a testament to the aid the government provided through much of the pandemic. And the slower employment recovery among poor mothers might be partly a result of lower-income women having more choices: The unusually strong labor market for workers without degrees, or the various forms of COVID aid, may have made surviving on one income easier and allowed mothers without promising job opportunities or quality child care to stay at home with their kids for a while.

Of course, these explanations reveal inequality of a more insidious sort: Whether working motherhood can be characterized as a privilege or a necessity, a possibility or a burden, is largely dependent on the kinds of work and child care available to you. Many American mothers want to work; others don’t but have to anyway. COVID conditions may have given some poorer women new flexibility, but these circumstances didn’t give them the kinds of opportunities their well-off peers fought so hard to keep. Ultimately, the only coherent story we have to tell about the impact of the pandemic on women is one we’ve heard before: It largely reflected, and in some ways deepened, the inequality long embedded in American motherhood.


Political Hobbyism Has Entered the Workplace

In a 2005 episode of The Office, Michael Scott, the regional manager, requires his employees to choose an upside-down index card from a tray and place it on their forehead. The cards bear a racial or ethnic label—Black, Jewish, Italian, and so on—and Michael tells the employees to treat one another according to the label listed on the card and to “stir the melting pot” by playing to racial stereotypes. The scene, which ends with Michael getting slapped in the face, mocks corporate America’s ham-handed approach to diversity training. Back in 2005, almost no one saw the C-suite or the human-resources office as an engine of progressive change. Indeed, the idea that workers would look to their employers for leadership on any delicate social or political matters seemed risible.

Yet today, a new status quo has emerged.

I am a political scientist and am currently researching how business leaders and their companies shape American politics. But while interviewing dozens of executives from across the country, I could not help but notice the ways that American politics is also reshaping corporate life.

[Conor Friedersdorf: Should the professional be political?]

Donald Trump’s presidency led companies to start regularly issuing political statements on major developments in the news. In 2020, the murder of George Floyd, and the subsequent protest movement, prompted companies not only to incorporate more diversity, equity, and inclusion (DEI) initiatives into the workplace, but also to adopt “anti-racism” messaging, in which merely showing tolerance isn’t enough. Participants are urged to actively promote anti-racist policy goals—rendering these sessions far more overtly political than their predecessors of the 1990s and early 2000s.

Although political chitchat has always been part of office culture, the volume of the discourse and the extent to which it is coming from management are departures from the past. As a senior manager at a New York insurance firm recently told me, “I probably get just as many emails” from the company’s executives “about social-justice or environmental stuff as I do about how the company is doing. And that’s just not how it was … That’s a major shift that’s only happened in the last two or three years.” Bosses across the country, particularly in white-collar workplaces, are pumping out tweets and press releases about the midterm election, abortion rights, and the war in Ukraine. They are hosting mandatory trainings and workshops that come uncomfortably close to the TV parody.

But if anything, the new normal likely hinders the cause of diversity and tolerance, while producing no other worthy social change. Mandatory workshops on anti-racism and LGBTQ rights are about as effective at eliminating bias as you’d expect if they were facilitated by someone from The Office. Political messages issued by corporations are intended to sound topical, progressive, and genuine, but come across to many listeners as tone-deaf, performative, and alienating. Companies, I think, should be politically and civically engaged, but they’re going about it all wrong.

At many white-collar jobs, workers have extra time on their hands. Social-media scrolling, gossip, needless group meetings, “quiet quitting”—the inefficiency of office culture is old news. But politics appears to be sucking up more of that time now than in the past.

Three factors are at play. First, the white-collar workforce has undergone a partisan realignment. Workers with four-year degrees now vote overwhelmingly for Democrats. Democratic voters now trust business more than Republican voters do. Democratic workers are enthusiastic about businesses taking public stands on political priorities. CEOs themselves, who tended to be somewhat apolitical on social issues before Trump’s 2016 victory, have in some cases made headlines by becoming activists. And they have hired vice presidents and consultants who keep the company’s social mission high on the agenda. In short, white-collar businesses have become Democratic constituencies.

[Read: The tech industry joins the political fray]

Second, the long-running decline of civic life in America, likely exacerbated by COVID, means that many Americans who are cognitively engaged in politics lack any social organization—other than the office—through which they can channel their political energy. Many people who consider themselves political junkies don’t volunteer for candidates’ campaigns or advocacy groups. They aren’t active members of unions or religious communities or neighborhood associations.

CEOs are complicit in turning the office into a venue for political discourse. A real-estate developer in Georgia recently told me about how he gathers his team, including maintenance personnel as well as data analysts. (Because I conducted these interviews in my capacity as a political scientist, I am not identifying my interviewees by name, in keeping with ethics standards in social-science research.) They meet on Zoom, pick an issue in the news, and talk it through. These conversations are an attempt to push back against political polarization. “I [want] all of us to talk to each other as Americans and fellow citizens and being part of the same team,” the developer said. He described these meetings as therapy sessions through which he, the boss, in his own small way, can try to heal America’s political wounds.  

The third factor behind the politicization of the workplace is a cultural shift in corporate leadership and in employees’ expectations of their managers. If workers come to the office with low morale because of an election loss or Supreme Court decision, today’s bosses are not going to yell at them to buck up and get back to work. Bosses have learned to be empathetic leaders who need to care about what workers care about.

Since the Great Recession, the conventional wisdom among corporate recruiters has been that workers, especially young workers, want bosses who have a sense of mission and whose political positions align with their own views. In this account, socially conscious people don’t want to work for a company that cares only about money or that contracts with nasty clients or that donates to members of Congress who support the wrong positions. Workers know that companies can exert pressure on politicians. The company can have a bigger impact than the workers can have alone through their personal Facebook posts.

And yet politicizing the workplace—either to meet employees’ demands or to satisfy the CEO’s political goals—has obvious pitfalls. Not every worker or boss is good at respectful dialogue about political matters. A conservative executive in Texas told me this summer that he had to buy out his even more conservative business partner because the partner had embraced COVID conspiracy theories and engaged the staff in politically aggressive, emotionally obtuse conversations.

[Juliette Kayyem: Never go back to the office]

More fundamentally, the boss-employee relationship makes the workplace a difficult setting for an open conversation about politics. An office is not a community of equals. When a boss injects politics into a conversation, many employees feel compelled to nod along, which gives the boss a false impression that everyone feels the same way.

Feigning agreement with the boss extends beyond explicit political conversations and into politics-adjacent subjects such as diversity, equity, and inclusion. One executive told me he sees diversity differently from how his employer sees it. “We just like diversity in the way people look,” he said of his company, “not diversity in the way people think.” The firm, he argued, hires people from across the racial and ethnic spectrum, but they come from a narrow set of universities and tend to hold the same liberal viewpoints.

This man, a Republican, tends to keep his opinions to himself, and for good reason. In a 2021 Knight Foundation survey that I helped design, 57 percent of Democrats (and a much higher proportion of Black and Latino Democrats) said private employers should prohibit workers from expressing “political views that are offensive to some.” Most Republicans disagreed. Speaking honestly at a DEI training or in a political discussion is difficult if most of your co-workers think your views not only are wrong but perhaps should be banned from the office.

Some forms of political engagement at the office have distinct and understandable goals. Workers want to have a say in how the firm does business; employers want to show that they care about the demands of customers and staff. But some of today’s political office culture does not even pretend to be strategic. Workers might gather around a TV screen to commiserate during major news events or fish for approval by sharing news articles in the employee Slack channel. Such activity functions as group therapy during political ups and downs. It does not change election results. It is pure political hobbyism—a performative form of civic engagement that has become the white-collar set’s preferred approach to public affairs.

Outside white-collar office culture, different norms prevail. In my interviews with industrialists and retailers, a wildly different perspective is evident. “You are talking about a problem that is just utterly foreign to my little world,” an executive who oversees a chain of beauty salons told me recently. He describes his firm as a “working-class, southern, multicultural company” with an entirely female retail staff. He views political talk at work as a frivolous distraction.

Even so, this executive has a clear vision of his company’s civic mission: offering a path into the middle class for people without strong educational credentials. “I feel very good that there are 150 women, most of whom come from crappy backgrounds, who have a shot at owning a home, buying a car, going on vacation.” His retail employees—none of whom has a college degree, he says—earn up to $90,000 a year. He thinks they are “likely to become Republicans” because their foremost concern is about money and taxes. “Our workers are tied to their own productivity. And that clears away an awful lot of crap.”

Of course, I do not know whether his employees feel the way he feels. But I understand why this executive looks on bemusedly at his post-materialist big-city compatriots. How many management consultants, tech engineers, corporate attorneys, or investment bankers can argue so forthrightly that their own firms are making other people’s lives better?

I am deeply skeptical of what the current wave of white-collar political hobbyism will accomplish, especially when so many corporate pronouncements are clearly hot air. (Consider those companies that very briefly, and very loudly, swore off donations to politicians who voted against certifying the 2020 election, and then very quickly, and very quietly, went right back to contributing to them.) The shame is that businesses and their employees can involve themselves productively in politics. They can invest time in community organizations and business organizations that have concrete goals and strategies. Rather than playing to would-be activists on Slack, business leaders can get involved (and try to involve employees) in long-term engagement on education, housing, transit, and other issues central to a thriving economy. They can encourage diversity and mutual respect by inviting workers to collaborate on common goals, rather than through stilted training exercises better suited to The Office.

How has white-collar office culture become so political? Ultimately, through the good intentions of people who recognize that all is not well with America today. Channel those good intentions into strategic civic engagement, and a company can make a difference. But if, in the end, the goal is merely to cultivate a mild sense of political camaraderie so that a certain class of partisan employees can feel better about themselves, then the virtuous email from the CEO and a monthly guest speaker introduced by the VP for DEI will probably do the trick just fine.


The Next Afghan-Refugee Crisis Is Right Here in the U.S.

The night before the midterm elections, Jake Sullivan, the president’s national security adviser, addressed a packed room in the basement of the Council on Foreign Relations in New York City. The topic was billed as “Common Sense and Strategy in Foreign Policy.” For an hour, Sullivan held forth on a host of topics, including Ukraine, Taiwan, digital clean energy, and Iran. For the last 15 minutes, he took questions. When this wide-ranging tour of American foreign policy concluded, I felt as though I’d witnessed an episode of mass amnesia: Afghanistan wasn’t mentioned once.

America has a long, disastrous history of forgetting when it comes to Afghanistan. Abandoning the country to Islamic radicals in the 1990s after its war with the Soviets; deprioritizing our own war after 9/11 so we could pivot to Iraq—this willful forgetting has, again and again, bred disaster. This played out most recently last year, when the collapse of the Afghan government surprised many senior officials in the U.S. government. Today, this pattern of forgetting is poised to repeat. Without congressional action, the tens of thousands of Afghans we evacuated to the United States may be deported in the coming year, and very few in Washington seem to be talking about it. The cost of this apathy will be a second Afghan evacuation, equally disastrous, this time played out in reverse, with our allies shipped back to the Taliban-controlled Afghanistan they fled.

[From the September 2022 issue: I smuggled my laptop past the Taliban so I could write this story]

To understand how we arrived at this looming crisis, we have to go back to August 23, 2021, when, during the withdrawal from Kabul, the Biden administration authorized the use of humanitarian parole to temporarily expedite the entry of Afghans into the United States. The preexisting Special Immigrant Visa program—which can take three years from application to approval—had proved impracticably onerous, so humanitarian parole filled the gap and eventually enabled the administration to evacuate approximately 80,000 Afghans to the United States.

Although humanitarian parole accelerated their processing, the program didn’t provide resettlement services or a clear path to long-term residency for the new arrivals. Afghans have struggled with resettlement and with securing the necessary documentation to work or attend school, as well as access to a host of other necessities. And humanitarian parole extends for only two years. Those tens of thousands of Afghans we evacuated have been living under a cloud of uncertainty, and they will soon be subject to deportation unless Congress acts by adjusting their status. The Afghan Adjustment Act—a bipartisan, bicameral piece of legislation introduced this past August—aims to do just that. Astonishingly, it’s struggling to pass.

“It’s important we get this done,” Senator Chris Coons of Delaware, one of the bill’s co-sponsors, told me. “Our Afghan allies who fought alongside us and those who fled the Taliban deserve better than having to live with the uncertainty of whether they’ll be able to stay in the United States.” Although the bill doesn’t face opposition from Democrats, it does face competing legislative priorities within a caucus that isn’t eager to revisit the debacle of the Afghan withdrawal.

Republicans, who will soon hold a majority in the House of Representatives, are more eager to revisit the events of August 2021. They have signaled plans to hold hearings on the subject in the next Congress, which would frame the Biden administration’s efforts in Afghanistan in an unfavorable light, with a likely focus on the human cost of the collapse in Kabul and our ad hoc effort to evacuate our allies. It’s difficult to imagine Republicans holding those hearings while they are simultaneously deporting those very same allies. Nevertheless, Republican support for the Afghan Adjustment Act has proved uneven.

[Read: Afghanistan did not have to turn out this way]

Lindsey Graham of South Carolina is one of three Republican co-sponsors of the bill in the Senate. Within his party, legislation that expands immigration protections is generally a tough sell. “The Afghan Adjustment Act has received stiff opposition and legitimate criticism,” he told me. “However, I hope the critics understand that American service members who served in Afghanistan feel honor-bound to help their Afghan allies. They are right. The problems with the bill must be addressed, and I believe we can do that.” Graham remains optimistic that the bill will pass. “To turn our backs on this problem and those who provided essential support to the United States would be a stain on our honor and haunt us for generations. Finding a compromise that maintains our honor and assures our national security can and must be done.”

Outside Congress, the greatest proponents of the Afghan Adjustment Act have been, not surprisingly, veterans’ groups. Among them is With Honor Action, a nonprofit that supports the For Country Caucus, a bipartisan coalition of military-veteran legislators in the House. The co-founder of With Honor, Rye Barcott, a Marine Corps veteran, believes the Afghan Adjustment Act is likely to be blocked by both parties’ respective brands of dysfunction. “Political polarization contributes to and drives our amnesia. Congress can’t get out of its own way. There’s no special interest here, and so there’s no one who has this as their top legislative priority,” he told me. “The adults in the room all realize this is the right thing to do, but it’s not getting done. This is going to be the next betrayal, not only for the Afghans, but also for those who served alongside them.”

Although Barcott believes passage of the Afghan Adjustment Act is a moral imperative, he argues that it’s also in our “enlightened self-interest.” Last August, an International Rescue Committee report projected that Afghans evacuated to the U.S. would contribute nearly $200 million in taxes and $1.4 billion in earnings in their first year of work.

Representatives Peter Meijer, Seth Moulton, and Jason Crow—all co-sponsors of the Afghan Adjustment Act and members of the For Country Caucus—were among more than two dozen lawmakers who signed a pair of letters in the spring and early summer of 2021 warning of disaster in Afghanistan if protocols weren’t established to expedite the evacuation of our Afghan allies. Many of those same lawmakers are, right now, sounding a similar alarm, warning of a second crisis, this time entirely of our own making, if the status of our Afghan allies isn’t adjusted. But as the legislative agenda for the end of the year locks into place, their warnings seem to be going largely unheeded in Congress.  

“Just like with the first withdrawal, no one believes this could actually happen,” Barcott said. “People can’t imagine that we’re going to take all these Afghans, who we evacuated at great risk and expense, and put them on planes back to Afghanistan. But if Congress doesn’t act, that’s exactly what could happen.”

A lack of foresight plagued our war in Afghanistan from its start. Now it’s plaguing the war’s aftermath. “You’ve got to wonder,” Barcott said. “What will it take for people to care?”


Never Trump Means Never

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Never Trump is—still—a movement that is about more than just one man. It stands in opposition to everything Donald Trump has done to American civic life, and rejects those who would wear his mantle.


Not Ever

In one of the most appalling appropriations of a political banner in years—or at least since Trump decided in 2012, after years of changing party registrations, finally to settle on calling himself a Republican—some of the conservatives hoping to salvage the GOP’s fortune after the 2022 midterms are trying to seize and redefine the term Never Trump to mean a rejection of “only Trump, and no other Republicans who are like him.” This is important not as some internecine fight among the right but because it is a preview of how many Republicans (and especially those coalescing around Florida Governor Ron DeSantis) intend to rehabilitate the GOP brand in 2024.

The strategy will be to make Trump the sin-eater for the entire party, designating him as the GOP’s sole problem, and then rejecting him—and only him. The goal will be to scrub away the stain of having accommodated Trump while pretending that the Republican Party is no longer an extension of his warped and antidemocratic views. This will require an extraordinary suspension of disbelief and an expenditure of gigawatts of political energy on the pretense that the past seven years or so didn’t happen—or didn’t happen the way we remember them, or happened but don’t matter because Trump, having escaped Elba to contest the primaries, will finally be sent to St. Helena after his inevitable defeat.

This will be the new Republican line, and it is nonsense.

As one of the original Never Trumpers—an appellation adopted by disaffected Republicans and conservatives who swore never to support Trump—I think I have a pretty firm handle on what the term means. I do not speak for every Never Trumper, but I am confident that virtually all of us would affirm that we were not just making a choice about a candidate but opposing the movement that coagulated around Trump. We did this not only by expressing disapproval—which is easy—but by actively voting for his Democratic challengers, which for some of us was harder to do but was part of actually “stopping” Trump. In this, we became a movement ourselves. We were not merely choosing one flavor of ice cream over another; we were examining our own beliefs, and then advocating for others to join us in defending those ideas in the public square.

We knew that Trump represented an existential threat to everything that we and millions of other Americans, regardless of party, cherished. He was an avowed enemy of the rule of law, cared nothing about fidelity to the Constitution, and could never be a responsible steward of the awesome powers of the presidency. (It is no accident that the first Never Trumpers were heavily concentrated among those of us with connections to national security.) We were certain that Trump would bring racism, misogyny, and religious bigotry to the White House. And we were right.

But Trump exceeded our worst fears. We expected him to bring a claque of opportunists and various other mooks and goons with him to Washington, but we overestimated the ability of the GOP’s immune system to fight off a complete surrender to Trump’s parasitical capture of the party. We appreciated the threat of Trump, but we were surprised by the spread of Trumpism—the political movement that arose as a malignant mass incarnation of Trump’s personality, based on racism, nativism, isolationism, the celebration of ignorance, and a will to power that was innately hostile to American institutions. Trumpism is now the only real animating force in Republican politics; indeed, DeSantis, the great GOP hope, is so much a Trump sycophant that he has even learned to stand and gesture like Trump.

The idea that Never Trump means more than the rejection of one vulgar and ignorant man—that it also means Never Trumpism—infuriates a lot of people on the right. (The folks over at National Review, some of whom have apparently jumped on the DeSantis bandwagon, have seemed particularly agitated in the past few days.) The immediate circumstance that precipitated all this online whining about the Never Trumpers and generated the sweaty attempts to seize their mantle was, of course, Trump’s dinner this weekend with an anti-Semite and a white supremacist. Top Republicans who should be desperate to scour the stink of Trumpism off the GOP but who fear Trump and his base once again went weak in the knees. Most stayed quiet; others employed careful circumlocutions. Mike Pence said Trump should “apologize” for the dinner, as if it were a faux pas. Senator John Thune blamed Trump’s staff—always a handy dodge in Washington.

Only a very few were specific and unequivocal. Senate Minority Leader Mitch McConnell finally weighed in today with a shot at Trump’s ambitions, saying that “there is no room in the Republican Party for anti-Semitism or white supremacy,” and that “anyone meeting with people advocating that point of view, in my judgment, are highly unlikely to ever be elected president of the United States.” But Senator Bill Cassidy was more direct: “President Trump hosting racist antisemites for dinner encourages other racist antisemites,” he tweeted. “These attitudes are immoral and should not be entertained. This is not the Republican Party.” Cassidy’s words are admirably clear, but while he argues that such attitudes are not the Republican Party, they are, in fact, espoused by people widely tolerated by the base of the Republican Party—starting right at the top with Donald Trump.

The Republicans know they have a problem. Many of them seem to believe their only recourse now is to say that they were all Never Trumpers in the hope that voters will somehow draw an unwarranted distinction between Trump and the party he has captured from top to bottom. But those of us who said “Never Trump” years ago—and meant it—know the difference.


Today’s News
  1. A bipartisan group of congressional leaders said it plans to pass legislation to avert a nationwide rail strike.
  2. The U.S. won its World Cup match against Iran, 1–0, advancing the American team to the knockout round of 16.
  3. Several states in the American South are at risk of flooding from severe storms today and tomorrow.



Evening Read

I’m Scared of My Baby Monitor

By Damon Beres

You can now know everything about your baby at all times. An expectant parent of a certain type—cash-flush and availed of benzodiazepine, or maybe just fretful—will be dizzied by the options.

Consider the $300 “dream sock,” for sale again after a hiccup with the FDA, which latches on to your infant and beams numbers to your smartphone—numbers such as “110 beats per minute” observed from baby’s little heart, and “97% average O2” for the air inhaled by baby’s little lungs and distributed to baby’s little bloodstream. You might rent the Snoo, a popular bassinet that shimmies when your baby makes a peep, with various intensities depending on the nature of that peep. It transmits further health-tracking numbers to your mobile device; Snoo was rocking my child with Level 3 vibrations for 25 minutes last night, you will think to yourself, and seriously too. Many parents will use an app to parse the color of poop (you can generally rest easy, even when it’s green) and a smart thermometer that remains affixed below the armpit for up to 24 hours.

Read the full article.


Culture Break

Read. “Father of Clarity,” a poem by Tim Z. Hernandez.

“Each day the same now: / I wake her up—she’s a woman / in the making, and me, / I’m still a boy, given this responsibility / of another …”

Watch. Big Mouth, on Netflix, is one of several modern animated sitcoms showcasing a well-adjusted father.

Play our daily crossword.


P.S.

Some folks on Twitter were surprised to find out that I am an avid computer gamer. (These must be new followers; I am a complete nerd about posting pictures of my personalized, lit-up gaming rig and I regularly opine about my favorite games.) But I understand the surprise: I am just shy of turning 62, and most people see me online as the stuffy, self-important curmudgeon who is constantly lecturing people about airline etiquette—which, I can’t lie, is also part of my personality.

I find computer games engrossing and relaxing, and though they have a reputation for eating up time, playing them clears my mind enough to get back to work. I am a fan of historical strategy simulations, postapocalyptic adventures such as the Fallout series—although I hope those remain science fiction—and the so-called world builders, where the player has to navigate decisions aimed at sustaining virtual communities. The one concession I make to my age and personality is that I abhor online multiplayer games; otherwise, you’ll find me at my desk refighting the Battle of Kursk, evading mutants in the wasteland, or deliberating how close I should put a lumber mill to a school. You’re never too old to have some meaningless fun.

— Tom

Isabel Fattal contributed to this newsletter.


Please Look at My Metal Credit Card

Although it may be difficult to imagine a universe in which George Clooney needs a little help charming women, that’s the case in Up in the Air, the 2009 movie in which he plays a frequent-flying HR consultant in charge of executing mass layoffs. In a Dallas hotel bar, he flirts with a comely business traveler played by Vera Farmiga, needling her over her preferred rental-car loyalty program; soon, the two are comparing mileage goals and flinging their respective stacks of bonus-rewards credit cards down next to their drinks. Eventually, Clooney seals the deal with a rare American Airlines ConciergeKey card, rendered in matte graphite among all the shiny plastic. Farmiga picks it up, complimenting its weightiness. “This is pretty fucking sexy,” she marvels. They retire to his hotel room.

During the 2000s, a metal credit card could have that effect on a person. In 2004, American Express swapped plastic for titanium in its invite-only, unlimited-spending Centurion Card, and one of the most successful credit-card marketing gambits in banking history was born—or, perhaps more accurately, was finally realized. After its plastic introduction in 1999, the Centurion Card—or the Black Card, in popular parlance—became a status symbol known far outside its rarefied clientele, largely thanks to countless namechecks in rap hits by artists including Lil Kim, Jay-Z, Lil Wayne, and Kanye West. Within just a few years, the card’s legend had grown to such mythic proportions—aided by the fact that almost no one had ever seen one in person—that it was somehow widely believed to be made of metal already.

In the time since, metal credit cards have become not only a reality, but a mundanity. Once limited to products like the Centurion that require proof of high net worth and a history of lavish spending, the cards are now available to pretty much anyone with passable credit. Even Venmo, the cash-swapping app, is enticing people to use their balance like a bank account with a metal debit card in pink or black. As a marketing play, the cards are brilliant. But they’re also an object lesson in the life cycle of the consumer status symbol. When everyone’s special, no one is.

Metal credit cards may have begun as markers of extreme wealth, but they were spawned by something far more pedestrian: consumer-loyalty programs. Frequent-flier miles are the most famous of these programs, but they’re everywhere now—hotels, clothing brands, electronics retailers, fast-food chains. They’re especially popular at the top of the glutted credit-card market, where people with good credit and a relatively high income need to be tempted to open and use new cards, even though doing so tends to be expensive and annoying. Promises of free plane tickets, iPhones, and points-accrual multipliers on dining and gas purchases can be enticing perks, but after a while, all the benefits of opening a new card can start to sound the same. Credit-card companies have tried to come up with different strategies to stand out, especially because these usual perks tend not to be part of the everyday user experience; you might cash in for a free plane ticket or an iPhone upgrade once every year or two, but those eventualities are hardly a constant reminder to pluck that card out of your wallet over all the others.

Enter metal. Many people in the credit-card industry point to 2016 as the year that metal cards went wild, thanks to the launch of the Chase Sapphire Reserve Card. The card was itself an upgrade from an existing—merely Preferred—product, and it came with a hefty $450 annual fee at the time of launch in addition to its promises of fast-accruing, easily redeemable points. The shopping public couldn’t get enough of it, according to Nick Ewen, the director of content at the travel-rewards website The Points Guy. So many people applied (Ewen among them) that Chase ran out of metal and had to mail temporary plastic cards. Ewen said that although he believed much of the card’s appeal was in the big bonus-points offer for new accounts and the company’s well-liked rewards program, the metal card wasn’t exactly unrelated to its success. “At the time, it was still enough of a novelty that when you would go and pay for something with the Chase Sapphire Reserve, you would get comments from the waiter or the cashier,” he told me. Elizabeth Crosta, a vice president of communications at American Express, told me that this is referred to in the industry as the plunk factor—a heavier card is more satisfying to plunk down on the table after dinner. It lands with more authority.

That kind of response to a card launch turned heads, Ewen said, and it didn’t take long before most issuers’ fanciest publicly available cards were metal. And then their next-fanciest. American Express, which had long kept metal cards for Centurion high rollers, began issuing less exclusive metal Platinum Cards in early 2017; in 2018, its Gold Cards also made the switch, with a limited-edition rose-gold option for early adopters. For a long time, the credit-card industry looked at nonfunctional tweaks to the card itself—a college or sports-team logo, for example—primarily as a way to market mid-tier products to people with mediocre credit. When the Chase card became a massive hit, it was suddenly clear that affluent people, too, are delighted by the prospect of a special little card.

Unlike team-logo cards, though, metal cards aren’t meant to signal fandom or allegiance—they’re meant to signal status, and not just of the airline variety. Keeping meticulous track of points balances and bonus offers can pay real dividends when it’s time to redeem those rewards, but a card needs more than that to entice people whose hobbies don’t typically involve spreadsheets. When cleverly branded, credit cards have always made for status symbols so potent that they easily tip over into the absurd, or even parodic—the costume designer Lizzy Gardiner wore a dress made out of gold American Express cards to the 1995 Academy Awards. Lots of people are willing to shell out for things that project wealth and discernment to others. This is the principle on which the entire high-end-fashion industry is based, and metal cards are maybe most accurately described not as a financial tool, but as a luxury accessory.

In the fashion industry, the trendiest pieces—those that mark their owners most clearly as stylish and well connected—have a familiar trajectory. Eventually, a brand starts pumping out more and more of a once-rare item to capitalize on frenzied demand. Other designers riff on the things that made the design so successful in the first place. Less expensive brands and counterfeiters flood the market with knockoffs and fakes. Before you know it, the look is everywhere, and it doesn’t have much sociocultural meaning at all anymore. Those in the know are on to the next thing. Ewen said that he’s sick of metal credit cards: They’re now so ubiquitous that they’re not a reliable indicator of the most rewards-intense cards, they can’t be cut up when you get a replacement card, and carrying several of them at once can, in his experience, set off airport metal detectors. Not great for a frequent flier.

But it seems like metal cards aren’t so much falling out of favor as becoming the new normal, and credit cards, as physical objects, are likely to become more like luxury accessories, not less. Most recently, the credit industry has embraced a tactic beloved by the fashion industry: the drop, in which a small number of limited-edition (and therefore special, if not always inherently so) items are made available to the lucky few who are able to snap them up. Earlier this year, American Express cut up one of Delta’s decommissioned Boeing 747s and used the metal to fabricate a series of cards available only to clients with the company’s highest-tier Delta rewards card, which costs $550 a year. The cards, which bore the image of the retired plane, were supposed to be available for sign-up for about seven weeks. They were gone much faster.
