Because these details tend to get lost in the froth, let’s pause to note two extraordinary steps Donald Trump took in the past 24 hours.
One of them is literally unprecedented; the other is a sharp departure from modern norms. I’m not aware of any member of the governing GOP majority objecting to either of them.
(1) Declassifying FISA warrants and messages from FBI agents. Presumably because he believes these messages might embarrass people he regards as enemies, on Monday Trump ordered the Director of National Intelligence and the Department of Justice (which includes the FBI) to make public “without redaction” a variety of text messages, reports, and even FISA warrants all involved in the Russian-influence probe.
Why did this matter? Because the FISA warrants, the FBI reports, and these other documents presumably contain details on how the government knows what it knows. Who its sources are, what informants and moles it has developed, which surveillance systems work, which enemy codes have been broken. Recall the familiar (though disputed and even disproved) claim that in World War II Winston Churchill let the Luftwaffe bombing of Coventry proceed — rather than evacuate the city, which could have tipped off the Germans to how much the British knew. Whether or not that story is correct (probably not), as a parable it illustrates how important protecting “sources and methods” can be. And in this case Trump decreed: I don’t care.
The “Gang of Eight” within the Congress is supposed to be the bipartisan bulwark against misuse of the intelligence system. Today a “Gang of Four” — the Democratic half of the full Gang of Eight — protested bitterly against Trump’s decision, and appealed to the FBI and intelligence establishment to ignore it, or slow it down.
“We write to express profound alarm at President Donald Trump’s decision on September 17, 2018 to intervene in an ongoing law enforcement investigation that may implicate the President himself or those around him,” the four Democrats said in their letter. (They are: Senate Minority Leader Chuck Schumer; House Minority Leader Nancy Pelosi; Ranking Senate member of the intelligence committee Mark Warner; and Ranking House member of the intelligence committee Adam Schiff. The four Republicans, who did not sign on, are: Senate Majority Leader Mitch McConnell; House Speaker Paul Ryan; Senate intelligence chairman Richard Burr; and House chairman (sigh) Devin Nunes.) The letter added:
“The action he has taken… is a brazen abuse of power. Any decision by your offices to share this material with the President or his lawyers will violate longstanding Department of Justice policies, as well as assurances you have provided to us.”
So let’s note for the long-term record: no previous president has done this; no minority-party “Gang of Four” has previously had to complain in such impassioned tones; and no majority-party “Gang of the Missing Four” has as distinctly averted its eyes.
Possible leaks of classified material were a huge theme in the past presidential campaign. The winning candidate has now ordained a leak dwarfing anything contemplated back then.
(2) Refugees. One of the glories of the United States, idealistically and in practical terms, is that it has opened its doors to those persecuted or endangered in their homelands. As my wife, Deb, and I discussed at length in our book, once they arrive refugees are on average more entrepreneurial, more education-minded, and more law-abiding than the populace as a whole.
One of the stains on America’s record is when it has turned its back and closed its doors to those persecuted or endangered. Of course the MS St. Louis is the most notorious example, but every day there are similar cases.
There are limits to even America’s absorptive capacity, but every president in the modern era has set them higher than Donald Trump has now done. (You can see the historical patterns here. Two recent Atlantic posts go into the trends too, here and here.)
After the warfare in Vietnam and Cambodia, Jimmy Carter substantially raised refugee admissions, to well above 100,000 per year, and large numbers arrived as well early in Ronald Reagan’s term.
Through the first Bush era and the Clinton years, refugees from the former Soviet Union and the Balkans increased, and average annual levels were between 75,000 and 100,000.
Refugee ceilings fell immediately after the 9/11 attacks, but then rose through the George W. Bush and Obama eras, averaging around 75,000 annually. To put it in perspective: this is roughly 1/4500th of the existing U.S. population — a significant absolute number in international terms, but not among the world’s leaders relative to either population or GDP.
Donald Trump has now set the coming year’s ceiling at 30,000—a one-third cut from last year’s 45,000, and the lowest level since before Ronald Reagan’s time.
I won’t make any more of the moral or practical argument in favor of refugee admission at the moment. Instead I’ll point you to this report by Deb about how refugees have helped invigorate the town of Erie, Pennsylvania. (Plus this.) And I’ll point you to an interactive Esri map, which you can find here, which dramatizes how the flow of refugees into the United States has changed in recent years; where they have arrived; and how many of them (and from where) have settled in any given town, including yours.
Noted for the record, as Jews in America and worldwide are beginning the Yom Kippur fast, and with 49 days to go until the midterm elections.
Step one was running a search for “reeducation center” using Baidu, the Chinese equivalent of Google. He said that led him to news reports that described how local officials, under a policy known as qu jiduanhua gongzuo (“de-extremification work”), were “reeducating” Muslim ethnic minorities—notably Uighurs and Kazakhs—in the northwestern Xinjiang region, which Beijing has long viewed as a breeding ground for extremism and separatism. Step two was using that policy’s name as a search term in Baidu, which he said led him to government websites. Step three was seeing what those websites said about the centers’ activities and locations.
Our partner site CityLab explores the cities of the future and investigates the biggest ideas and issues facing city dwellers around the world. Gracie McKenzie shares today’s top stories:
The urban-rural divide has become a central trope—if not the central trope—in American culture today. But this narrative fails to capture the full complexity of economic life in America, Richard Florida writes.
“Amid a retail meltdown, the malls where teenagers used to hit up American Eagle and Orange Julius could morph into escapist domains for the elderly.” Here’s why a “memory town” could be coming to your strip mall.
Boston-area straphangers don’t have much reason to love their underfunded and often frustrating public-transit experience. But in the background remains a visual identity from the transit authority’s most optimistic (and well-funded) days.
Narrative is one of mankind’s sharpest tools. Many scholars argue that storytelling—our ability to invent fictions, and to collectively believe in them—is what ultimately distinguishes Homo sapiens from chimpanzees. You don’t have to look far to see that narrative is indeed the lifeblood of society; from politics to religion to capitalism to our own personal identity, the foundation of humanity is built upon stories.
Doug Passon, a defense attorney-turned-filmmaker, knows this better than most. In the criminal courtroom, he harnesses the power of storytelling to create sentencing mitigation videos, or emotionally rousing documentaries designed to appeal to a judge’s sense of empathy and humanize Passon’s clients. These biographical short films have one express purpose: to achieve a reduced prison sentence.
Lance Oppenheim’s short documentary, No Jail Time: The Movie, profiles Passon and his controversial courtroom tactic in all its variegated shades of gray. In the process, the film offers a meta-analysis of objectivity in the realm of narrative nonfiction. “Passon treats sentencing videos in an artful manner nearly indistinguishable from narrative-driven, fictional films,” Oppenheim recently told The Atlantic. According to Oppenheim, defense attorneys and sentencing video makers are increasingly drawing inspiration from true-crime entertainment, such as The Jinx and The Thin Blue Line, “to bend the rules of reality in the courtroom with visual storytelling.”
“I would argue that the whole genre of nonfiction filmmaking has largely been rooted in assembling and constructing the messier parts of reality into exciting ‘truths,’” Oppenheim continued, “and seeing this very practice take shape in the courtroom was fascinating to me.”
While videos have historically been permitted in the courtroom, the phenomenon of sentencing mitigation videos became prevalent in 2005, when the Supreme Court decided to relax evidentiary standards in United States v. Booker. This decision allowed courts to legally consider an offender’s “personal history and characteristics”; before the case, a judge’s access to information about a defendant’s past and criminal record was generally restricted. Today, judges are given the ability to accept or reject the presentation of a sentencing mitigation video at their discretion, according to Oppenheim, who believes that the democratization of media-making will cause the practice to become more commonplace.
“In photography and film,” Oppenheim said, “there’s an elusive color tone that exists halfway between black and white, called middle gray. Just like the phenomenon of middle gray, I would argue that sentencing videos exist in an in-between space where legal conceptions of fact and fiction and right and wrong become amorphous.”
For Oppenheim, making No Jail Time and following Passon’s success in the courtroom revealed the unequivocal power of narrative. “I saw how effective [mitigation videos] are at illustrating a disturbing reality—one in which the power of a cinematic portrayal can alter one’s life.”
Americans as a whole don’t regularly wear sunscreen, but Americans of color especially don’t. This is striking given sunscreen’s wide-ranging benefits. It fades acne scars, which can last for weeks or even months. It staves off conditions that are caused or worsened by the sun, such as lupus, which is especially common among women of color. And it protects skin that becomes more photosensitive due to certain medications, including those for high blood pressure—a condition more likely to affect African Americans.
Then there’s skin cancer. While white people get diagnosed with skin cancer more than people of color, black people are less likely to survive that diagnosis, because physicians tend to catch their cancer in later stages.
Pervasive misconceptions about people of color not needing sunscreen are one factor that keeps them from applying it and delays their diagnoses. But there may be another catalyst: Sunscreen often looks terrible on richly pigmented skin. YouTube videos like “Scale of 1-ASHY?!” and “We Put These Sunscreens to the Ashy Test” show women of color trying on different sunscreens that make them look like they’ve put on Phantom of the Opera masks.
These white, blue, purple, and even green masks appear thanks to certain ingredients. Sunscreen companies use various formulas to block two types of sun rays: ultraviolet B rays, which can cause sunburn and skin cancer, and ultraviolet A rays, which can accelerate skin sagging. Chemical sunscreen, a category of sunscreen that works by absorbing rays, tends to protect best against UVB rays—though some formulations protect against both. Physical sunscreen, meanwhile, uses white, water-insoluble compounds such as zinc oxide and titanium dioxide to sit on the skin and act as a physical barrier that deflects both UVA and UVB rays.
In order for their products to be certified as broad-spectrum in the U.S., sunscreen companies have to prove to the Food and Drug Administration that their sunscreen can block both UVA and UVB light. As a result, many use the physical ingredients. Physical ingredients are less likely to cause an allergic reaction: They’re “considered relatively inert, meaning we believe that they don’t really interact with your skin that much,” says Ginette Okoye, the chair of dermatology at Howard University Hospital. The problem is, physical ingredients are the ones that leave a white cast.
Because the FDA regulates sunscreen as a drug instead of as a cosmetic, U.S. companies are especially limited in their choices of protective ingredients, whether chemical or physical. The FDA has been criticized for being slow to approve new ingredients, especially compared with Asian and European countries, which tend to have more ingredients available. Because of the long approval process, U.S. sunscreen companies can be laser-focused on just getting effective treatments out. Okoye points out that these companies don’t necessarily consider the appearance angle. “The purpose of sunscreen is for skin-care prevention. They measure success not by how it looks, but how it prevents skin cancer or sunburn,” she says.
But companies are starting to think more about addressing how their products feel and appear on skin of color. At first, “the larger companies never thought that people of color would spend their dollar on sunscreen, because people of color have a mentality of ‘Black don’t crack,’” says Shontay Lundy, the founder of Black Girl Sunscreen, a company that designs sunscreen specifically for black women. Now, both older, larger companies, such as CeraVe, Banana Boat, and Supergoop!, along with new black-owned skin-care companies—such as Black Girl Sunscreen and Bolden USA, a skin-care company for women of color—have started tinkering more with traditional formulas, with different complexions in mind.
Holly Thaggard, Supergoop!’s founder, points out that zinc oxide, one of the most popular ingredients used in physical sunscreens, for instance, comes in hundreds of different varieties. “If it’s our goal to bring something to market that is more skin-compatible with regard to color, we have some options,” she says. On top of using different iterations of zinc oxide, Supergoop! has also added a tint to its physical sunscreens to make them blend in better with a wider range of skin colors.
CeraVe has reduced the physical ingredient zinc oxide to nanoparticles in some of their products, which lessens the white cast left on skin. But doing so can lessen the zinc’s effectiveness, and Reddit threads say the sunscreen can still leave a cast. (CeraVe didn’t respond to a request for comment through their owner, L’Oréal.) Banana Boat, meanwhile, simply offers a variety of physical and chemical sunscreens instead of just one particular type. “We encourage consumers to select sunscreen based on their skin type, activity level, and exposure need,” says Samuel Vona, the director of research and development at Edgewell Personal Care, the company that owns Banana Boat.
Bolden USA and Black Girl Sunscreen have taken a distinct approach by completely removing active physical ingredients like zinc oxide from their products and opting for chemical sunscreens instead. After two years of going back and forth with labs and the FDA, the sisters Chinelo Chidozie and Ndidi Obidoa, the founders of Bolden USA, launched an SPF 30 broad-spectrum sunscreen this past summer, with avobenzone as one of the active ingredients. But avobenzone, which is used in Black Girl Sunscreen as well, can trigger allergic reactions in some people. “The difficulty in formulation is being able to please everybody,” Chidozie says. “You want to be able to meet the needs of the majority of the people and then for the specific cases that are reacting, you create workarounds.” The sisters suggest people try a patch test to see if the combination of ingredients works on their skin type first.
Cheryl Burgess, the medical director of the Center for Dermatology and Dermatologic Surgery in Washington, D.C., also has workarounds she shares with patients for avoiding—or at least reducing—white casts: “Put it in your palm, rub your hands together, and then put it on.” She also suggests that people mix liquid foundation with a sunscreen to make a tinted sunscreen.
Melanin does naturally protect against the sun’s harmful rays, but skin tone and the amount of sun exposure a person has are important factors. “When it comes to skin cancer, it’s not just about your race, it’s really about the way your skin reacts to the sun,” Okoye says. Even if two people are the exact same complexion, the person who is a professional tennis player, for example, would generally be more at risk than the person who is a computer programmer. But even indoors, windows don’t always block UVA rays, and phones and computers emit blue light, which can damage skin.
The latest sunscreen innovations are allowing sunscreen to protect a wider range of people, but some of the newer alternatives can get quite pricey, often starting at $20 for one to two ounces. They also typically aren’t sold in drugstores. The Bolden USA founders think the accessibility, and perhaps the prices too, will change with time. “Once they have a good following, mass retail usually follows,” one of them says.
Twitter is bringing reverse-chronological timelines back. It won’t be the new default, but CEO Jack Dorsey announced that you’ll be able to go back to the simplest way to organize a timeline with a setting change. Uncheck “show the best tweets first,” and out will go the algorithmically shaped experience—tweets from 4h ago, lingering with tweets from 10s ago—and in will come the old rhythm, the newest tweets first.
Reverse-chron was the schema of what was called Web 2.0. For a time everything was reverse-chron (except Wikipedia). Blogging was reverse-chron. Twitter was reverse-chron. It’s the logic of news: put the new up top. But in the Twitter context, reverse-chron also lets people be all together in real time, watching this thing, the Emmys, the game, the dissolution of the republic, the hurricane, the hearing.
That was the original appeal of Twitter. It put the there in the web. Where was the internet happening? Right there, where all these people were processing it together. It could feel like the “internet reacted” all at once, all its peoples hashing it out.
It was different in the old days, though. Most everyone seems to agree on this. And maybe it was the mishmash of tweets that randomly passed through the tubes at the same moment that made it so.
Twitter always had a high modernist novel’s scope—you peer into the boxes, and see someone having tea, a war you should have known was going on, a parent’s take on a four-year-old, the latest ProPublica investigation, a screenshot of some idiot, a video of a black person being killed by police, an ad for Quiznos, and then Donald Trump tweeting about the television program he’s watching. The stack of information was contextless, traumatizing, and bizarre, but also energizing, the way a city makes you walk faster. It did that, but for your mind.
But as Twitter’s algorithm increasingly selected the most popular tweets to show you, the feed filled with the ones that made you go “What! Ah! Ooooh! Eff that!” To pull down your thumb was to ingest different (quantitatively proven) emotional cues one after the other, your brain a player piano, simply responding to the notes in the feed. No one meant to build such a machine, but there it was. And it was addictive as hell.
At the same time, the things people said on Twitter became real things. Real historians extensively corrected people’s fantasies about the Confederacy on Twitter. People got hired and fired because of Twitter. Innovative companies’ share prices tanked when their CEOs said weird things on Twitter. And, of course, the President did things on Twitter.
This platform juices us up into strange emotional states, and now, whatever people say or do on the platform has ever-more real world consequences. “Never Tweet” was born, on Twitter.
Reverse-chron cannot reverse the development of the platform, nor the changes that have come to the world outside Twitter, the highkeying of everything. But maybe reverse-chron will ever so slightly push Twitter away from what it became and back towards something simpler. The most potent tweets will not all be stacked together. Twitter could still be the place that surfaces important topics that the mainstream media ignores, but with slightly less emotional whiplash. Twitter could feel slightly less like a battleground and more like a healthy corrective conversation. Poco a poco, change for the better?
None of it really does anything to the service itself. It doesn’t return Twitter to the edenic state I remember, and loved, the one that introduced me to new social worlds, brought my attention to important injustices, the one that Kathryn Schulz called “sentences with friends.”
Twitter has become like New York. You love it, you hate it, you can’t leave it, it makes you crazy, it’s getting you down, you leave it. Because the media is all there, and everyone on Twitter sort of becomes part of the media, when you leave, you write an essay detailing the euphoria, the sense of loss, the superiority you feel about those who have stayed, the shrinking halo of relevance that hurts like a phantom limb.
You go back, probably, shamefully re-install it in your mind, tweet a few times to see how many people make fun of you for quitting. But everyone forgot 4 minutes after you left, so, like, whatever.
For me, as the years have gone by, the specific stories, the jokes, the information, the wins—matter less and less. This haunts me. It makes me recall a line from Addiction by Design: Machine Gambling in Las Vegas by the MIT scholar Natasha Dow Schüll. She’s interviewing a compulsive gambler at a slot machine, and this woman tells her that she’s stopped caring about winning. “Why, then, does she play?” Dow Schüll writes. “‘To keep playing—to stay in that machine zone where nothing else matters.’”
Nominally, I’m on Twitter to be informed, to catch potentially useful information, to see the world from other perspectives. All of which happens.
But, emotionally, I’m just on Twitter to be on Twitter. Whatever happened to me over the last ten years cannot simply be reversed by reverse-chron. In real life, timelines are not so easily rearranged.
To understand why women overwhelmingly support a Democratic takeover of Congress—a landslide majority of 65 percent, according to the latest ABC News/Washington Post survey—it’s worth parsing some of the initial Republican responses to the sexual-assault allegations against Brett Kavanaugh. The remarks explain why, on the cusp of the first national elections of the #MeToo era, Republicans on the ballot are confronted with a gender gap that threatens to become an unbridgeable canyon.
After Christine Blasey Ford, a clinical-psychology professor, put her name to the accusation, announcing publicly that she’d passed a polygraph and had shared her story in a 2012 therapy session, Senator Orrin Hatch, a longtime member of the Senate Judiciary Committee’s all-male Republican contingent, told the cameras: “This woman, whoever she is, is mixed up.” He also said that even if the assault accusation were true, the past wouldn’t matter so much: “It would be hard for senators not to consider who he is today.”
His Republican colleague Bob Corker voiced sympathy for Kavanaugh, but none for his accuser: “I mean, I can’t imagine the horror of being accused of something like this.” Donald Trump Jr. joked on Instagram that Kavanaugh had merely had a schoolyard crush. And an unnamed lawyer close to the White House said that the alpha gender is under assault: “If somebody can be brought down by accusations like this, then you, me, every man certainly should be worried.”
After their initial defensive flurry, Republicans quickly recognized that ramming Kavanaugh’s nomination through without affording Blasey an opportunity to testify under oath would be politically suicidal. But even though they’ve hit the pause button and slated a public hearing for Monday, it’s likely that many women in the electorate have already gotten the message, one that mirrors the message they’ve received from Trump Republicans all along: that the ruling patriarchy does not respect, and indeed feels threatened by, the power of women.
Come November, these dynamics could have serious consequences for the Republicans on the ballot. The gender gap—essentially, the difference in the way men and women vote—has generally plagued the GOP at the national level since 1992, when, in the so-called Year of the Woman, Democrats won back the White House after 12 years in the wilderness. Bill Clinton was buoyed by strong female support, and the gap was even wider when he won reelection in 1996. That year, male voters split more or less evenly between Clinton and challenger Bob Dole, but women favored Clinton by 18 percentage points.
The gender split was mostly about policy—that’s why the female majority tended to vote Democratic back then. To name some examples: Women, unlike men, tended to support a more expansive role for the federal government. Women, unlike men, tended to believe more strongly in the importance of a government safety net, and they didn’t like when House Speaker Newt Gingrich targeted it for budget cuts. They also didn’t like when Republicans called for the abolition of the federal Department of Education.
At the same time, women were becoming more economically and professionally powerful, and Republican leaders “just didn’t get it,” as Jack Pitney, a former party strategist, told me in 1996. The same year, Alex Castellanos, a Republican consultant, told me (in a semi-joking manner): “Things were simpler back when the daddy bears brought home the income, and the mommy bears were the caregivers and interior decorators.”
Now, the exodus of women to the Democratic Party appears to be accelerating, and for a more profound cultural reason than policy differences: the belief that Trump and his male allies refuse to fully see them as equal human beings. Trump lost the female electorate by 12 percentage points (although he won white women). Meanwhile, a solid majority of men clearly didn’t care much that candidate Trump had allegedly abused, harassed, or groped almost 20 women; or that Trump responded by calling the women liars and threatening to sue them. The president was similarly hostile last winter to the multiple women who came forward to accuse Republican senatorial candidate Roy Moore of molesting them as minors.
That attitude has apparently prompted college-educated white women, in particular, to abandon the GOP in droves. Last year in Virginia, for example, they supported Democratic gubernatorial candidate Ralph Northam by 16 points, powering his victory, whereas in the governor’s race four years earlier, they’d split evenly between the parties. This female bailout has devastating implications for House Republican incumbents running this fall in suburban districts. House Democratic candidates are typically lucky to break even with college-educated white women—that’s what happened in 2016—but virtually all the 2018 polls report that Trump has turned them off en masse, primarily because of his style and behavior.
It’s the cultural chasm. The Washington Post, citing the polls, reported this summer: “Support for the Republicans among white women with a college degree drops off a cliff after 2016.” Steve Bannon, Trump’s exiled aide, has waved the flag of surrender: “The Republican college-educated woman is done. … They’re gone. They were going anyway at some point in time. Trump triggers them.”
Even Trump may recognize the perils of further alienating those formerly reliable Republican women. The president was uncharacteristically subdued on Monday, declining to attack Blasey’s credibility and signaling a willingness to extend the Kavanaugh confirmation timetable. “If it takes a little delay, it will take a little delay,” Trump said.
His caution reflects the Senate Republicans’ need to tread carefully, not just during the run-up to Blasey’s sworn testimony, but during the Monday committee hearing, when the all-male Republican panel will be challenged to ask hard questions without reducing her to a “mixed up” female stereotype. Presumably, they will refrain from repeating the behavior of their colleagues in 1991, two of whom produced a fake affidavit claiming that Anita Hill—another college-educated woman—had surprised her law students by placing pubic hairs in their exam books. The 1992 Year of the Woman was partly a backlash against the smears Hill was forced to endure.
Mark Twain once wrote, “It is not worthwhile to try to keep history from repeating itself, for man’s character will always make the preventing of the repetitions impossible.” If the Republicans have any last-ditch hopes of narrowing the gender gap before the 2018 balloting begins, they’ll defy Twain’s treatise on human nature and give this latest professor her proverbial day in court.
Back in 2000, a group of mildly inebriated geneticists set up a lighthearted sweepstakes to guess how many genes the human genome would turn out to contain once it was fully sequenced. More than 460 bets were placed, and the lowest guess of 25,947 eventually won when the Human Genome Project was completed in 2003. Fifteen years later, the exact number of human genes is still being debated, with estimates ranging from 19,000 to 22,000. And regardless of the true count, it’s clear that many of these genes are largely unknown.
Since 2003, several researchers have noticed that scientists tend to study genes that are already well studied, and the genes that become popular aren’t necessarily the most biologically interesting ones. Even among genes, it seems, the rich get richer. This trend hasn’t changed in the past two decades, according to a new study from Thomas Stoeger of Northwestern University. Through a massive analysis of existing biomedical data, he found that he can predict how intensely a given gene is studied based on a small number of basic biochemical traits. Most of these, he says, reflect how easy a gene was to investigate in the 1980s and 1990s, rather than how important it is.
“People said that knowing all the genes was going to change everything,” says Luis Amaral, who led the new study. But the 16 percent of genes that were known in 1991 still accounted for half of all biomedical papers in 2015. By contrast, more recently discovered genes remain poorly characterized, and more than a quarter (27 percent) have never been the focus of a scientific paper. Based on current trends, Stoeger estimates that it would take at least five decades before every gene was characterized at the most basic level, let alone fully understood. “There’s a chance that we are missing out on a lot of interesting biology,” he says.
“I find that very depressing,” says Jay Shendure from the University of Washington. “It is stunning that we sit here 15 years after the Human Genome Project, and still know little to nothing about so many genes. In a world of finite resources, it does not make sense to invest equal effort in every gene. But it’s clear that something is amiss in the status quo of research allocation.”
In what Amaral describes as a “heroic effort,” Stoeger spent years collating information from dozens of databases about every known gene. Using machine-learning tools, he then showed that he could accurately predict how many papers have been published about a given gene using just 15 traits.
Some of these telltale traits—how often the gene is mutated, or the negative consequences of losing it entirely—certainly reflect the gene’s importance and its relevance to human disease. They’re the kind of characteristics scientists should be paying attention to.
But other traits—how big the gene is, how active it is, how many tissues it is active in, whether it produces proteins that are secreted from cells, whether those proteins are soluble in water, and more—reflect how amenable the genes are to experiments. Highly active genes, for example, are easier to detect using older methods. “That definitely had a substantial impact on whether you were even able to study a gene in [the 1980s and 1990s],” says Sharon Plon from Baylor College of Medicine. And those historical quirks are better at predicting how the National Institutes of Health currently allocates its money than thousands of other features that more directly reflect what we now know about the role of genes in disease.
It’s possible, of course, that scientists have already identified all the really important genes, and are allocating their attention appropriately. There are good reasons, for example, why p53 is the most popular human gene: It protects our cells from cancer, and is itself mutated in half of all tumors. More broadly, Stoeger found that compared to the least popular genes, the most popular ones are three to five times more likely to have been linked to diseases in large studies, or to wreak havoc when they accrue incapacitating mutations. The problem is that those celebrity genes get 13 times more attention than their neglected counterparts. Scientists do tend to study important genes, Stoeger says, but even then, they do so disproportionately.
That’s partly because there are substantial barriers to studying something that no one else has studied before. A researcher might spend years trying to, for example, engineer a line of laboratory rodents that lack the gene in question. They might create bespoke antibodies or other chemical reagents that can help track or visualize the gene. This all takes time, money, and effort. “Many investigators identify an important gene and then spend their whole career studying it,” says Plon.
To do otherwise is risky. Stoeger showed that over the past two decades, junior researchers who focused their attention on the least studied genes were 50 percent less likely to eventually run their own lab. “Those people get pushed out of the biomedical workforce, and then don’t get a chance to set up a lab that explores some of the previously unknown biology,” he says.
Stoeger and Amaral “have done a remarkable job of comprehensively analyzing the reasons why many important genes are ignored,” adds Purvesh Khatri from Stanford University. “Their results underscore the need to change how we study human biology.”
Amaral blames the research imbalance on the erosion of funding from the National Institutes of Health, which forces scientists to compete for a dwindling number of grants and pushes them toward safer research. “When resources stop growing, the entire system is telling people not to take chances,” he says. The NIH does have grants that are meant to promote innovative, exploratory, high-risk research, but even these end up augmenting the same imbalances: Half of the papers that emerge from them still focus on the same 5 percent of well-studied genes. Even supposedly game-changing techniques like CRISPR have altered the landscape of gene popularity very little. “You get all these new tools but you end up using them on the same set of genes that you were using them on before,” says Amaral.
Within the past decade, only six genes have escaped the doldrums of obscurity and become newly popular, mainly because researchers recently realized that they are medically important. C9orf72, for example, was recently identified as a common link between two neurodegenerative diseases—frontotemporal dementia and ALS. IDH1 is commonly mutated in brain cancers. SAMHD1 protects certain cells from HIV. “It’s clear that if sufficiently motivated, the field can tack,” says Shendure, “but I still would have expected more exceptions. We don’t want communism for genes, but we do want to lower the activation energy for intensively attacking the biology of genes that clearly merit more attention.”
Stoeger and Amaral have already created a wish list of genes that, based on their data, should be easier to study with modern methods, and are probably worthy of attention. They also think that agencies like the NIH should create grants that encourage junior scientists to pursue new and unpredictable lines of research, and, crucially, provide them with enough years of funding to offset the initial risk of heading down those paths. “If we don’t take targeted approaches to incentivize the study of unstudied genes, the system is not going to change,” Amaral says.
It’s Time for the Press to Stop Complaining—And to Start Fighting Back
Earlier this month, Chuck Todd, an Atlantic contributing editor and the moderator of Meet the Press, described how a nearly 50-year campaign of vilification has left many Americans distrustful of the media. His essay urged journalists—and readers—to reach for facts instead of talking points.
Alas, Chuck Todd’s lengthy view of the “problem” with journalism pays mere lip service to the real experiences of those of us who have watched the “unforced errors” of mainstream media for 30 years. Mr. Todd’s explanation of our feelings is that we’ve been manipulated by slick-talking hucksters who exploited our stupidity and naiveté. (By the way, Chuck, I am not an old white guy. I am a middle-aged, black Ivy League graduate.) He leaves little room for the possibility that we are thoughtful, rational people.
Let Mr. Todd continue to explain the world in a way that helps him sleep at night, and avoid any accountability for his contributions to the media’s plight. In the meantime, Trump will fool them again and get elected to another term. You still don’t understand us.
Sheldon L. Thorpe
Stevens, Pa.
I agree with Chuck Todd that journalists must become more aggressive in speaking and writing the facts. I kept hoping he would address the lure of money that TV networks are making off this sensationalism to the detriment of simple truths. These corporate giants need to acknowledge their responsibility in promoting the likes of Donald Trump, who received so much free air-time during the campaign, and to this day floods the airwaves with his tweets and rants.
Mary Lou Williams
Mission Viejo, Calif.
Chuck Todd closes with, “The truth is that most journalists, in newsrooms large and small across the country, are doing their best each day to be fair, honest, and direct.”
We live in an age of subjectivity and grade inflation. What this generation of journalists considers its best would have been considered unacceptable a generation ago. The standard of reporting facts and letting readers form conclusions is currently nonexistent in journalism.
Social activism has usurped humble objectivity as the highest human value. The exceptions are few and far between.
Scot Turner
Sonora, Calif.
Once again I find myself writing after once again reading an overwrought assertion of virtue from a member of the media elite. Look, I agree with much of what Chuck Todd says about toxic right-wing media sources and irresponsible actors. And I definitely agree that the media should defend its coverage. That would be a step in the right direction because then at least we would be acknowledging that the coverage could use some explicit justification.
However, at the same time I found this piece a bit frustrating, as it repeated the same basic frame of the un-powerful media vs. the various suspect powerful forces arrayed against it. What this forgets is that there is a third group—the consumers of media—who are truly un-powerful, and see the media itself as very much a powerful, influential entity. The media should engage with the many viewers who perceive consistently biased and selective coverage and have thus lost a great deal of trust in the stories they read and the narratives that are presented to them.
As Mr. Todd admits, there are clear and consistent signals that people have lost faith in the media. It may be convenient to chalk this up to mind-control by the Rush Limbaugh contingent, but that would be a drastic oversimplification. We can often get defensive when we receive feedback in life, and I would guess it is very difficult for the media to accept that folks have legitimate criticisms. I hope our leading media sources increase clarity, accountability, and honesty in a way that will allow them to return to the positions of trust and respect they deserve.
Sam H. Maslin
Seattle, Wash.
I have often wondered why the press has not stood up to the issue of “fake” news. Every informed person knows that reporters make mistakes. It happens. Good reporters correct them. In this climate of hatred, I find myself tired of listening as some reporters in the mainstream media try to be fair by having “the other side” appear so that they can simply air their bias and hate. Thank you, Chuck Todd. This message was long overdue.
Cheryl A. Principi
Lake Worth, Fla.
I’m 36 years old, and I grew up a die-hard Democrat from a Democratic family in deep-blue Connecticut. I voted for Trump and Republicans down the line in 2016, and that was the first time I had cast a vote for anyone other than a Democrat in my entire life. The complete obliviousness to the built-in bias of the media is a huge part of the reason that I have switched my political affiliation. The fact that Chuck Todd could write this article in all seriousness and believe every word is a shocking display of the complete denial of reality.
Bradford C. Gauthier
Thompson, Conn.
Don’t be defensive. As journalists, you should investigate the media and whether there is bias, or an agenda. (I have never understood how a liberal “agenda” would work. If there is an “agenda,” it would be agreed upon after some discussion, and people would be in place to see to it that it is implemented. Clearly, people in the media do not get together to work on an “agenda.”)
Investigate whether a journalist’s own experiences and thinking put a certain bias into their reporting. There is no other institution that will do such an investigation and it would make journalism stronger to understand this aspect of the profession.
Follow a reporter. Show how the information is gathered, edited, and reported. People in the public do not understand how this works. Perhaps, if they see it, more people will conclude that there is far less bias than they thought and no political agenda at all. Don’t do this just once. Do it many times. If you only do it once it will be too small of a sample and nothing will sink in with the public. As journalists, you can make the stories interesting, I’m sure. If you are just defensive, the defense won’t be credible.
Roger Zahn
Park Rapids, Minn.
What Chuck Todd is really upset about is that we finally have a president who calls him and his colleagues out for their crap. The press did this to themselves by taking sides and attempting to influence outcomes instead of simply reporting the news fairly and accurately.
Journalism is dead in America.
James Davis
Los Angeles, Calif.
It’s definitely time to fight back—to stand up for truth and transparency. How can we help defend your work? How can we do more than just sit here as the dark ash of angry media engulfs us?
As Hurricane Florence passes out of the area, the floodwaters from the record rainfall continue to threaten some parts of North and South Carolina, with major rivers expected to reach their peak flood levels in the coming hours or days. For much of the past week, residents have been evacuating or fleeing from Florence, many of them carrying their beloved furry companions. Dogs, cats, and other animals that were left behind, or were caught in the storm, are now being rescued and cared for by owners, neighbors, and first responders.
Sometimes it begins with the toothpaste. Whenever I go back to the United States from Europe, where I’ve lived for more than half my adult life, I’ll often find myself in a jet-lagged fog at a huge American drugstore staring at the toothpaste aisle. Why? I ask myself, or anyone who’s around. Why are there so many kinds of toothpaste? Whitening, baking soda, clean mint, fresh mint, gel, paste, swirls of gel and paste, kids’ toothpastes, sensitive-teeth toothpaste. Why?
It’s not that there isn’t a variety of toothpaste in Paris, where I live. France is a developed country with a market economy—well, mostly a market economy—and its own large supermarket chains. But there’s something about the toothpaste aisles of the United States that I find emblematic of America’s over-the-topness—the dozens of varieties of everything—everything!—when fewer varieties might suffice—and that I find jarring every time I go back. New York City may be the logical extreme of this. Is there any other city in any other country on earth that’s so accustomed to shopping for anything at any hour of the day, and I mean in stores, not online?
Of course other things leap out, too, on first impact back in New York: glacial air conditioning; cars that seem as big as a studio apartment; chirpy customer service—any customer service; cheese that’s not so much cheese as oil poured into plastic; Amtrak, a national train service roughly twice as expensive and twice as slow as that of any self-respecting (albeit debt-ridden) European country. And then there are the significant social differences: gay people being super out and proud; people of color, and women, in positions of actual authority.
Landing from Europe in New York, where I lived for years and which remains a spiritual home, I always feel acutely how long I’ve been away. It takes days to adjust. So loud! So fast! So much energy! Why did an iced coffee and avocado toast (I know, playing to type here…) just set me back $17? Wait, what happened to the subway? Every visit back begins with a triple whammy: jet lag, sticker shock, and status anxiety. Since I left, real estate prices in New York have soared. (The salaries have not.) Everyone seems so driven, propelling themselves ever forward by force of will and ambition and grit and impatience, reinventing themselves, measuring themselves against one another in ways you rarely see in Europe, or at least in continental Europe, where ambitions seem more circumscribed (or barriers to entry higher), even as state support makes a middle-class existence easier than it has become in America’s leading cities.
But the biggest adjustment is one that’s harder to describe: It has to do with time. Time spent talking at meals. Time in the morning. Appointments in France (and Italy, where I lived for years) are rarely before 9 a.m., more likely after 10 a.m. Power breakfast is not a thing. In Greece and Spain, lunch is well after 2 p.m., afternoon coffee sometimes around 7 p.m., and dinner even later. On a reporting trip to Moscow a few years ago, many of my appointments seemed to come through after 10 p.m. and emails would arrive in the middle of the night. Americans are more likely to be early risers, and have been power breakfasting for centuries. In the marvelous letters he sent home while researching Democracy in America, Alexis de Tocqueville was shocked by this. “We were quite surprised at first to see women appearing at the breakfast table with faces carefully made up for the day. We are told that this is customary in all private houses. Paying visits to a lady at 9 in the morning is not thought improper,” Tocqueville wrote home to his mother from New York in 1831.
He was also struck by American eating habits. “At first we found the absence of wine from meals a serious deprivation, and we are still baffled by the sheer quantity of food that people somehow stuff down their gullets. Besides breakfast, dinner, and tea, with which Americans eat ham, they have very copious suppers and often a snack. So far, this is the only respect in which I do not challenge their superiority; they, on the other hand, reckon themselves superior in many ways. People here seem to reek of national pride. It seeps through their politeness.”
I moved to Europe a decade ago this month, a few weeks before Lehman Brothers collapsed and the financial crisis hit. Many things have happened since then. Being an American abroad feels different now, under this administration, than it did under President Obama. That’s another story. For me, living away from home has become home, as much as home-home is home. There are many things I love about living in Paris—the vast expanse of sky, the functional metro, warm baguettes with demi-sel butter, unlimited cinema movies for €19.50 a month, conversations not dominated by Trump.
Still, I often miss being surrounded by the American enterprising spirit, a spirit I hope I’ll never shake. Occasionally, I’ll have hankerings for tastes from my childhood: Corn on the cob. Blueberries. Junior Mints. Thanksgiving stuffing. Everyone living in a new context has moments like this.
But this, too, has changed over the years. American foods, especially junk foods, have been creeping into Europe for years. You can buy Twix at the supermarket in Paris. And peanut butter. Nothing is supersized—yet. But if and when that happens, I wonder, will the toothpaste selection grow, too? And if so, will that make it harder for Americans to be homesick abroad, or easier for locals to be homesick at home?