The soybean is everything.
It is a crucial source of protein for vegetarian diets, and yet the ultimate destination of the vast majority of soybeans harvested today is feed for animals raised to sate the world’s hunger for cheap meat.
Appreciated for millennia for its precious ability to fix nitrogen in the soil, the soybean is a natural fertilizer that makes possible a sustainable system of crop rotation without additional outside inputs. And yet today the soybean is the flag-bearer for genetically modified industrial monocrop farming dependent on synthetic fertilizers and pesticides. No other GMO crop commands as much acreage as the soybean.
For most of its history, the soybean’s cultivation was confined to its ancestral homeland of East Asia, but it is now one of the most globalized commodities in the world, as crucial to the economies of Brazil and Argentina as it is to the American Midwest. No longer able to grow enough food to feed its own pigs and chickens, China is now the greatest importer of soybeans in the world.
The soybean is as green as it gets and yet the soybean is a driver of climate change. It is a vessel for fossil-fuel dependent, growth-obsessed techno-capitalism and yet also an essential piece of an organic, sustainable future.
The soybean -- fermented into myriad forms for thousands of years, its genetic code finally sequenced in 2010 -- is a biotechnological artifact.
The soybean is a text. The soybean is science fiction.
The soybean is everything.
---
In the upcoming weeks and months, I intend to tell a series of interconnected stories about the soybean, sustainability, fertilizer, technology, trade, and capitalism. The line-up will no doubt shift and flow in unpredictable directions as I proceed, but at the moment, the characters and scenes include a Han dynasty noblewoman buried in the second century B.C., a British East India Company translator who corresponded with Benjamin Franklin about the tofu-making process, the Japanese invasion of Manchuria, an El Niño-precipitated failure of the Peruvian anchovy harvest that forced President Richard Nixon to freeze the export of American soybeans and resulted in Japanese financiers underwriting Brazil’s entry into soybean mass production, a Sichuanese entrepreneur who took advantage of economic reform to become an animal-feed millionaire, and the remaking of the global food system into a transnational capitalist enterprise predicated on the ownership and manipulation of genetic code.
One of the key intersecting themes will be an ongoing reappraisal of the “Green Revolution.” When I was a teenager struggling to make my voice heard at my grandfather’s dinner table, the Green Revolution was discussed in unambiguously positive terms. And why not? What’s not to like about vastly increasing rice and wheat and corn yields across the world, boosting food production in regions plagued by hunger and famine? I grew up regarding the Green Revolution as an example of American international leadership to be proud of: an ingenious, scientifically engineered escape from the Malthusian trap.
I doubt it is a spoiler to any of my readers to observe that the “successes” of the Green Revolution have been bitterly contested in recent decades. Whether we are looking at campesino displacement and habitat destruction in South America, or the requirements for expensive inputs of fertilizers and pesticides that force small farmers into penury, or the intellectual property implications of a handful of companies monopolizing seed DNA, just to dip one’s toes into this topic is to venture into a flame-ridden culture war battlefield. Choose your weapons: slow food or market-driven innovation! Organic manure or Roundup!
I did not bring up my grandfather’s dinner table at random. As I kept encountering references to how the philanthropic work of the Rockefeller Foundation had funded the Green Revolution in a handful of books I retrieved from the library last week, I started considering the possibility that my initial positive feelings about the Green Revolution might be connected to the fact that my mother’s father spent 20 years working for the Rockefeller Foundation as their Director of Biological and Medical Research. His job: figuring out which projects to fund. If not personally involved in the Green Revolution, he was at least adjacent to its creation.
I did what I usually do when randomly curious. I googled “Robert S. Morison” and “Rockefeller Foundation.”
And I discovered something that left me staring at my computer screen in dumb astonishment: an article published just over a year ago on the Rockefeller Archive Center’s website titled “A Roomful of Brains: Early Advances in Computer Science and Artificial Intelligence.”
The article tells the story of how the Rockefeller Foundation “funded a milestone: the Dartmouth Conference on Artificial Intelligence.” The story further recounts how the computer scientist John McCarthy coined the term “artificial intelligence” in the grant proposal that he sent to the Rockefeller Foundation requesting funding for the conference. (His co-writers were a who’s who of early computer science: “Marvin Minsky, who developed the idea of neuron nets and founded the MIT Computer Science and Artificial Intelligence Lab; Claude E. Shannon, pioneer of Information Theory; and Nathan Rochester, developer of the IBM 701, the first general purpose, mass-produced computer.”)
My grandfather was assigned the job of reviewing the proposal!
From the article:
At the time, the implications of the field were understood only by a handful of researchers. Even Weaver, an insider to applied mathematics, tempered his support for the proposal with restraint, although he authorized Robert S. Morison, Foundation Director of Biological and Medical Research, to act as he saw fit.
McCarthy requested $13,500 for a two-month conference, but Morison offered $7,500 for five weeks. As Morison explained,
“I hope you won’t feel we are being overcautious but the general feeling here is that this new field of mathematical models for thought, though very challenging for the long run, is still difficult to grasp very clearly. This suggests a modest gamble for exploring a new approach, but there is a great deal of hesitancy about risking any very substantial amount at this stage.”
(Robert Morison to John McCarthy, 1955)
The uncovering of unexpected connections is my gimmick, my schtick, my reason for being. But it was difficult to wrap my head around the fact that in a matter of seconds I had hopped from the soybean to the Green Revolution to the revelation that my grandfather approved the funding for a conference that historians of computer science later dubbed “the birthplace of AI.”
After all, it was just a few weeks ago that I discovered that ChatGPT was hallucinating false attributions of book authorship to me. To echo my grandfather, this was “difficult to grasp very clearly.”
And that’s just the beginning. Ten years after my grandfather’s death in 1986, as part of the research for my first (and only) book, Bots: The Origin of New Species, I spent months researching the history of artificial intelligence. The importance of the Dartmouth conference wasn’t news to me. Yesterday afternoon I retrieved one of my sources, a book on the history of AI, and found that I had not only dog-eared the specific page discussing that conference, but I had also quoted in my book a sentence on that page reporting comments made shortly after the conference by one of the attendees, Oliver Selfridge:
“Selfridge believed an AI program should be like Milton’s capital of Hell: a screaming chorus of demons, all yelling their wishes to a master decision-making demon.”
Then there is the crazy-making fact that some 40 years after the conference, I offered John McCarthy a margarita on my back porch when his daughter Susan brought him to my summer party, and then proceeded to embarrass him by excitedly introducing him to a string of my other guests as “the guy who coined the term artificial intelligence!” without ever realizing that my grandfather was one of the first people to read the words “artificial intelligence” on a printed page.
I had to take some deep breaths, not to mention dramatically restructure my initial thoughts on all things soybean-related. This was nuts!
But the more I pondered it, the more sense it made. Who, after all, was the first person I knew who owned a personal computer? My grandfather, an early adopter well into his eighth decade. Some memories fade with time, but the experience of sitting at my grandfather’s desk, typing out BASIC programs and playing chess on his Radio Shack TRS-80, is crystal clear. My grandfather introduced me to the digital world.
And even that is just a sideshow. Much more important is the role my grandfather played in steering how I pursue “truth” as a writer and a reporter.
In an article I wrote about my grandfather’s relationship with the prospect of death for San Francisco Magazine five years ago, I included the following story:
I recall another conversation that took place the day before he died. My grandfather was sitting upright on his four-poster bed, his back supported by pillows, books and periodicals strewn by his side. His owlish face was paler than usual, but his mental acuity was intact. I cannot remember the exact subject we were discussing, but I’m pretty sure it was some event or person from ancient history. What I do recall is that he asked me to retrieve from his study the corresponding volumes from two different editions of the Encyclopedia Britannica, separated by nearly 30 years, and bring them to his bedside. He then had me read two articles on the same topic, one from each encyclopedia.
The passage of three decades had resulted in a starkly different interpretation of something that had happened thousands of years earlier. My grandfather mused on how the state of the art of our knowledge about ourselves and our world is constantly evolving, constantly constructed anew. This seemed to please him.
It might seem trite to observe that our understanding of things changes over time; but the point that stuck with me is that this process never ends; no snapshot of “the truth” about anything will ever capture its entirety. The resonance of this observation with the Daoist dictum that “the way that can be spoken is not the true way” should be obvious.
As a reporter, I’m supposed to find out the truth, but every single time I get into the weeds of a story I discover that the more I report, the more complicated and messy and nuanced the story becomes. This is as true about artificial intelligence as it is about the Green Revolution. This will still be true when artificial intelligences start designing new soybeans, if they aren’t already.
My grandfather died before I owned my own personal computer, before I had an email address, before I published my first article that included the word “Internet.” But more than ever before, I feel him looking over my shoulder as I write these words on my laptop and prepare to fling them out into cyberspace. I’d like to believe he would get a kick out of this newsletter.
And since honoring one’s ancestors is one of the most important ritual responsibilities in Chinese civilization, I hereby dedicate this series on the soybean to him.