At around 3pm on October 24 1968, a sharply dressed executive from the computer manufacturer Control Data Corporation took the stage in the auditorium of the National Bureau of Standards in Gaithersburg, Maryland.
He was addressing the audience of the second annual symposium of the American Society for Cybernetics.
The society, a nexus of academics, spies, policymakers and businesspeople, was dreamt up a few years earlier by a CIA operative. It was designed to counter the USSR’s growing clout in computing and mastery of “cybernetics”, the precursor to today’s artificial intelligence. Consensus in the US of the late 1960s was fractured by foreign and domestic conflicts, but cybernetics promised to reassert control, deploying computers to tame the chaos and make life predictable again. The man from Control Data Corp, himself a CIA confidant, was there that day to sell a plan for what he called “communal information centers”, to make CDC’s supercomputers serve the public by providing news, recipes, public health monitoring, even dating advice. Computers, he told the audience, were going to be our “willing slaves . . . employed in the service of mankind”.
Until 3.32pm, visions of cybernetic paradise washed over the audience. But then a lanky man in his late thirties strode up to the podium. He was strikingly handsome, with a meticulously groomed goatee that gave him the look of a bohemian English professor crossed with a Nordic deity. Behind him was another man, a decade or so older, wearing an oversized checked blazer and round glasses.
“I don’t know who you are, sir, and this isn’t personal,” the second man told the speaker, as he grasped him by the elbow and moved him away from the microphone. “But I’m tired of listening to this.”
Avery Johnson and Warren Brodey made for an unlikely pair of rebels. They had been early members of the American Society for Cybernetics, but now they were leading a countercultural revolution in the small, staid world of computing. When they crashed the stage, they were hoping to stave off what they saw as an imminent catastrophe. They believed computer makers such as IBM and CDC were steering society down a perilous path.
At the time, these manufacturers and half a dozen others were in gentle competition to develop and sell their enormous mainframe machines, most of which were still programmed by punch cards and used for handling payroll and inventory. While the era of personal computing, tablets and smart appliances was still a distant dream, it was a period of intense excitement and experimentation. It is hard to imagine now, but in 1968 the basic question of what computers were actually for had no obvious answer.
Johnson and Brodey believed these companies had overlooked a crucial philosophical question about the technology they were working on: were computers really destined to be mere slaves, condemned to an eternity of performing repetitive tasks? Or could they be something more? Could they evolve into craftsmen? While slaves unerringly obey commands, craftsmen have the freedom to explore and even challenge directives. The finest craftsmen do more than just fulfil orders; they educate and enlighten, expanding our horizons through their skill and creativity. Johnson and Brodey wanted to wrest control away from those eager to mass-produce an army of subservient machines.
To bring their vision to life, in late 1967 they had established a clandestine, privately funded lab on Boston’s waterfront, aiming to personalise computing nearly a decade before Apple’s Steve Jobs and Steve Wozniak had the same idea. Their vision was bold, utopian and radical. Had they succeeded in swaying their peers, the tech we use today would look remarkably different.
But Johnson and Brodey were much more than just predecessors to the Apple founders. Johnson had a PhD in electrical engineering from MIT and had consulted at Nasa. An occasional sleepwalker, he would wake in the middle of the night, sit upright in bed and program an imaginary computer. A fan of beatnik poetry, skinny-dipping and luxury cars, he was also the heir to a sizeable fortune. His great-grandfather had founded what would eventually become the Palmolive Company.
Brodey, the ideologue, was an ex-psychiatrist with a penchant for the dramatic: he once brought a toy gun to a session with his analyst. He saw the rigidity and non-responsiveness of an industrial system of mass production as the cause of so many empty, conformist American lives. At first, computers seemed to promise to upend this status quo, but the more Johnson and Brodey observed them, the more they realised computers were just as likely to enforce conformity as do away with it.
Their vision of computing was not about prediction or automation. The tech they were building was supposed to expand our horizons. Instead of trusting a computer to recommend a film based on our viewing history, they wanted us to discover and appreciate genres we might have avoided before. Their tech would make us more sophisticated, discerning and complex, rather than passive consumers of generative AI-produced replicas of Mozart, Rembrandt or Shakespeare.
Over the past decade, I’ve tried to unravel the legacy of Brodey, Johnson and their lab. This June, I launched a podcast delving deeper into their story. My journey took me from Geneva to Boston, to Ottawa, to Oslo, where I hoped to recover an idiosyncratic, humanistic and largely forgotten vision. I wanted to understand when and how our digital culture veered off course.
What I discovered was that the types of interactivity, smartness and intelligence that are baked into the gadgets we use every day are not the only kinds available. What we now consider inevitable and natural features of the digital landscape are in fact the result of fierce power struggles between opposing schools of thought. With hindsight, we know that Silicon Valley ultimately embraced the more conservative path. The Homo technologicus it produced mirrors the Homo economicus of modern economics, valuing rationality and consistency, discouraging flexibility, fluidity and chance. Today’s personalised tech systems, once the tools of mavericks, are more likely to narrow our opportunities for creativity than expand them.
Consider the much-criticised ad for Apple’s latest tablet. In “Crush!” a colossal hydraulic press steadily obliterates a mountain of musical instruments, books, cameras and art supplies, to the strains of Sonny and Cher’s 1971 hit “All I Ever Need Is You”, leaving behind only an ultra-thin iPad. This one device, we are meant to understand, has within it all the capabilities of the demolished objects. We won’t be needing them any more.
Was there another way? Perhaps.
In the photos from the symposium, the only woman stands out: 23-year-old Sansea Sparling (née Smith), then the resident artist at Johnson and Brodey’s lab. Clad in a sleeveless, A-line dress with a bold black-and-white pattern, Sparling looks like the lone representative of the hippie generation at the event. But what was she doing at a male-dominated computer conference? Even for the late 1960s, her presence seemed improbable to me.
Last June, I travelled to New Haven, Vermont, to visit Sparling, who is now 79. Her home is nestled in an abandoned lime quarry, with water clear as a polished mirror reflecting dramatic rock faces. I wanted to hear the story of how she became entangled with Johnson and Brodey.
Sparling had grown up in a small town in Arkansas, so when she got the chance to study art in Boston she jumped at it. To support herself through school, she worked odd jobs, even as a waitress in Mafia joints. In Boston she moved in the same circles as Avery Johnson, whose house on Beacon Hill was a magnet for all sorts of bohemians. One day in 1968, Sparling told me, Johnson mentioned an intriguing opportunity. “He said, ‘I met a very interesting man who wants to start this project.’ And I asked, ‘What kind of project?’ He replied, ‘Well, I don’t know how to describe it.’”
Johnson said the project would be financed by a man named Peter Oser, a mysterious Swiss millionaire who was about to arrive in Boston. Oser’s background was illustrious. One of his great-grandfathers was John D Rockefeller, once the world’s richest man. The other was Cyrus McCormick, the father of modern agriculture. Oser’s mother was so close to Carl Jung that she built him a guest house on their Los Angeles estate. The psychoanalyst owed much of his international fame to the support of Oser’s grandmother, whom he once called his most difficult patient.
Sparling met Oser the next day at 10am, and their conversation lasted until 2am. Gradually, the details emerged. “He wanted to fund an 18-month experiment with a lab of four to six people from diverse academic backgrounds,” Sparling said. What for? Something to do with expanding the “ecology of thinking”, a concept that initially perplexed her. Ecology is the study of the interconnectedness and diversity of living systems, “but I didn’t understand how that would ever relate to anything we could experiment with”, she said. Yet experiment they did.
It turned out there was a way for technology and ecology to coexist after all. The secret was the concept of “responsiveness”. Early, more conservative strands of cybernetics had fixated on the simple model of the adaptive thermostat, marvelling at its ability to maintain a preset temperature in a room. Our modern-day smart-home systems, which intuit our preferences and automate everything, quietly adapting to our needs, are just fancier versions of this idea.
But the rebels at the lab thought this kind of automation was the antithesis of true responsiveness. They saw human relations, art and identity as open-ended, always-evolving ecologies that could not be reduced to the thermostat’s simplistic model of optimisation. Can one really pinpoint the “right” cinema, music or loved one in the same way as the right temperature of a room? Today’s TV, music and dating apps seem to think so. The Boston contrarians did not.
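The contrast the lab was drawing can be made concrete. Here is a minimal sketch (my illustration, not the lab’s work, with hypothetical names) of the “thermostat” model of responsiveness that early cybernetics fixated on: measure one variable, nudge it toward a fixed preset.

```python
def thermostat_step(current_temp: float, target: float = 21.0) -> str:
    """Return the heating action that moves the room toward a preset target."""
    if current_temp < target - 0.5:
        return "heat on"
    if current_temp > target + 0.5:
        return "heat off"
    return "hold"

# The loop always converges on the same preset temperature; nothing about
# the occupant's changing wishes can ever enter the picture.
readings = [18.0, 20.8, 22.3]
actions = [thermostat_step(t) for t in readings]
```

Recommendation engines that optimise a fixed proxy for taste work on the same closed pattern; the Lewis Wharf group’s objection was that human preferences, unlike room temperature, have no stable “target” to converge on.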
Within a few months, a new space emerged on Lewis Wharf, then a gritty enclave in Boston’s North End. The Environmental Ecology Lab, as it was known, occupied a spacious third-floor loft in a granite building dating back to the 1840s. The white brick walls, creaky wooden floors and rustic beams gave it an old-world charm. Expansive windows framed views of the harbour, evoking the building’s storied past. Once a bustling warehouse, it had stored an array of goods, from blankets to Madeira wine, all of which were unloaded at the pier below.
The lab also boasted its own quasi-patron saint, Marshall McLuhan, the influential media theorist and friend of Brodey’s who had once visited. McLuhan promoted the idea of the “anti-environment”: distinctive spaces that illuminate overlooked elements of our everyday surroundings. The types of spaces, McLuhan would say, that “tell fish about the water”. The lab was one such anti-environment, promising to jolt its visitors out of the numbing uniformity of their everyday world. Everything was meant to shock, provoke and stimulate.
Visitors to 33 Lewis Wharf likened the experience to the mind-expanding effects of psychedelic drugs. Upon entering through an imposing metal door, visitors were confronted with huge, floor-to-ceiling cellophane bags suspended from above. These bags, equipped with sensors, would inflate and contract, demanding effort to push between them. Once past this barrier, the bewildered guests found themselves disoriented by an eclectic collection of objects. Two bulky, expensive computers, bristling with wires, stood out. The back half of a Ford car was a favourite place for brainstorming sessions. A giant glass dome resembling a bell jar served as a space for private conversations and, occasionally, for smoking dope (it was the 1960s, after all). And that’s not to mention the paintings, the musical instruments and the various other strange materials, including a giant slab of foam.
The ceiling was festooned with strips of Mylar, two video cameras, a parachute. A sofa-like structure, hung from overhead springs, was the venue for business meetings. Dubbed “The Cloud” (a name that would prove prophetic), the pie-shaped installation had six connected sections, ensuring that the way one person sat and moved affected the experience of everyone else. Underneath, sensor-operated coloured lights flickered in response to the movement. Sparling took on the challenge of ensuring The Cloud’s safe suspension. “Looking for eight-foot springs in Boston — it’s an interesting adventure,” she told me.
Her partner in decorating the lab was Oser, who funded the entire experiment and brought his stage design expertise to the project. This was just the latest chapter in Oser’s eclectic career. At Reed College in the late 1940s, he’d mingled with future Beat poets. He dabbled in Scientology when it was still called “dianetics” and once followed a Ouija board’s suggestion to move to Angola, where he ran a sawmill. He produced French New Wave films and owned a tech company in his native Switzerland.
Oser also had a deep love for science fiction. Frank Herbert’s novel Dune, with its expansive vision of ecology, left an indelible mark. He found a kindred spirit in Brodey, who was particularly struck by a single line in the book’s appendix. It described the “production and maintenance of co-ordinated patterns of greater and greater diversity” as a fundamental principle of life. Oser and Brodey envisioned their lab as the place to produce perpetual diversity machines, designed not to streamline but to enrich human experience.
Easier said than done. One of their first ventures into this strange territory was “the dancing suit”, a peculiar garment that would allow a dancer to influence the music they were listening to by changing their moves. They embedded copper wires into elastic bands, which were then sewn into a full-body leotard to capture every movement. “If I had a band from my forearm to my upper arm and I bent my elbow, that would stretch the band, and the copper would send a signal,” explained Sparling, who bravely volunteered to wear the suit for tests.
The signals were transmitted to the “squawk box”, which generated and altered the music based on how the dancer moved. The resulting sound was far from harmonious, but the experiment validated a broader philosophical point. Dancing became something qualitatively different when it could influence the music’s “ecology”. Establishing a two-way feedback loop between movement and sound redefined both art forms. While the person wearing the suit might not have transformed into a Rudolf Nureyev or Richard Wagner by the end of the session, they gained a more holistic understanding of both movement and sound. Whatever role the tech played in this case, it was certainly not that of a slave.
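The principle behind the suit can be sketched in a few lines. This is purely illustrative — the function names, resistance values and pitch mapping are my assumptions, not the lab’s design — but it captures the feedback idea: stretching a copper band changes its electrical signal, and that signal reshapes the sound the dancer hears.

```python
def stretch_to_pitch(resistance_ohms: float,
                     rest_ohms: float = 100.0,
                     base_hz: float = 220.0) -> float:
    """Map how far a sensor band is stretched to a pitch for the sound box."""
    # Stretching the band raises its resistance above the resting value.
    stretch = max(0.0, resistance_ohms - rest_ohms) / rest_ohms
    return base_hz * (1.0 + stretch)  # more bend -> higher pitch
```

Bending the elbow stretches the band, raising the resistance and so the pitch; the dancer hears the change and moves differently in response, closing the loop between body and sound.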
The next major project drew inspiration from Brodey’s professional traumas. Early in his career as a psychiatrist, he had come to view psychiatric hospitals as prisons where patients endured brutal treatments with no recourse. When I met Brodey in 2014, he was in his early nineties; he is now 100. He recounted his clashes with hospital administrators. “They wanted me to give young people shock therapy,” he said. He refused and soon had to find another job.
Sparling remembers Brodey posing a question: “What if, instead of being strapped down on a hospital bed, patients could influence their environment while restrained, making it not just tolerable but even pleasant?” The thought led to a responsive restraint blanket, which would stay loose if the patient lay still but tighten with abrupt movements. They devised a method involving a chemical that expanded from liquid to gas upon skin contact.
The invention marked a new chapter for the lab, leading to what the founders called “flexware”, a fusion of hardware and software, combining physical materials with the customisation of a computer program. Soon, Brodey envisioned responsive chairs, mattresses and even baby bottles. Everyday objects could be freed from rigid materials, their forms following not just function, but use.
For 30 years, our understanding of technology and its role in our lives has been shaped by Silicon Valley’s dominant ideology, which I term “solutionism” in my book To Save Everything, Click Here. Solutionism posits that all problems, whether personal, social or political, can be solved through tech. Excited by advancements in computing, connectivity and profitability, tech founders have championed the idea that their products are the ultimate tool for fixing any social ill.
Solutionist thinking has given us frictionless smart cities, where sensors monitor everything from traffic to waste management. It has driven the development of wearable devices that track our health metrics and social media platforms that claim to enhance our personal connections. But it also reduces complex human experiences to data points and ignores the contexts in which problems arise. The relentless drive for optimisation threatens to produce ludicrous or terrifying outcomes of the kind satirised in TV shows such as Silicon Valley and Black Mirror.
My interest in the Boston lab was sparked by the hunch that its members were the early proponents of solutionist thought. But after speaking with the people who worked at Lewis Wharf, I realised they were in direct opposition to those kinds of smart technologies. Unlike some critics of Big Tech today, they did not champion a return to vintage or “dumb” tech. Instead, they envisioned a kind of digital smartness that remains almost unimaginable to us today. They saw people as fickle and ever-changing, qualities they did not view as flaws. In 2014, when I asked Brodey about the possibility that his responsive mattresses and chairs would be able to find an ideal position for each user, his response struck me: “That wasn’t our purpose,” he said. “There is no ideal anything, because we are constantly changing. We’re not like machines.”
He is right; machines we aren’t. But the wrong technologies can make us machine-like. And maybe they have. Perhaps this is the root of our discomfort about the direction of the digital revolution: that rather than making machines more human, it is making people more mechanical. Speaking at a 1967 conference, Brodey minced no words: “man becomes captured, captured behind the grid of what can be programmed into the machine . . . We have been captured by automobiles, by houses, by architecture, simplified to the point of unresponsiveness.”
The maddening efficiency of our digital slaves has obscured the idea that human agency depends on constant course correction. As Brodey noted in 1970, “Choice is not intellectual. It’s made by doing, by exploring, by finding out what you like as you go along.”
Sparling told me that a key question driving the lab’s work was, “What can we discover that allows the person in the loop to learn and progress with whatever they are trying to do?” The common thread uniting projects such as the dancing suit and the restraint blanket, she said, was their celebration of improvised learning — jazz style — as the core value that should underpin interactive tech.
“Imagine a future where your interface agent can read every newswire and newspaper, catch every TV and radio broadcast on the planet, and then construct a personalised summary.”
This visionary idea comes from Being Digital, the 1995 bestseller by Nicholas Negroponte, once a protégé of Brodey. Negroponte is renowned for co-founding the MIT Media Lab, which he describes as a technological Bauhaus blending art and computing. His work there profoundly influenced the digital revolution. He once joked about the Media Lab’s opening: “Our speaker was Steve Jobs, our caterer was Martha Stewart . . . I told both of them that we launched their careers.”
Negroponte, an early supporter and columnist for the techno-utopian Wired magazine, had a knack for outlandish visions of the future that resonated with his readers. He came up with a name for his newspaper-reading curatorial assistant, “The Daily Me”. He fantasised about its ability to “mix headline news with ‘less important’ stories relating to acquaintances, people you will see tomorrow, and places you are about to go or have just come from”. To a reader in 2024, it sounds a lot like the social media feeds we have come to love and hate, which function as reliably as any thermostat, their dependability rooted in the constant observation of our behaviour.
Negroponte was an early ally of the Lewis Wharf gang. As a young architecture professor, he visited the lab and attended seminars by Johnson at MIT’s Sloan School. Brodey, two decades his senior, was a crucial mentor. In an interview in London in October 2023, Negroponte described Brodey to me as “one of the earliest and most important influences” on his thinking. Negroponte came from a wealthy Greek family; his father was a prominent figure in the shipping industry. The young Nicholas wanted to become a sculptor in Paris, but he struck a deal with his father: he would spend five years studying at MIT before pursuing his artistic dreams. He never made it to Paris. Instead, he earned multiple degrees from MIT and became a professor there, pausing his academic career only briefly for a stint at IBM in 1966, where, as Brodey put it to me, “there was money”.
Brodey described Negroponte’s mission at IBM as a search for meaningful uses of its computers by architects, planners and designers. But by the late 1960s, Negroponte had shifted his tactics. Instead of pitching computers to the professional classes, he began touting the ways in which they could help ordinary users bypass these experts altogether. Negroponte built on many of the concepts from the Lewis Wharf crew. Influenced by their playful attitudes, he acknowledged that future smart technologies could be whimsical, imagining elevators with personalities ranging from courteous, to grumpy, to humorous. He also adopted Brodey’s ideas of “intelligent environments” and “soft architecture”, proposing that sensors, computers and algorithms in built environments would allow automatic adjustments to our needs and habits. His goal was to achieve a level of customisation beyond human capability. Responsive, yes, but a far cry from what Johnson and Brodey were dreaming of.
By 1995, Negroponte had fully capitulated to the idea that human curiosity can be assessed, predicted and satisfied by clever programming, with a touch of algorithmically injected serendipity in the mix. This belief was the unifying theme of his work at the Media Lab. From there, the idea travelled to Silicon Valley, eventually finding its perfect manifestation in our social feeds and algorithmic playlists. By the time he wrote Being Digital, Negroponte had brokered a peace treaty between the slaves and the artisans. Recycling a metaphor he first used in the early 1970s, he argued that digital technologies should be like well-trained English butlers.
Sceptics among us would say that the world in 2024 bears more resemblance to Brodey’s gloomy prophecy of a mankind captured by machines than to the tech-utopians’ vision of armies of willing slaves working for us. When I visited Brodey in 2014, he lamented, “the alienation is pretty fucking complete at this point. And the computer has really done it in large measure.”
At the time, I didn’t press him on what he meant by that word, “alienation”. But over the past decade, we’ve stayed in touch, and the reason for Brodey’s disaffection has become clearer to me. Nearly half a century after he disrupted the cybernetics conference, his original concern has hardly been addressed. Instead, tech manufacturers, wearing a sort of countercultural camouflage, have sold personal computers to us as ever more human, ever more intimate, simply by shrinking the handsets and making them more affordable and user-friendly.
For Brodey, the epitome of an intelligent environment was the classroom, a space designed to ignite new desires. By contrast, Negroponte’s was the living room, a hub where we fulfil existing needs, for entertainment, shopping and occasionally work. Existing, as we now do, in Negroponte’s living room, we can only wonder about what our digital universe would look like if it had been modelled on a classroom.
Negroponte’s vision won the future largely because corporate America and the Pentagon favoured simple, utilitarian solutions. They had no interest in dancing suits and cloud sofas. Instead, they bestowed substantial funding on to the precursor to the Media Lab, the Architecture Machine Group, which expanded and prospered throughout the 1970s, taking grants from Darpa and other parts of the Pentagon to work on interactive projects such as early forms of virtual reality. Johnson and Brodey’s lab met a different fate, plagued by internal strife. The two men frequently clashed with Oser and Sparling, leaving other lab members stranded in between. Their grand ambitions compounded their troubles; they struggled even to make their prototypes functional.
Oser became deeply depressed by these setbacks. He pulled his funding from the venture in 1970 and died of a heart attack almost exactly a year later. Sparling became a metalworker and, later, a teacher. Johnson and Brodey continued their efforts into the early 1970s, growing increasingly radical. With no revenue from their business activities, Johnson had to foot all the bills, putting further strain on the partnership. They rejected funding not only from the military but also from MIT, viewing both as tainted by the Vietnam war. Their attempts to secure corporate backing failed too; few companies were interested in their responsive products.
Disillusioned, Brodey left the US, his five children and his ex-wife in 1973. He moved to Norway and lived as a Maoist. Within a few years, he was writing letters to his friend Marshall McLuhan from an iron foundry, where he took work as a manual labourer. His political awakening had led him to a hard truth: the diversity of choice that his lab had championed was not something that could be achieved through technology alone.
Despite the lab’s failure, Johnson and Brodey’s insights carry an important message. If we want technology that expands our choices, we must recognise that someone has to fund it, much as our governments fund public education or arts and culture. Achieving this on a massive scale would require an effort comparable with the one that initiated the welfare state.
Consider this. Dumping all the world’s classical music on to your Spotify playlist, no matter how refined its recommendations, won’t turn you into a connoisseur. Yet, isn’t there a way to harness the latest technologies to serve that mission? Here is a radical idea that Silicon Valley will not admit: technology is not just about freezing, stratifying and monetising existing tastes. It can also deepen, sophisticate and democratise them. This kind of post-solutionist approach seems more realistic than continuing to hope that legions of algorithmic slaves can solve all our problems. Despite the hype, generative AI — even if made widely accessible for free — is unlikely to spark a revolutionary wave of creativity and might, in fact, hinder it by depriving practising artists and educators of stable incomes. Making tablets ever thinner and more powerful won’t get us there either.
Perhaps Johnson and Brodey should have read the room back in 1968. After hijacking the podium and explaining their vision, they invited anyone interested to “come on, come up and stand here”. Only two people did. One of them, an elderly gentleman in a bow tie, a cybernetics grandee and former psychiatrist, only stepped up on to the stage to restore order. He reassured anyone interested in joining the cybernetic hippies that they could do so at lunch the next day, provided they promised not to throw any food. And so, the first — and last — cybernetic rebellion came to an end. It was a short-lived affair, but its lesson is clear: a tech with which we can truly interact is still a distant dream.
Evgeny Morozov’s podcast series “A sense of rebellion” is available at sense-of-rebellion.com