Poetry News

Nick Montfort & Andy Fitch Talk About The Future

Originally Published: February 05, 2018

At BLARB, Andy Fitch talks to MIT Professor of Digital Media Nick Montfort about his book, The Future (The MIT Press, 2017), and such questions as "by whom does the future get made?" and "how might critical study harness the most liberatory aspects of this process?" "The book is about how imagination and invention can intervene to make what’s going to happen to us, to make that thing called 'the future,' in some ways better," says Montfort. A further excerpt from their conversation:

Italian Futurism, as you have indicated, provides an early-20th-century instance of language itself getting reshaped and newly mechanized as a result of technological change. You yourself at times have called for an “intellectual revolution” in the humanities, presumably with the arts included. More precisely, you have advocated for a revolution driven less by some tepid “digital humanities” topicality, than by systematically expanding our capacities for thinking, exploring, searching, researching. And as we continue to emerge from the long-term historical stasis that your book sketches (the eons of apparent atemporality in which humans thought they could perceive — or in which grammar and its denominations taught them to presume — fixed essences, identities, divine metaphysical harmonies), how might this intellectual revolution continue to refigure not just our definition of “human nature,” but our very prospects for self-reflection? Or what would it take, say, for computational arts or computational humanities to offer computer science new insights into itself? Or how might computer-science-enhanced humanism help us not only to predict and quantify particular future outcomes, but to conceive, visualize, clarify, and realize desirable social valuations? How could the humanities get back on the side, or have they ever been on the side, of helping to drive future-making rather than reacting to it (or even impeding it)?

Humanistic and educational ideas were very important to Seymour Papert, Alan Kay, and a lot of the other pioneers in computers and particularly in personal computing. That includes John Kemeny and Thomas Kurtz, who developed the BASIC programming language at Dartmouth. That was in the context of a mathematics department, but they were trying to educate people at a liberal-arts college, and to enrich education to include computing. Of course BASIC became very prominent, a widely used programming language among young people who had personal computers. So there’s probably at least one manifesto that replies to your question. I’ll be specific here rather than being bold and visionary. I’ll be more incremental in discussing this. In the humanities and arts, we have some digital projects going on here and there, but with computation involved in a very simple way that has not been revolutionary. Consider fields like computational linguistics, computational biology, or computational economics — or, in a different way, architecture. These are fields in which computing is essential. Not only is computing essential, but, for instance, if you’re writing an economics dissertation, you can’t ignore (even if you’re not doing computational economics yourself) the results that have been achieved in computational economics. And the same would happen with computational biology. You need to discuss it. You have to include it. The other example, just to put a point on it, would be if an architect building in a major world city said: “I’m going to build this building, and I won’t use a computer.” That would obviously be the most radical, perverse gesture one could make. In all these cases, the computer is not optional.

In the humanities and arts, it’s optional. You don’t have to say anything about the work of digital humanities when you’re doing your literature dissertation...

Read on at BLARB.