Science, The Only Frontier?

Posted March 5, 2018, 6:36 pm

A portrait of Vannevar Bush, seated at his desk, pen in hand. Credit: Wikimedia Commons.


Re-reading Vannevar Bush’s Science, The Endless Frontier, I was struck by how a shortage of scientists was conceptualized in the US in the aftermath of World War II, and by how closely that framing parallels today’s concerns about national innovation, economic growth, and the future of work. For a local example, I need look no further than MIT’s reorganization of the freshman experience, an effort to rethink what an undergraduate at a science-and-engineering-oriented institution should come out knowing and being able to do. In this context, it seems a good moment to think about how higher education (and university research) is funded and structured. What should students be pushed to learn and study?

Vannevar Bush’s famous manifesto, published in July 1945, argued that government should invest in funding basic research. One of his key aims was the creation of a government body (the National Research Foundation, which eventually arose in quite a different form as the NSF) that would be responsible for this task. What I found so striking was that while Bush waxed lyrical about the importance of science in producing new industries, providing jobs, and guaranteeing national health, well-being, and the public good, his report also contains a stern note of warning. He argued that this funding should not come at the expense of “the social sciences, humanities, and other studies so essential to national well-being”.

From the Report of the Committee on Discovery and Development of Scientific Talent, chaired by Henry Allen Moe, Bush quoted the following:

“As citizens, as good citizens, we therefore think that we must have in mind while examining the question before us – the discovery and development of scientific talent – the needs of the whole national welfare. We could not suggest to you a program which would syphon into science and technology a disproportionately large share of the nation’s highest abilities, without doing harm to the nation, nor, indeed, without crippling science. . . . Science cannot live by and unto itself alone. . .

“The uses to which high ability in youth can be put are various and, to a large extent, are determined by social pressures and rewards. When aided by selective devices for picking out scientifically talented youth, it is clear that large sums of money for scholarships and fellowships and monetary and other rewards in disproportionate amounts might draw into science too large a percentage of the nation’s high ability, with a result highly detrimental to the nation and to science. Plans for the discovery and development of scientific talent must be related to the other needs of society for high ability. . . There is never enough ability at high levels to satisfy all the needs of the nation; we would not seek to draw into science any more of it than science’s proportionate share.”

Notwithstanding any ulterior motives for such a perspective on funding science education, these paragraphs raise a question of deep contemporary import: can science live by and unto itself alone? Current arguments for more STEM students, coupled with prioritized visa extensions for foreign students with STEM degrees and institutional funding distributions that favor those fields, certainly seem to suggest an imaginary in which all students would be in STEM. Such an allocation seems somehow optimal in a world of limited resources: after all, the Romantic poets and scholarship on them (including mine) never put anyone on the moon, nor did they cure malaria or ensure full employment.

But what gets left behind in this idea of what schooling is about?

As universities face increased funding pressure and strategize about how to increase their relevance to the needs of business and prepare the 21st-century labor force, they risk being trapped in a reactionary, pragmatic posture. Debates about what universities are for are old, and the options for education certainly need to change to accommodate shifts in the organization of labor (as argued, for example, by Cathy Davidson). But despite assumptions to the contrary, schooling has never been just about producing good laborers.

Oddly enough, I come at this from the perspective of a scholar of AI and a burgeoning ethnomethodologist: some of the same spurious notions of learning that went into the creation of brittle and limited AI systems also lie behind the idea that universities are knowledge factories, or even that they teach identifiable skills. The social and cultural background expected of a manager, a parent, an instructor, a therapist, or a good neighbor, today and in the future, is enculturated by universities too. If the concerns about smartphones and other forms of computationally mediated alienation raised by Sherry Turkle and others are to be taken seriously, the university may indeed be part of a shrinking set of actual-world places in which that enculturation is performed.

Furthermore, while we humanists might wish to support the arts for their own sake, there are practical reasons to provide humanities education to students as well: I believe that humanistic study is at least one source of greater perspective, of deep, personal meaning in the world. If humanistic education is also, as I claim, about exposure and enculturation into a meaningful life, a life well-lived, this too matters for the labor market. Balanced, thoughtful, and well-adjusted people, not just people with the skills du jour, are important for well-functioning organizations and societies.

Scholars in the humanities are under pressure today to explain the value of what they do. I have tried to explore, in brief, some areas in which that value might be found. Seeing this cautionary note from the 1940s, however, leads me to ask how new this pressure really is. When did it begin? Is there a previous moment in which science and technology had something close to their current industrial forms (they arguably did not in 1945), yet alternatives to this work were still socially and intellectually valued as contributing to the well-being of the nation and its people? Where can we find the history to contextualize the position we find ourselves in today? And might that history help us change the terms of the debate?
