The Tools of Neuroscience Experiment
John Bickle, Carl F Craver
& Ann-Sophie Barwich
Reviewed by David L Barack
The Tools of Neuroscience Experiment: Philosophical and Scientific Perspectives
John Bickle, Carl F Craver and Ann-Sophie Barwich (eds)
Routledge, 2022, £96.00
ISBN 9781032127996
Tools, Theories, and the Web of Knowledge: A Book Review in the Spirit of W V O Quine
This compendium is a fine collection of essays exploring the role of tools in the theory and practice of neuroscience. The book is divided into five sections: ‘Research Tools in Relation to Theories’, ‘Research Tools and Epistemology’, ‘Research Tools, Integration, Circuits, and Ontology’, ‘Tools and Integrative Pluralism’, and ‘Tool Use and Development beyond Neuroscience’.
The book’s main focus is on what I will call the ‘tools-only thesis’. This is to be distinguished from the ‘tools-first’ and ‘tools-also’ theses. The tools-only thesis maintains that tools always and only drive theoretical development in neuroscience. Exemplifying this approach is Bickle’s chapter (‘Tinkering in the Lab’), in which he maintains that ‘in “wetlab” neurobiology, new tool development drives everything else’ (p. 13). He continues (p. 14):
The rejection of mainstream philosophy of science’s theory-centrism is […] an attempt to put theory in its proper place. Theory in wetlab neurobiology is dependent upon the development and ingenious use of new research tools. The genesis of every piece of theory just mentioned can be traced back directly to the development of some new research tools. Theoretical progress in this […] science […] is secondary to and dependent upon new tool development, both temporally and epistemically.
This emphasis on the role of tools in preceding theoretical development, both in time and in the generation of knowledge, frames the entire book. Bickle’s emphasis is on wetlab neurobiology—roughly, the exploration of molecular and cellular processes in the brain. The thesis can be applied more generally, however. The message is clear: tools are more important than, and temporally and conceptually precede, the development of theories in neuroscience.
The four other chapters in the first section of the book serve as evaluations of Bickle’s tools-only thesis. One of these chapters, by Barwich and Xu (‘Where Molecular Science Meets Perfumery’), offers a strong defence of the thesis. They argue that SCAPE (swept confocally aligned planar excitation) microscopy preceded and instigated a conceptual revolution regarding odour coding in the olfactory epithelium: the development of a new theoretical approach to odour coding. The other three chapters in the section are equivocal about the thesis. All three provide case studies drawn from the literature that challenge the thesis. However, all three also provide cases that support it. Johnson (‘Tools, Experiments, and Theories’) tackles Bickle’s claims with two case studies: gene targeting and long-term potentiation, and optogenetics in the study of the amygdala and anxiety. The first contravenes Bickle’s thesis, whereas the second supports it. Atanasova, Williams, and Vorhees (‘Science in Practice in Neuroscience’) discuss the case study of the Cincinnati water maze, used to test egocentric navigation, to explore the role of tools. This case is an instance of innovation in neuroscience arising from tool development. However, the invention of the tool was itself motivated by theory, and so the example both supports and detracts from the tools-only thesis. Hardcastle and Stewart (‘A Different Role for Tinkering’) use the recent COVID-19 pandemic to illustrate both directions of influence: tools before theories and theories before tools. If we accept these treatments at face value, Bickle’s tools-only thesis is falsified, as there are clearly instances where theory drives tool development.
A weaker version, however, is the tools-first thesis, which maintains that tools often, but not always, drive theoretical progress in neuroscience. This thesis survives the various cases offered against the tools-only version of the tinkering thesis.
A notable point about the first section, and one true of the whole book, is its focus on science in practice. I commend this perspective. However, a focus on science in practice does not imply the absence of any role for a critical, even analytic, philosophy. Here I offer an objection to the tools-only and tools-first theses; indeed, perhaps a criticism of the project of dividing up scientific endeavour into tools and theories in the first place.
A lesson from Duhem, Quine, and numerous other philosophers of science is that confirmation and infirmation in science are holistic affairs. The entire body of our scientific knowledge goes up for test in any experiment. Observations relate this epistemic edifice to the world, and when the body of knowledge conflicts with observation, changes at any point in that edifice may be made to make that body of knowledge consistent with what the world offers.
Are tools part of that body of scientific knowledge? I think so. Certainly, some of the chapters in the first section (and throughout the book) support this; take, for example, Barwich and Xu, who frame the importance of SCAPE in olfactory discovery as a literal part of the cognitive process (an extended cognition thesis with which I disagree and for which they do not argue, but about which they are notably and graciously transparent, stating (p. 111) that ‘for the analytically inclined philosopher, this discussion must remain wanting in conceptual detail and argumentative explication’). On such an extended cognition view, tools literally embody our knowledge. But one need not take on such an extended thesis to maintain that tools are the product of, and play some essential role in, scientific investigation. Just as we can change our theory, we can change our understanding of a tool to accommodate discordant observations. Further, tools are the product of our scientific knowledge, the result of applying what we have learned, our concepts, and our theories to the world. Owing both to the contribution of our scientific knowledge to tool use and development, and to the revisability of our understanding of tools, tools are part of (manifestations of?) our scientific knowledge.
But then, granted that tools are part of our scientific knowledge, the role of tools in the scientific process is no different from the role of theories. Whenever tools or theories are developed, we adjudicate those tools or theories against our observations. And observations that disagree with our tools or theories can force a revision in any aspect of our scientific knowledge, tools and theories included.
This point can be made more precise by considering how tools result from theory. All of the examples offered in support of the tools-only or tools-first theses embody some theoretical knowledge or other. That theoretical knowledge may not be neuroscientific; for example, the development of the tungsten recording microelectrode (discussed by Bickle) relied on an enormous and well-developed theory of metals. It might be objected that the tools-only or tools-first perspective concerns neuroscience only, to the exclusion of other bodies of knowledge, such as those drawn from chemistry regarding metals. But one of the key points of the Duhem–Quine approach is that all of our scientific knowledge hangs together.
Suppose, for instance, that there had been some unexpected outcome from recording in the visual cortex of the cat—as indeed occurred with the discovery of orientation-selective cells. An alternative (albeit highly imaginative) explanation is that tungsten emits certain electromagnetic signals when placed in a particular neural context. Of course, this is a complex and speculative alternative. Further, it accommodates the results by changing our theory of metals, not our theories of the brain. But the fact that it gets ruled out without even warranting a thought merely illustrates the importance of background theory in the functioning of our tools. And it illustrates how a change can be made elsewhere in the structure of scientific knowledge to accommodate a particular finding, if we are willing to subject our epistemology to unlimited (and perhaps unwarranted) violence. That we rule out such alternatives without thought only illustrates how all science proceeds against some theoretical background or other; and the fact that tungsten was used instead of, say, grass illustrates how theory drives tool development in neuroscience, albeit perhaps not neuroscientific theory (supposing one can distinguish which bits of our theories are neuroscientific and which are not).
The second section, ‘Research Tools and Epistemology’, discusses epistemic dimensions of the use of tools in neuroscience. Silva (‘Dissemination and Adaptiveness as Key Variables in Tools that Fuel Scientific Revolutions’) argues that cheap, easy-to-use tools disseminate more widely and, as a result, are better drivers of neuroscientific progress. These are features of tools that impact the development of neuroscientific knowledge, including the development of theories, further supporting the tools-first claim. However, cost and ease of use will vary across different contexts and can show the same relativity to our body of knowledge as other aspects of tools. Craver (‘Toward an Epistemology of Intervention’) articulates epistemic norms for intervention, using optogenetics as a case study. He conceptually analyses interventions and identifies eleven epistemically relevant dimensions of variation among them. Tramacere (‘Triangulating Tools in the Messiness of Cognitive Neuroscience’) discusses triangulation—‘the use of multiple information sources to identify an object’ (p. 176)—as epistemic robustness: the reliability of connections between knowledge claims and the facts of the world that those claims are about. She defends triangulation against recent attacks on the concept. Nathan (‘Prediction, Explanation, and the “Toolbox” Problem’) notes that prediction is an overlooked goal of science, one that can be achieved in the absence of explanation. He illustrates the claim with three case studies (genome-wide association studies, biomarkers for illness, and reverse inference in fMRI). This poses the ‘toolbox’ problem: granted the obvious importance of prediction, how should the relationship between prediction and explanation be described? His solution is that the two operate at different levels: ‘prediction and explanation pertain to different levels of description, depending on how much mechanistic information is provided’ (p. 212).
The third section, ‘Research Tools, Integration, Circuits, and Ontology’, explores the role that tools play in integrating different areas of neuroscience, illuminating the kinds that need to be described to explain the neural basis of mind. Colaço (‘How Do Tools Obstruct (and Facilitate) Integration in Neuroscience?’) follows O’Malley and Soyer ([2012]) and O’Malley ([2013]) in distinguishing method, data, and explanatory integration across fields. He also distinguishes between local and global integration. Colaço argues that tools can prevent integration, illustrating the claim with the case of molecular and cellular cognition (MCC) and cognitive neuroscience, which use very different tools to attempt to uncover the neural basis of cognition. But in some cases tools can help with local integration: CLARITY, a technique for rendering tissue transparent, aids all three of method, data, and explanatory integration. Parker (‘Understanding Brain Circuits’) lays out several criteria for ‘circuit understanding’, that is, understanding how neural circuits transform information for behaviour. Parker argues that circuits cannot be reduced to their molecular constituents, and that they are heterarchical, non-decomposable systems whose components act and interact in many different ways. This is well illustrated by ‘diaschisis’, the phenomenon whereby focal lesions in the brain give rise to widespread deficits. By Parker’s lights, understanding circuits involves the interdependent development of tools and theories. Burnston rounds out the section (‘Cognitive Ontologies, Task Ontologies, and Explanation in Cognitive Neuroscience’) with an argument against the standard explanatory framework in cognitive neuroscience, on which psychological kinds are explanatory and the type of explanation is causal. He offers a competing picture on which psychological kinds are heuristics used to guide research but play neither explanatory nor causal roles; instead, explanation of behaviour is accomplished by looking at the demands that tasks place on organisms and the functions performed by brain regions in different tasks. (I’m very sympathetic to the rejection of the standard framework, but I think Burnston misses wildly with the turn to a task-driven ontology for cognition; alas, for another time.)
The penultimate section, ‘Tools and Integrative Pluralism’, illustrates the role that tools have played in advancing the explanations provided by neuroscience. Favela (‘It Takes Two to Make a Thing Go Right’) maintains that improvement in mathematical tools is central to development in neuroscience. He argues by example, discussing the case of scale-free dynamics in neural activity, whose accurate description required the development of fractal concepts in mathematics. Finally, Prinz (‘Hybrid Brains’) offers several examples of the ‘dynamic clamp’, in which in vitro neurons are wired up to a computer that measures and analyses their voltage and provides real-time input back to them (a minimal sketch of this closed loop is given below). This allows precise perturbations and tests of theories of neuronal function. Advances in this technology have allowed such closed-loop feedback in in vivo systems.

The final section, ‘Tool Use and Development beyond Neuroscience’, contains the contribution of Baxter (‘Beyond Actual Difference Making’). Baxter argues that close examination of loss-of-function studies in genetics reveals that previous accounts of the explanatory role of genes are insufficient. These loss-of-function studies, which rely on new tools, have led to the development of novel causal concepts that can better explain the role of genes.
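To make the closed-loop logic of the dynamic clamp concrete, here is a minimal Python sketch of my own (not drawn from Prinz’s chapter, with placeholder parameter values): a simple leaky membrane stands in for the patched neuron, and at each time step the ‘computer’ reads the voltage, computes an artificial synaptic current, and injects it back.

# Illustrative sketch only: a 'dynamic clamp' closed loop, with a mock leaky
# membrane standing in for the real, patched neuron. All values are placeholders.

dt = 1e-4            # time step (s); real dynamic clamps cycle at tens of kHz
C = 1e-10            # membrane capacitance (F)
g_leak = 5e-9        # leak conductance (S)
E_leak = -0.065      # leak reversal potential (V)
E_syn = 0.0          # reversal potential of the artificial synapse (V)
tau_syn = 0.005      # decay time constant of the artificial conductance (s)

V = E_leak           # membrane voltage of the mock neuron
g_syn = 0.0          # artificial conductance computed by the 'computer'

for step in range(int(1.0 / dt)):
    # 1. 'Measure' the neuron's voltage (in a real rig: read from the amplifier).
    V_measured = V
    # 2. Update the artificial conductance; here, a brief pulse every 100 ms.
    if step % int(0.1 / dt) == 0:
        g_syn += 2e-9
    g_syn -= dt * g_syn / tau_syn
    # 3. Compute the feedback current and 'inject' it (in a real rig: command the amplifier).
    I_inject = -g_syn * (V_measured - E_syn)
    # 4. Mock membrane dynamics standing in for the biological cell.
    V += (-g_leak * (V - E_leak) + I_inject) * dt / C

print(f'final membrane voltage: {V * 1e3:.1f} mV')

The point of the loop is that the injected current depends on the measured voltage at every time step, which is what lets the experimenter impose a precisely specified artificial conductance on a living cell.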
In sum, this book is a timely contribution to debates surrounding the philosophy of neuroscience in practice. Some bold hypotheses are ventured and defanged; new analyses of concepts are offered that will help us analyse and understand neuroscientific experimentation and explanation; and the analysis of neuroscience—its tools, theories, and concepts—is advanced on multiple fronts.
David L Barack
University of Pennsylvania
dbarack@gmail.com
References
O’Malley, M. A. and Soyer, O. S. [2012]: ‘The Roles of Integration in Molecular Systems Biology’, Studies in History and Philosophy of Biological and Biomedical Sciences, 43, pp. 58–68.
O’Malley, M. A. [2013]: ‘When Integration Fails: Prokaryote Phylogeny and the Tree of Life’, Studies in History and Philosophy of Biological and Biomedical Sciences, 44, pp. 551–62.
Cite as
Barack, D. L. [2022]: ‘Bickle et al.’s The Tools of Neuroscience Experiment: Philosophical and Scientific Perspectives’, BJPS Review of Books, 2022