What Are the Preconditions for Good Research in the Anthropocene Epoch?

Academic research seeks to discover new things and make testable predictions about our world and beyond. In this process, it sets standards for what constitutes a valid discovery. Credit: gastev/Flickr, CC BY 2.0

Can India’s education policy empower students to pursue questions in stem-cell biology after having graduated with degrees in computer science and music? It ought to.

These are hectic times for the Indian government, which promised change. While its dramatic surgical strikes on terrorist camps and on the economy have taken centre stage, efforts – at least in words – towards transforming education and skill development have faded from mainstream discourse. A new national education policy, for example, is being framed; its stated goals include societal improvement, such as affordable schooling and the education of girls, as well as the development of a ‘world class’ workforce.

This article is not meant to address the question of how best to take education to all cross-sections of society, or how best to teach those who make it to college. Rather, it will state what academic research at the highest level in the sciences expects of its youngest members: graduate students coming in to pursue research that typically leads to a PhD degree.

Why is academic research a good benchmark against which to evaluate high school and college education? Rigorous research requires two things: (a) the ability to think critically and creatively; and (b) the ability to get things done. These two skills are presumably needed at the highest levels even in the commercial sector, so an education that inculcates these two habits in young minds would be of more than just academic interest. That said, we will stick to the benchmarks set by academic research in a field as interdisciplinary as biology.

Do we need creative thinking? Do we need innovation? Over the last few centuries, or on some accounts even a couple of millennia – a period often called the Anthropocene epoch – the genetic evolution of humans has been outpaced to an unprecedented degree by cultural evolution. We have changed the world we live in at such a pace that we are hard-pressed not to become outmoded within decades, then years and, in today’s world of smartphones, months. This has come to pass because of humanity’s ability to innovate. In the long term and in retrospect, not a lot of good may have come of it, but what is done is done. How do we move forward?

Appreciate context-dependent truths

We need a thorough understanding of the horribly complicated web of interactions between the biological and the lifeless world before we can predict the ramifications of any perturbation to this system. And perturbations are coming in thick and fast, not least chaotic kicks to the economy and the atmosphere of every hue, often delivered without a full understanding of how they will unbalance the networks that sustain us. While the job of regulatory bodies is to keep disruptions to a minimum, they would be better placed if armed with data and effective predictive models, which can emerge only from the accumulation of large numbers of research studies, often academic in nature.

Academic research seeks to discover new things and make testable predictions about our world and beyond. In this process, it sets standards for what constitutes a valid discovery. Typically, a hypothesis is framed and experiments are performed to falsify it; the hypothesis stands until falsified. The strength of such a test is determined not only by the choice of experiments performed to support or falsify the hypothesis but also by how well those experiments are done – otherwise, a sloppy experimenter might just ‘falsify’ the theory of gravitation! Thus, a basic requirement of a researcher is the ability to design a scientifically valid experiment to test a hypothesis and to execute it well. And in large part, how well one does these things is a product of one’s training at a junior level.

Even when a hypothesis is falsified, we do not throw the baby out with the bathwater. It often turns out that the hypothesis, while not universally true, holds in certain contexts. Take the case of a hypothetical drug that treats an illness causing much morbidity. It undergoes rigorous clinical trials with many statistical controls in place and comes out with flying colours. It enters regular practice and is a runaway success – until reports emerge of its failure to treat certain patients.

The claim that this drug will treat the illness universally no longer holds. But should one discard the drug? Probably not. Clinical researchers may come together and discover that the patients refractory to treatment with this drug share certain common characteristics revealed by blood tests. Academic researchers might take this further by demonstrating that these individuals share a genetic mutation affecting how the drug is metabolised in the body.

This notion of the context-dependent truths of a hypothesis makes scientific research more complicated to interpret than we would all like, and places a greater premium on critical thinking on the practitioner’s part – a skill that has to be developed from an early stage of a child’s life.

The rapid pace of our cultural evolution has placed us in a unique position when it comes to gathering information. The internet is vast and deep – more vast than deep, given today’s fascination with 140 characters and scoop-whoops – but it is deep for those who seek depth, and that is often believed to be a good thing. However, one does wonder whether ease of access to information is necessarily a positive development. For instance, can it stultify creative thinking?

When a child of today learns a fact and is curious about the ‘why’ of it, it is all too easy for her to find the answer on the internet – rather than having to think of possible routes to the problem, find clues in the library, piece together an answer and finally test whether it holds. In this age of information overload, the child must either be a sage, or the question must lie at the cutting edge of science, for old-fashioned thinking and grunt work to be what it takes to piece together a little puzzle. But children are not sages, and we are not born thinkers operating at the cutting edge. How our educationists view this problem, and how we should deal with it, are questions worth answering.

Appreciate late-bloomers

The web of the internet and digitisation extends beyond spreading information: it extends to getting things done for us. In the biological sciences, there are readymade, easy-to-use software packages that enable us to get a lot done quickly. Their popularity has meant that we have produced a generation of graduates adept at pressing the buttons or typing the commands that run the software – but with little understanding of the methods or assumptions underlying its calculations. For example, there is software that predicts whether, and where, a drug might bind to a protein. But such tools make assumptions that simplify the many complexities of the prediction into a tractable calculation. Many students of bioinformatics in our country routinely use them, yet often lack even a basic understanding of the physical forces that determine such molecular interactions. Needless to say, the blind use of such software, producing pretty but not necessarily accurate outcomes, can be costly.

Finally, our educational system detests late-bloomers and transitioners. What are the chances of an exceptionally gifted student with a B.A. in economics transitioning to, say, chemistry without a detour to Western shores? Very small. Often, the choice of a first undergraduate course – made by a 17- or 18-year-old in the company of parents and under a variety of societal pressures – is a point of no return. And that is bad.

Science often benefits from the humanities: questions about the place of science in our lives are often answered by history and philosophy, and a lack of appreciation for these disciplines can do a practising scientist more harm than good. To my knowledge, only a few select ‘creamy layer’ institutions make an effort to bring humanities training into science courses. And that this is confined to the ‘creamy layer’ is itself a major shortcoming, because admission to these centres of excellence is often determined by a 17-year-old’s performance over the course of a few hours!

I know a fantastic American scientist who taught me many things in my PhD years. He is currently addressing advanced questions in stem-cell biology after having done his degrees in music and computer science. Can our education policy empower our students to do this within the country? That would be a big change, and a welcome one.

Aswin Sai Narain Seshasayee runs a laboratory researching bacterial biology at the National Centre for Biological Sciences, Bengaluru. Beyond science, his interests are in classical art music and history.
