In this post I want to reflect on how technology changes what is expected from scholars and scholarship – and I include students and their work in this. It’s important here to remind ourselves that the emergence of Generative AI is not the first time that technology has changed these expectations, and in that reminder we should find some solace amid the panic we might feel rising in our chests about GenAI and the coming disruption of Higher Education. To illustrate my point I offer a personal account of what was expected of me as a doctoral student and how technology had altered that from what would have been asked or expected of me a generation or two before.

In the late 1990s I was working on my PhD (later amended and published as a book) looking at understandings of ‘desire’ in the Buddhist Pali Canon. Even without the commentaries and sub-commentaries the Pali Canon is substantial – 48 volumes by some estimates. Because it was transmitted orally before being written down there is a lot of repetition, but that didn’t help me! I wanted to know what words for desire were found in the texts, in which texts and contexts they occurred, and how they were used or omitted in key doctrinal statements. I was hoping to explore the extent to which the condemnation of desire in this corpus was potentially more subtle and complex than in many representations of Theravada Buddhism. If I had done the project in the 1870s I would have had to travel, learn scripts and spend years in Myanmar or Thailand, and could only have hoped to discover some of these things. It is likely my output would have been a summary of my archival research. If I had studied in the 1970s I would have had access to the Pali Text Society editions of many key texts, still in Pali but rendered into Roman script (with few commentaries and no sub-commentaries). I would have needed access to a large library – London and Oxford libraries hold many of the key books – and to learn Pali. I could have produced a much larger survey and maybe even squeezed a little analysis into my work too – though it would still, of necessity, have been broadly archival work.
As it was, in the late 1990s I was studying at the University of Sunderland with Professor Peter Harvey, and he alerted me to the 1997 release of a CD-ROM – the Vipassana Research Institute (VRI) Chattha Sangayana (Sixth Council) Edition, which included the commentaries and sub-commentaries: 135 volumes, in Roman, Burmese and Devanagari scripts. It was $200. I asked the University to help and, after some mammoth form-filling, the University sent off for a copy, which was with me within weeks. I had an Intel 486DX4-based PC running Windows 95 (I couldn’t afford a Pentium processor) and popped in the precious disc. After a bit of working out the interface I was off. Within a week I had done the searching mentioned above, but much more fully and reliably. All of it. The results were still in Pali, not English, so there was some work to do, but within a week I had completed what would have taken years and travel around the world (the 1870s version!), or at least around the UK’s libraries at any point up to 1997. It is hard to convey just how much power a few simple searches had, and just how much work would have been necessary to get to the same point previously. That I had access to 135 volumes, with results in seconds, meant that this flimsy, shiny piece of technology changed what was expected of me. I couldn’t hand in my week’s searches and some translations and think that was my PhD – even though that would have exceeded what I would have needed to do just a few years earlier. The technology had raised the bar, or at least shifted expectations substantially.
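As an aside for readers who like to see the mechanics: the equivalent of those CD-ROM searches is now a few lines of code over plain-text files. The sketch below is purely illustrative – the folder name is hypothetical, and the simplified, unaccented spellings of a handful of Pali terms connected with desire (tanha, chanda, kama, raga) stand in for the fuller set of terms and diacritics a real search would need – but it shows the kind of keyword-in-context search that the VRI interface made possible.

```python
# Illustrative sketch only: a keyword-in-context search over a (hypothetical)
# folder of plain-text Pali volumes. Term spellings are simplified and unaccented.
from pathlib import Path
import re

SEARCH_TERMS = ["tanha", "chanda", "kama", "raga"]  # stand-ins for the real, accented terms
CORPUS_DIR = Path("pali_canon_txt")                 # hypothetical folder of .txt volumes

def keyword_in_context(term, text, window=40):
    """Yield each occurrence of `term` with a little surrounding context."""
    for match in re.finditer(term, text, flags=re.IGNORECASE):
        start = max(match.start() - window, 0)
        end = min(match.end() + window, len(text))
        yield text[start:end].replace("\n", " ")

for path in sorted(CORPUS_DIR.glob("*.txt")):
    volume = path.read_text(encoding="utf-8")
    for term in SEARCH_TERMS:
        for snippet in keyword_in_context(term, volume):
            print(f"{path.name} | {term} | ...{snippet}...")
```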
In my autobiographical example, my PhD supervisor expected me to now do something with the outputs from my searches. The technological advances meant that what would perhaps have been a substantial contribution to knowledge a few years earlier was now done by a machine in minutes. It took a week only because I had to work out how to run the searches – the results themselves were almost immediate, with a bit of whirring from the CD-ROM drive. It turned out that, because of this access to technology, it was only now that the real hard work of the PhD began. I was expected to do a lot more reflection on, and analysis of, the outputs from the searches, and to expand the frame to see what other materials in other religious traditions might be pertinent, using my prior reading and knowledge. There had been a shift in favour of doing what (at that time) only a human could do. If I were doing the same study now, all the texts – and very many in English translation – would fit on my phone via an app, and I could ask ChatGPT to offer contextual analysis in various ways. The ‘what only a human could do’ would have shifted a bit further. What would this have meant? Can this give me a glimpse of what students now might be asked to do more of, as GenAI again moves the goalposts? I would have had to do more analysis, and importantly more synthesis.
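To make that concrete, here is a hedged sketch of what ‘asking ChatGPT for contextual analysis’ might look like if done programmatically rather than through the chat interface, using the OpenAI Python client. The model name and the placeholder passage are assumptions for illustration, not a record of anything I actually ran.

```python
# Hedged sketch: asking a model for first-pass contextual analysis of a passage.
# The model name and passage are placeholders; OPENAI_API_KEY is assumed to be set.
from openai import OpenAI

client = OpenAI()

passage = "..."  # a Pali passage and a rough translation would go here

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You are a careful scholar of Theravada Buddhism."},
        {"role": "user",
         "content": ("This passage uses a term for desire. Summarise the doctrinal "
                     "context and note any nuance in how the term is used:\n\n" + passage)},
    ],
)
print(response.choices[0].message.content)
```

The point is not the code, of course, but that retrieval and first-pass commentary are now near-instant – which pushes the distinctively human work further towards synthesis.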
Synthesising from multiple sources is often a feature of desired learning outcomes in final-year undergraduate and postgraduate Quality Assurance paperwork – and is seen as a higher-level skill, drawing on the cognitive domain of Bloom’s taxonomy. While Universities, again following Bloom, often place ‘evaluation’ above synthesis as the highest learning outcome, I want to dwell on ‘synthesis’ for a moment. Evaluation is an end point, a judgement, but in ‘synthesis’ we are doing the work that leads towards an evaluative claim, and we can learn from, or dispute, any judgements by engaging with the detail of our ‘synthesis’ work. What humans do in synthesising is often about drawing together material from outside the immediate thing being studied – about bringing in things we didn’t know it would be useful to know in addressing a topic. In scholarly endeavour we never quite know in advance what we’ll need to know in order to progress, or to illustrate our point. The things ‘only a human can do’ will be different in different academic disciplines, but the epistemic uncertainty of what we might need to synthesise seems a good start for some of them. In my case, I was unable to predict that poetry I had read at school, or novels I had randomly picked up from the selection my mum had stored in the room I was using, would spark a reflection on human desire that found its way into my thesis.
In reflecting on this anecdote in the context of our current concerns over Generative AI, there are clearly other examples of technology raising the bar of our expectations of submitted scholarly work. Word processors have allowed neat, correctly spelt work to become (largely) the norm. Newer tools like Grammarly represent another shift, and are part of the conversation about how much is too much ‘assistive’ technology and which skills still need to be developed, even if they can be outsourced to an extent. Spreadsheets and other tools allow calculations and comparisons to be a minor part of large-scale quantitative work. Months of lab work can be reduced to days or less via a range of tools/systems*, so students are judged on a hugely different set of skills from those they would have been required to demonstrate before the tools took that work out of the process. What only the human in the lab could do has shifted. And keeps shifting. This is going to have to be our reflection as we are forced to radically rethink how Higher Education works in a GenAI world – the bar on what only humans can do is rising, and we need to amend our teaching and assessment to match this. Quickly. And often.
—————————–
*I don’t know these areas, so I asked ChatGPT-4: ‘what machines save time on lab analysis compared to 30 years ago?’
It told me:
The advancement of technology over the past 30 years has significantly impacted laboratory analysis across various fields, making processes faster, more accurate, and capable of handling complex analyses that were previously time-consuming or impossible. Here are some of the key machines and technologies that have transformed lab analysis:
1. **Automated Liquid Handling Systems**: These systems automate the pipetting, mixing, and transferring of liquids, greatly reducing the time and potential for error associated with manual handling. They’re crucial in high-throughput screening, drug discovery, and molecular biology experiments.
2. **Real-Time PCR (qPCR) Machines**: Compared to traditional PCR methods, real-time PCR allows for the simultaneous amplification and quantification of DNA samples, significantly speeding up genetic analysis and disease diagnosis.
3. **Next-Generation Sequencing (NGS) Technologies**: NGS platforms can sequence DNA and RNA much more quickly and cheaply than the first-generation sequencing machines of 30 years ago. They enable comprehensive genome sequencing, metagenomics studies, and personalized medicine approaches with a fraction of the time and cost.
4. **High-Performance Liquid Chromatography (HPLC) and Ultra-High Performance Liquid Chromatography (UHPLC)**: These technologies have evolved to provide much faster and more efficient separation, identification, and quantification of compounds in a sample than earlier chromatography techniques.
5. **Mass Spectrometry (MS) and Tandem Mass Spectrometry (MS/MS)**: Advances in mass spectrometry, including increased resolution, accuracy, and speed, have revolutionized the analysis of complex biological samples, enabling detailed proteomics, metabolomics, and lipidomics studies.
6. **Automated Cell Counters**: These have replaced manual counting under a microscope for many applications, providing faster and more accurate cell counts in research and clinical diagnostics.
7. **Flow Cytometers**: Modern flow cytometers can analyze thousands of particles per second, sorting and classifying cells and other particles based on size, granularity, and fluorescence. This capability is essential for immunology, cancer research, and stem cell studies.
8. **Digital Pathology Systems**: These systems, which digitize glass slides for analysis on a computer, have improved the speed and accuracy of pathology analyses. They enable easier storage, retrieval, and remote viewing of samples, enhancing collaboration and diagnostics.
9. **Robotics and Automated Synthesisers** for chemistry and pharmaceutical research have drastically reduced the time required for the synthesis and testing of new compounds, enabling rapid screening of large compound libraries for potential new drugs.
10. **Laboratory Information Management Systems (LIMS)**: While not a machine for direct analysis, the implementation of LIMS has significantly streamlined the management of samples, data, and workflows in laboratories, reducing manual data entry and errors, and increasing overall efficiency.
These technologies have collectively enabled breakthroughs in research, diagnostics, and treatment that were unimaginable 30 years ago, profoundly impacting science, medicine, and many industries.