Magic Tools and Research Integrity [Republica Repost]

Published in the Republica on March 23, 2021.
Plagiarism is a manifestation of a deeper problem in academia: Of publishing for the sake of publishing, and of rewarding it regardless. 


“Do I need to cite a source if a plagiarism detection tool doesn’t show that I’ve borrowed an author’s words?” asked a participant at a research workshop recently. “I will have to rewrite much of my article if that’s the case.”

I was not surprised. Instead, I started wondering where the question was coming from. In op-eds and other discussions, I’ve seen plagiarism treated as a problem of stealing words (rather than ideas). For instance, in a recent, highly nuanced proposal for apology as a mode of redemption for those who have plagiarized in the past, the author casually claimed that there are now technological tools for “easily” identifying and preventing such cases. Academic leaders and institutional policies alike, I remembered, exude the same incredible hope.

What is worse, questions about the quality and integrity of research, not to mention its social value and responsibility, are overlooked in discussions of its originality. Across South Asia and the rest of the global south, there is an increasingly misguided focus on the product of publication, rather than on the ends to which it is a means, reflecting what current policies demand and reward. Even when “impact” is talked about, it simply refers to proxy measures of the product’s quality, such as the number of citations (which may be mere name-dropping, including of one’s own work). Indeed, that is what “journal impact factor” means. When “quality” is invoked explicitly, that too simply means that the venue is “international” (or not locally located) or that the product is in English rather than a local language. If these critiques sound radical, it is because the status quo is absurd: it rewards publications that may have no significant value.

It is not just that someone can reap rewards by simply paraphrasing or summarizing others’ ideas. They can also make progress by fabricating or manipulating data. Either way, the magic of technology fails whenever scholars fail to ask what specific tasks specific technologies can perform and how, where they can be bypassed, and what can be learned from using them.

Current policies can be gamed, or rendered socially useless, in many more ways: Stealing ideas from unpublished sources beyond what the tools check against, using ghostwriting services, adding authors to distribute credit, adding one’s name to research done by students, publishing in venues that are not publicly accessible, publishing in substandard journals created overnight, ginning up citation scores through citation cartels, paying to publish with commercial publishers that put up a facade of review, and publishing with predatory venues that skip review entirely. What magic tool is going to address all of these?

In essence, systems built on technological safeguards fail miserably when people neglect to ask whether thieves are outsmarting the magic guards at the bank, or whether counterfeit bills printed right at home are slipping past those guards.

Symptoms versus issues

The real issue with originality is not stealing. It is not even honesty and intention. It is the lack of inherent quality and value in the research, the missing relevance and impact of the publication, and the scant seriousness and social accountability of the author. Plagiarism is one of many manifestations of a deeper problem in academia: Of publishing for the sake of publishing, and of it being rewarded, regardless. Without substance and integrity, publishing for publishing’s sake misdirects emerging scholars. And publication done solely for jobs and promotions does not even push current knowledge forward.

I’m reminded here of a food science article that reported findings from a study of how long a vegetable lasted when moisture, temperature, and airflow were varied. When I did some layman’s research online, I was disappointed to find that the same effects on that vegetable had been documented for decades. I didn’t just wonder who might use the article’s findings (or how they might access them) but why the research was done at all, other than to add a line to a CV.

The emerging publication regime is perpetuating old problems and creating new ones. It is driving shallow, thoughtless objectives. It is discouraging humbler research projects, attention to local needs and opportunities, and publication in local languages. If, for instance, a project on mental health support were rewarded more for reporting outcomes, the researcher would not just “find” that there is stigma against mental illness and recommend that professionalized support is necessary. The author would go beyond stating the obvious, to show how and why something was done.

Current policies and discourses across the global south are encouraging the production of supposedly global-value scholarship that is locally junk, and vice versa. An example is the whole set of articles with similar titles, such as “Effectiveness of Information and Communication Technology in Nepal’s Higher Education.” Each article asks how effective ICT use was at a certain institution, finds and reports the same problems (again and again), and offers recommendations that are not based on the research. These articles may get published in international journals if they generate some interest (or sympathy) among editors and reviewers, but they don’t add anything useful on the ground, or offer new perspectives to global scholarship on any issue involved.

The demand for international, indexed, high-JIF, English-medium publications can help produce higher-quality scholarship, at least from a minority of scholars. But the vast majority of scholars shouldn’t have to run after these proxies of useful knowledge. They should be rewarded for contributing to and improving local venues, for writing for readers of different languages, for advancing social justice. In fact, the few select scholars at the top of the pecking order should feel professionally, socially, and ethically responsible for doing the same.

Need for a richer ecology

Journal articles (especially “international” ones) have become a fashion rather than a means to higher ends. Not everyone should have to pursue this goal. For instance, a group of hydrologists studying the relationship between water usage and the water table in the Kathmandu Valley may not have all they need to publish in Nature. But they should be rewarded for presenting their findings in the right places. When the application side is rewarded more, more scholars will work with professionals and policymakers, trade groups and government bodies, and for-profit and non-profit organizations.

A richer ecology of scholarship, shaped by a fairer distribution of rewards, would include edited books and special issues, op-eds and research reports, conference proceedings and papers, scholarly presentations and research training (given or received), fieldwork and support for students, grants received or applied for, research initiatives that don’t result in publication, blogs and vlogs, participation in or contribution to research projects, and even evidence of continued study in the discipline. It would extend beyond the formal and textual to the less formal and oral, the material and multimodal, the incidental and indigenous, the artistic and performative, the analytical and empirical, the narrative and reflective, the action-driven and community-based.

Many emerging scholars, as well as students, need mentoring, especially from those who keep publishing themselves. Support works best when those who preach also practice. Then there’s a need to promote collaboration. As in capitalism, where competition alone can leave a few at the top accumulating most of the wealth (unmoored from labor and from the common good), competition alone can be counterproductive to knowledge production for social good. Giving scholars credit for the number of researchers they mentor, rewarding work done with students, and recognizing good research and scholarship that may not end in publication are some ways of fostering collaboration and mentorship. And institutions must reward and support journals run by networks of scholars within disciplines and specialties, not by departments/colleges or political unions.

Motivation, fostered by localization, best drives quality. Judging quality by social value also discourages cheating. Most importantly, all boats must be lifted, by increasing opportunities for women, scholars beyond the capital, and members of marginalized communities. Just a little extra support, trust, and resources can hugely boost talent among traditionally excluded scholars.

Working the magic

Plagiarism, if we look closer, reveals problems in the publication landscape, in incentive policies, even in the vision of leaders. So scholars do need to stoke the fire on current views about publication, on policies that need to change, on superficial solutions. But we are also responsible for filling the gaps we complain about, with our own actions and sacrifices.

Some scholars have already started recognizing and rewarding more diverse and more socially responsive scholarship by working at the level of their departments and colleges. Networks of scholars are sharing support for publication. I’ve assisted many initiatives virtually in recent years, the latest involving a hundred scholars from around Nepal, supported by nearly another hundred scholars as mentors and resource persons.

What scholars cannot afford is gullibility. Of course, technology is advancing so rapidly that it’s easy to find it magical. But if we are to address the deeper causes of the many problems with research integrity, plagiarism being one, we must be the magicians who understand the tricks technology seems to perform, not the audience fooled by them.


