During the “Improving Trust through Communication” session of the 2021 AGM and International Conference “Trust in Science”, Lisa Herzog (University of Groningen) unveiled “Science with Society” – the SCISO Project.
The SCISO Project wants…
… to raise awareness among (young) scientists about the role of science in society, science ethics, and science communication as a way of building trust in science. It provides insights not only into how trust in science can be built and what some of the challenges to trustworthiness are, but also into practical tools of science communication. The project will produce a series of short, accessible videos on topics such as scientific integrity, the role of values in science, the relation of scientific knowledge to other forms of knowledge, and principles of science communication.
The SCISO Project is …
… a collaboration between the Global Young Academy’s Working Group “Trust in (Young) Scientists” and the German National Institute for Science Communication (NaWik), funded by the VW Foundation. Because of the coronavirus pandemic, not all videos are ready for the 2021 AGM, but a first set of videos is ready for you to watch here.
Watch the videos…
Video: Being responsible for your research
Science can cure diseases, create crops that are resistant to pests, and make our everyday life so much easier – if you believe that science can help save our planet, you aren’t wrong. But you need to be aware that in science, the good and the bad are often closer than they seem. The inventors of substances that protect you from the sun didn’t foresee that the same substances can harm the environment.
And of course, biological research that helps us understand how bacterial spores from the soil can infect cattle is really important. But in 2001 this research was used for a bioterrorist attack. The artificial intelligence systems that researchers all over the world are currently working on can give senior citizens more autonomy and make driving much safer.
But they might also be used in autonomous weapon systems. This is a typical example of what is called dual-use technology. Another example of dual use is rocket technology, which carried men to the moon and is also used in intercontinental ballistic missiles. And by the way, don’t think this is only an issue for engineering – research from psychologists, for example, can be used to manipulate customers.
So, who is responsible for the many consequences that innovations can have? The scientist or engineer who created the invention? The military who used it to kill and destroy? The politicians who should have come up with laws and regulations that would have prevented the worst? Well, being a scientist yourself, you might have guessed the answer: most ethicists agree that scientists and engineers are at least co-responsible.
After all, they are the ones who know best about their inventions and who are therefore most capable of foreseeing any harm they could cause. This is why they have to carry at least some co-responsibility for their inventions. Are you thinking about your own work and what kind of consequences it might have? Are you already feeling the weight of responsibility? Not sure how to shoulder it?
Well, there are frameworks developed for just that kind of situation – the European Responsible Research and Innovation Framework, for example. It rests on four pillars: Anticipation, Reflexivity, Inclusion and Responsiveness.
Let’s have a look at the first pillar, Anticipation. That basically means you have to ask the “What if…?” questions: What if by chance anything goes wrong with this experiment? What if someone else uses my innovation in a different way? There are many “What if?” questions you can ask about your project. Don’t hesitate to ask them all! If you are not sure about the answers to your “What if?” questions, or about the questions themselves, talk about them with your colleagues and also with your family and friends.
Pillar number two: Reflexivity. It’s not just that you as a scientist have to reflect on your work and its impact on society; there is also a need for broader reflexivity at the institutional level, where the division of labor, hierarchies within science, and the responsibility of science as part of society in general are discussed and reflected upon.
The third pillar of the European framework on responsible innovation is important for almost all aspects of this ethics tutorial: Inclusion. This simply means that you should include all stakeholders and the public in the process of research and innovation as early as possible. By doing so, you make sure that the values, needs and expectations of society are addressed.
The last pillar is Responsiveness, and that’s probably the hardest one to achieve, because you and the scientific institutions you work for have to be able to change direction quickly in response to changing circumstances. That happened, for example, in many labs during the coronavirus pandemic, but it shouldn’t require a crisis to do so.
You might think that this framework is way too idealistic and schematic, and that it would never work in your country or in your research field. Well, you might be right. When it comes to responsible innovation there is definitely no “one size fits all” solution. Scientists in India and Brazil, for example, have their own very interesting ideas about the issue, taking the specific situation and the intellectual traditions of their countries into account. And if you don’t like frameworks at all, you can learn from the example of other scientists: in 2006, nano researchers published a call in the journal Nature for a research agenda on potential risks.
We’ve put the link on our website – check it out! But the most important thing is this: you are a member of society, so the future problems your research might cause would affect you too. So don’t be afraid of asking the hard questions about risks and values. And get lay people involved early on, because questions like these are never purely scientific.
References and more links (without any claim to completeness)
On anthrax and the anthrax attacks
On scientists’ responsibilities concerning dangerous technologies
- Koepsell, David (2010). On Genies and Bottles: Scientists’ Moral Responsibility and Dangerous Technology R&D. Science and Engineering Ethics 16: 119-133.
- Douglas, Heather E. (2009). Science, Policy, and the Value-Free Ideal. Pittsburgh: University of Pittsburgh Press, chap. 4.
- Somerville, M.A., & Atlas, R. M. (2005). Ethics: a weapon to counter bioterrorism. Science, 307, 1881–1882.
- Guston, D. H., & Sarewitz, D. (2002). Real-time technology assessment. Technology in Society, 24(1–2), 93–109.
- Ehni, H.-J. (2008). Dual use and the ethical responsibility of scientists. Archivum Immunologiae et therapiae Experimentalis, 56, 147–152.
- Maynard, A. D. and J. Stilgoe, Eds. (2017). The Ethics of Nanotechnology, Geoengineering and Clean Technology. The Library of Essays on the Ethics of Emerging Technologies. London: Routledge.
- Institute of Medicine, National Academy of Sciences, and National Academy of Engineering (1995). On Being a Scientist: Responsible Conduct in Research, Second Edition. Washington, DC: The National Academies Press. https://doi.org/10.17226/4917.
- A historical case of insufficient ethical care: https://www.sciencehistory.org/distillations/the-death-of-jesse-gelsinger-20-years-later
On the ethics of “dual use” research
- Nixdorff, K., & Bender, W. (2002). Ethics of university research, biotechnology and potential military spin-off. Minerva, 40, 15–35.
- Miller, Seumas, and Michael J. Selgelid (2007) Ethical and Philosophical Considerations of the Dual-use Dilemma in the Biological Sciences. Science and Engineering Ethics 13, 523-580, https://link.springer.com/article/10.1007/s11948-007-9043-4
The European Responsible Research and Innovation Framework
- Jack Stilgoe, Richard Owen, Phil Macnaghten: “Developing a framework for responsible innovation”, Research Policy 42 (2013), 1568-1580 (this is the paper that describes the four pillars).
- See also https://ec.europa.eu/programmes/horizon2020/en/h2020-section/responsible-research-innovation.
- Online platform about the framework: https://www.rri-tools.eu/about-rri
Commentaries on that framework from various regions of the world.
- Zhao, Yandong and Miao Liao (2019), Chinese perspectives on responsible innovation. in: International Handbook on Responsible Innovation: A Global Resource, ed. by René von Schomberg and Jonathan Hankins (Edward Elgar), 426-440.
- Macnaghten, et al. (2014) Responsible innovation across borders: tensions, paradoxes and possibilities. Journal of Responsible Innovation, 1:2, 191-199, DOI: 10.1080/23299460.2014.922249.
- Srinivas, Krishna Ravi, and Poonan Pandey (2019): “Indian perspectives on responsible innovation and frugal innovation”, in: International Handbook on Responsible Innovation: A Global Resource, ed. by René von Schomberg and Jonathan Hankins (Edward Elgar), 455-473.
Nano scientists writing about ethical risks in their field
- Maynard, A., Aitken, R., Butz, T., et al. (2006), Safe handling of nanotechnology, Nature 444, 267-269, available at https://www.nature.com/articles/444267a
Other examples of scientists calling for ethical reflection about new technologies
- A call for a moratorium of research on heritable genome editing: Lander, E. et al. (2019), Adopt a moratorium on heritable genome editing, Nature 567, 165-168, available at https://www.nature.com/articles/d41586-019-00726-5
- A call for scientists, not industry, helping to write the rules for artificial intelligence: Benkler, Yochai (2019) Don’t let industry write the rules for AI. Nature, 569, 161 https://www.nature.com/articles/d41586-019-01413-1
- An ethics initiative for autonomous and intelligent systems, by the IEEE (Institute of Electrical and Electronics Engineers): https://ethicsinaction.ieee.org
Video: Trust in Science around the World
Subtitles: Script 4 Ethics Tutorial – Trust in Science around the World
No matter where you are watching this right now… here or here, or maybe even here. Nowadays science is everywhere. But what do people all over the world think about science and scientists? Well, that question was investigated in a survey from the Wellcome Trust in 2018. They asked over 140,000 people in 140 countries. And they got some encouraging results. For example, three quarters of the world’s population have confidence in their own health care system. But wait a minute – does that mean one quarter does not trust their doctors and nurses?
Let’s have a closer look. Germany is a country whose healthcare system did relatively well during the COVID-19 pandemic. Lisa, a philosopher and GYA member, lives there: “In Germany, trust in science seems to be high according to surveys, but people get suspicious whenever there are controversial political topics, and they ask if some of the studies have been paid for by private interests – and that’s of course a very good reason to ask critical questions.”
In Germany it is the pharmaceutical companies in particular that have to cope with a lot of distrust. That’s probably one reason why some parents don’t want their children to be vaccinated against measles. The same problem occurs in other countries as well – France, Ukraine and the Philippines, for example. But in most countries of Latin America, the anti-vaccination movement isn’t such an issue. At least, that’s what Clarissa – another GYA member – has experienced: “In Latin America we don’t have problems of anti-vaxxers – of people thinking that vaccines are not good. On the contrary, each time there is a new case of measles, for example… A kid and a family from Europe came to the airport in Costa Rica and it was a whole fuss around the country, because they were angry. They think: they are vaccinating their children, why do they have to have that risk in their country?” Costa Rica hadn’t had any case of measles for years, until in 2019 a French family brought the disease into the country. Understandably, Costa Ricans were upset that they had to deal with measles again.
So, almost no problem with the anti-vaccination movement in Costa Rica and other Latin American countries – at least so far. But that doesn’t mean all is well between science and society in this part of the world. Let’s have a look at Peru, Clarissa’s home country: “Sometimes in Peru we have earthquakes, and when that happens there are many physicists who go to the TV channels and talk about what has happened and what the implications are. But we also have some TV programs that bring in shamans or people who work with tarot and things like this. They are also invited to the TV, and this leads the population to really think that the shaman is gonna tell them when the next earthquake is gonna be.” People listening to religious leaders rather than to scientists – that’s a phenomenon that’s happening in many countries.
Let’s go to Nigeria, where Adewale, a GYA chemist, lives: “It has always been religious belief, you know. A lot of people, they believe in God, and they believe that most of the things you see happening around you are just miracles.” What is it about science and religion? The Wellcome Global Monitor has some interesting results on this topic: among people with a religious affiliation, 55% would side with their religious teachings in a disagreement between science and their religion. And it’s not just Africa where religious beliefs can cause problems for scientists: among people who say they have a religion, the highest percentages of those who say that science has disagreed with their religious teachings are found in the United States and Southern Europe. But there are also many people who think that science and religion are really about different things – so maybe there is no need for disagreement?
Aside from that, there are also some problems that scientists all over the world are facing – at least everywhere where people own mobile phones with access to social media and the internet. “Sometimes it happens that the information that is uploaded by someone else, who is not an expert in that area, could be false. This is because on the internet any common person can upload anything and it is circulated within one click of a button. That is why I feel the trust in science is decreasing.” Shalini, a GYA member from India, works in food engineering. Not long ago, a question trended on Indian social media: does ordinary wheat flour contain plastic? Videos showing kitchen experiments undertaken by ordinary Indians would have you think so. The scientific explanation was that a high amount of gluten, a naturally occurring protein in wheat, was responsible for the elasticity of the wheat flour.
But despite this reasonable explanation people didn’t stop worrying about their flour. Flour companies had to implement all kinds of safety protocols and start media campaigns. The topic might be different from country to country, but the effects of false information on the internet are always harmful one way or another.
Fake news on social media, perceived conflicts between religion and science, anti-vaccination movements and people not knowing what science is about are just a few of the challenges scientists all over the world are facing. There are many more: For example, in some countries scientists aren’t allowed to work freely, in others, women are excluded from almost all scientific endeavors. And nearly everywhere young scientists have to work under an extreme amount of pressure.
So, what can we do about all this? How can we solve all these problems? First of all, there won’t be a one-size-fits-all solution, because the problems and circumstances differ from country to country. And second: You are not alone! Connect with your colleagues around the world. Find out what problems they have and be inspired by how they approach them.
In Nigeria, at least, Adewale is hopeful that trust in science will increase, because more and more people are getting access to a better education: “People now get to know what science is all about as a result of it being taught at school; people know their basic science. And they can use this to explain things happening in their society.” And in many Western countries, science fairs are a great success. They sometimes attract as many visitors as huge concerts by pop stars! And often the most popular presenters are young scientists – like GYA member Sophie, from France: “I like very much to interact with the public at small science fairs, where we show small chemical demonstrations, for example, because I can see the passion and the curiosity in the eyes of the public. This is also why I feel that the public has an inherent trust somehow in younger scientists. I have the feeling they don’t see us as establishment, and that there is this easy contact that we can make together to discuss the small experiment but also to discuss questions such as climate change.”
So young scientists from all over the world, connect with your society – it’s you they gonna trust.
Wellcome Global Monitor 2018
World Values Survey
- This is another international survey that gets updated regularly and that contains some questions relevant for the question of trust in science in different societies (check out items V119, V192, V193, V194, V195, V196, V197): http://www.worldvaluessurvey.org/WVSDocumentationWV6.jsp
Report on a study on views of pharmaceutical companies in Germany
Measles being reimported into Costa Rica
Some research on the perception of Science and Religion in Nigeria
- Bankole A. Falade and Martin W. Bauer, “ ‘I have faith in science and in God’: Common sense, cognitive polyphasia and attitudes to science in Nigeria,” Public Understanding of Science 27(1), 2018, 29-46.
Gluten in wheat flour
Role for young scientists
- In this brief video, for the 2020 digital GYA conference, Bruce Alberts, Chancellor’s Leadership Chair in Biochemistry and Biophysics for Science and Education at the University of California and former GYA Advisory Board member, speaks about the central role for young scientists to reach out to society https://www.youtube.com/watch?v=uWdPI_jx3tI&feature=youtu.be
Bonus item: Did you know that the “science of science communication” has become a field of its own, with conferences and journals? There, researchers explore all kinds of questions about the way in which science communication functions, including differences between different countries. You might want to browse journals such as, for example,
- Social Studies of Science – https://journals.sagepub.com/home/sss
- Science, Technology and Society – https://journals.sagepub.com/home/sts
- Public Understanding of Science – https://journals.sagepub.com/home/pus
- Journal of Science Communication – https://jcom.sissa.it
For example, they explore the differences between perceptions in cities and the countryside (e.g. Guenther, Lars, Peter Weingart & Corlia Meyer (2018): “Science is Everywhere, but No One Knows It”: Assessing the Cultural Distance to Science of Rural South African Publics, Environmental Communication, DOI: 10.1080/17524032.2018.1455724). And sometimes their results can be directly applied – for example, a study found that if you post selfies on social media, that can increase perceived trustworthiness. See here: https://phys.org/news/2019-05-scientists-selfie-garner.html. Here is a book (which you can download for free) about the research agenda in effective science communication: https://www.nap.edu/catalog/23674/communicating-science-effectively-a-research-agenda
Video: Conflict of Interest
Subtitles: Script 6 Ethics Tutorial – Conflict of Interest
Let’s start with some good news today: scientists are influential people. Believe it or not, there is a lot of evidence for this claim. For example, more and more scientists are advising politicians, stakeholders from industry and other powerful people. But the same people often fear what science might have to say about a given issue. That’s why they might try to manipulate scientists. They wouldn’t do that if what scientists have to say didn’t have any impact. Here’s an example – and this story has been investigated in a lot of detail. It was thanks to some influential scientists that the tobacco industry was able to cloud everybody’s judgement on the fact that smoking can cause lung cancer. Although one has to admit that those scientists were also influential because the tobacco industry made them influential.
When, in the early 1950s, more and more scientific evidence was published linking the increasing number of lung cancer cases to smoking, tobacco industry executives panicked. But with the help of public relations specialists, they soon found a solution to their problem: they would support the views of a minority of scientists who were skeptical of the causal relationship between smoking and cancer. By doing so they created a huge scientific controversy that wouldn’t otherwise have existed. Whenever the question of the health risks of smoking came up, they would point to this controversy and claim that there were so many uncertainties and that the science wasn’t clear yet.
With this strategy the tobacco industry bought itself valuable time. New generations of smokers became addicted, and regulatory interventions were delayed for decades. In the meantime, the tobacco industry earned billions, while many smokers died prematurely. And as if this weren’t tragic enough, what’s even worse is that this strategy was picked up by others. Oil companies pay for assessments from scientists stating that fracking isn’t harmful to the environment.
Conservative foundations fund scientists who claim that global warming is not caused by humans. And certain think tanks support philosophers and sociologists whose work is in line with their political agenda.
And what about the scientists? What is their part in this game of money, power and influence? Well, the truth is that not all of these scientists act in bad faith – some genuinely believe in what they are doing. And yet, the fact that they – rather than colleagues with different lines of research – get money distorts the playing field. The results of studies sponsored by interest groups have been shown to be systematically skewed in their favor, a phenomenon that is called “the funding effect”. And that – of course – undermines society’s trust in science.
So, how can scientists fix this? It might help to think about what “conflict of interest” means. Basically, it’s always about loyalty. And what does loyalty mean in the world of science? To whom or what is a scientist loyal?
Well, a scientist needs to be loyal to two things: to the search for truth, and to basic ethical principles. Anything that might affect a scientist’s loyalty to the search for truth and to basic ethical principles can cause a conflict of interest. If a scientist gets money from a corporation for an assessment and the results do not align with the corporation’s agenda, the scientist’s loyalty to the search for truth is in conflict with his loyalty to the company. But a conflict of interest is not always so easy to identify. It might just be a little bit of money given to a scientist with no visible strings attached. A scientist could, for example, be invited to a conference and a company pays for all the travel expenses, including a stay in a fancy hotel. But even something like this can affect a person’s integrity. After all, it’s natural to react with gratitude, and to feel obliged when receiving a gift. Even if the money from the company didn’t immediately affect the scientist’s judgments, accepting it can be risky.
If push comes to shove and there is a public controversy, the fact that the scientist has taken money from certain players might be used against her. And if the public finds out about something like this, it can lose trust in science.
So, as a scientist you should think twice before you accept contributions from any interest group. However, this is a very idealistic standard – in many countries there is not enough public funding of research projects available and scientists there might not have much of a choice. That can lead to really tough situations!
In any case this is not an issue that scientists should grapple with on their own. We need to have this debate in our institutions, in scientific associations, and also in the broader public. Where is the line between acceptable and unacceptable practices? And how can we protect science from being targeted by vested interests? Let’s start talking about it!
References and more links (without any claim to completeness!)
Accounts of the practices of the tobacco industry
- Brandt, Allan M. (2012). Inventing Conflicts of Interest: A History of Tobacco Industry Tactics. Am J Public Health 102(1), 63-71, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3490543/
- Oreskes, Naomi & Erik M. Conway. 2010. Merchants of Doubt. How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. London et al.: Bloomsbury
Early papers on the harmfulness of smoking
- Schrek, Robert et al. (1950), “Tobacco Smoking as an Etiologic Factor in Disease. I. Cancer”, Cancer Research 10(1), https://cancerres.aacrjournals.org/content/10/1/49.long
- Mills, Clarence A. and Marjorie Mills Porter (1950). Tobacco Smoking Habits and Cancer of the Mouth and Respiratory System. Cancer Research 10(9), https://cancerres.aacrjournals.org/content/10/9/539.long
- Wynder, E.L, and Evarts A. Graham (1950). Tobacco Smoking as a possible etiologic factor in bronchiogenic carcinoma. The Journal of the American Medical Association 143(4), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2623809/pdf/15744408.pdf
On the concept of conflict of interest
- Dennis F. Thompson (1993). Understanding financial conflicts of interest. The New England Journal of Medicine 329, 573-576 https://www.nejm.org/doi/full/10.1056/NEJM199308193290812.
Distortion by unequal funding even if all scientists act in good faith
- Holman, Bennett / Bruner, Justin (2017). Experimentation by Industry Selection. Philosophy of Science 84, 1008-1019.
More reports of industry-sponsored research in various fields
- Rosner, David & Gerald Markowitz (1987). Deceit and Denial: The Deadly Politics of Industrial Pollution. Berkeley: University of California Press.
- Proctor, Robert N. (1996). Cancer wars: how politics shapes what we know and don’t know about cancer.
- Michaels, David (2008). Doubt is their Product. How Industry’s Assault on Science Threatens Your Health. Oxford: Oxford University Press.
- Wurster, Charles (2015). DDT Wars: Rescuing Our National Bird, Preventing Cancer, and Creating the Environmental Defense Fund. New York: Oxford University Press.
- Otto, Shawn (2016). The War on Science. Who’s Waging It. Why It Matters. What We Can Do about It. Minneapolis: Milkweed.
- Rabin-Havt, Ari (2016). Lies, Incorporated. The World of Post-Truth Politics. New York: Anchor.
- Johnson, David V. (2017). Academe on the Auction Block. The Baffler 36, https://thebaffler.com/salvos/academe-on-the-auction-block-johnson
- O’Connor, Cailin and James Owen Weatherall (2019): The Misinformation Age. New Haven: Yale University Press.
- Michaels, David (2020). The Triumph of Doubt: Dark money and the science of deception. Oxford: Oxford University Press.
The funding effect (a selection of studies taken from Michaels, The Triumph of Doubt, p. 144):
- Stelfox, H. T., G. Chua, G. K. O’Rourke, and A. S. Detsky (1998). Conflict of interest in the debate over calcium-channel antagonists. New England Journal of Medicine 338: 101–06.
- Koepp R, Miles SH. (1999). Meta-analysis of tacrine for Alzheimer disease: The influence of industry sponsors. JAMA 281 (24): 2287–88.
- Mandelkern M. 1999. Manufacturer support and outcome. J Clin Psychiatry 60(2): 122–23
- Vandenbroucke JP, Helmerhorst FM, Rosendaal FR. (2000). Competing interests and controversy about third generation oral contraceptives: BMJ readers should know whose words they read. BMJ 320 (7231): 381–82.
- Knox KS, Adams JR, Djulbegovic B et al. (2000). Reporting and dissemination of industry versus non-profit sponsored economic analyses of six novel drugs used in oncology. Ann Oncol. 11(12): 1591–95.
- Yaphe J, Edman R, Knishkowy B et al. (2001). The association between funding by commercial interests and study outcome in randomized controlled drug trials. Fam Pract. 18 (6): 565–68.
- Bekelman, J. E., Y. Li, and C. P. Gross (2003). Scope and impact of financial conflicts of interest in biomedical research: A systematic review. Journal of the American Medical Association 289: 454–65
- Lexchin J, Bero LA, Djulbegovic B et al. (2003). Pharmaceutical industry sponsorship and research outcome and quality: Systematic review. BMJ 326 (7400): 1167–70.
- Montgomery JH, Byerly M, Carmody T et al. (2004). An analysis of the effect of funding source in randomized clinical trials of second generation antipsychotics for the treatment of schizophrenia. Control Clin Trials. 25 (6): 598–612.