Research Integrity Articles - TL;DR - Digital Science
https://www.digital-science.com/tldr/articles/topics/research-integrity/

TL;DR Shorts: Dr Danny Hillis on the Evolution of AI
https://www.digital-science.com/tldr/article/tldr-shorts-dr-danny-hillis-on-the-evolution-of-ai/ (Tue, 28 Jan 2025)

Welcome to week 17 of January 2025, the month that seems never to end – however, I have been reliably informed that this IS, in fact, the LAST week of the month. Since time appears to be standing still, we thought we’d reward you with something special! TL;DR Tuesdays are famed for our TL;DR Shorts, but Dr Danny Hillis, founder of Applied Invention, becomes only the second contributor in a year to be awarded an exclusive TL;DR Long – and our longest non-Speaker Series offering so far. To explain why he had so many thoughts, all I need to say is Artificial Intelligence. In this episode, Danny chats about the history of AI, from working with the field’s founding fathers to predictions that have come true, and what we can really expect from AI in the coming years.

Dr Danny Hillis talks about the history and future of artificial intelligence. Check out the video on the Digital Science YouTube channel: https://youtu.be/xH6-DUBKKEM

Although AI feels like a recent tech development, Danny reminds us that it has a long-established history. Danny worked alongside the likes of Marvin Minsky and Claude Shannon – no, they're not Bugsy Malone characters, but two of the team members who established the field of artificial intelligence. Working with them, Danny and the crew discovered that what they thought would be easy was much harder than expected, while what they were wary of was much easier to achieve. Pattern recognisers were developed with little effort, but creating a computer that could beat a human at chess was much harder.

It turned out that the main barriers to success were a lack of data and, the most limiting factor of all, a lack of computational power. But that’s OK because Danny’s PhD focused on what would be required to build the biggest computer. He discussed his Thinking Machines in our Speaker Series chat which we shared last month.

Danny notes that today's AI researchers are working on algorithms very close to the ones the team imagined back at the start of this area of research; however, he reminds us that we are still a long way from machines that can replace humans. While well-trained machines can carry out specific tasks well, they are missing the critical-thinking part of intelligence, however good they are becoming at mimicking it. This is evident in the numerous case studies of AIs that hallucinate: they create solutions that look and sound right, because the machine has recognised patterns and attempts to apply those rules, but without real meaning or understanding the results are factually incorrect. Danny tells the story of how his granddaughter can recognise patterns in visiting contractors and, within moments, sound like an expert – but scratch the surface and there is no real knowledge of the area with which to make logical decisions. I too am reminded of the time I accidentally found myself co-piloting an island-hopper propeller plane across Belize, having curiously followed the actions of the pilot for the first two stops – but we'll save that story for another time. The year is young, and we've got lots more to chat about, and many more stories to share.

Danny reflects that, while to experts it may not feel as though AI has moved on much since the arrival of supercomputing power, a change is coming, as evidenced by the ever-increasing rate of development in the area. The difference this time around is funding, which is attracting the smartest minds in their droves and catalysing this progress by exploring the intuitive aspects of this technology.

To make this technology truly good, Danny firmly believes that a source of truth is required. One of his interests is building a knowledge graph of the provenance of information, which he expanded on in last month's Speaker Series. This would go some way towards building technology that is as robust and trustworthy as possible, while helping to eliminate the biases and questionable knowledge that can otherwise bed into the foundations and create points of future weakness and instability.
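
As a purely illustrative sketch – not Danny's actual system, and with hypothetical class and field names – a provenance-aware knowledge graph can be modelled as claims linked back to the sources they were derived from, so that any statement can be traced to its evidence:

```python
from dataclasses import dataclass, field

@dataclass
class Source:
    """A place a claim came from, e.g. a paper or a dataset."""
    identifier: str          # e.g. a DOI or URL (placeholder values below)
    description: str = ""

@dataclass
class Claim:
    """A single statement in the graph, with links to its provenance."""
    statement: str
    derived_from: list = field(default_factory=list)  # Sources or other Claims

def trace_provenance(node, depth=0):
    """Walk back through derived_from links and print the evidence chain."""
    print("  " * depth + "- " + getattr(node, "statement", getattr(node, "identifier", "?")))
    for parent in getattr(node, "derived_from", []):
        trace_provenance(parent, depth + 1)

# Example: a derived claim that can be traced to two primary sources.
paper = Source("https://doi.org/10.0000/example-paper", "peer-reviewed study")
dataset = Source("https://doi.org/10.0000/example-data", "underlying dataset")
finding = Claim("Drug X reduces symptom Y", derived_from=[paper, dataset])
summary = Claim("Drug X is a candidate treatment for Y", derived_from=[finding])

trace_provenance(summary)
```

The point of such a structure is simply that every downstream statement carries a path back to its sources, which is what makes bias or questionable foundations visible rather than buried.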

The great thing about building good technology is that it in turn starts to iteratively learn and teach itself, generating more knowledge, even about that knowledge itself. These are exciting times for AI, but public and research community engagement remains vital to ensure that developments do not double down on historically discriminatory narratives or unscientific knowledge that have no place in today’s society.

Subscribe now to be notified of each weekly release of the latest TL;DR Short, and catch up with the entire series here.

If you’d like to suggest future contributors for our series or suggest some topics you’d like us to cover, drop Suze a message on one of our social media channels and use the hashtag #TLDRShorts.

Science Journalism and Social Justice – meet Deborah Blum
https://www.digital-science.com/tldr/article/meet-deborah-blum/ (Tue, 07 Jan 2025)

In an age of rapid scientific advancements and an overwhelming volume of information, good science journalism has never been more important. Deborah Blum, a Pulitzer Prize-winning journalist and director of the Knight Science Journalism program at MIT, is leading the charge on this mission. Through her work and the skills she builds in other science journalists, she bridges the gap between science and society, helping to improve understanding, combat misinformation, and rebuild public trust in the scientific process.

In our first Speaker Series chat of 2025, and in a month that heralds much political interest in the role of truth and trust in society, Deborah delves into the challenges and opportunities facing science communication today. She explores the importance of storytelling, the necessity of improving scientific literacy for everyone, and the steps needed to build a future where science journalism drives meaningful dialogue and action. Her insights offer a vision of how journalists and scientists can work together to showcase the human side of science and ensure it serves all communities fairly and effectively.

Deborah chats with Suze about science journalism and social justice. See the full interview here: https://youtu.be/iXry3WOwG08

Deborah Blum is an acclaimed science journalist, Pulitzer Prize-winning author, and Director of the Knight Science Journalism program at MIT. With a career spanning decades, she has worked tirelessly to bridge the gap between science and the public through her compelling storytelling and her commitment to advancing science literacy. Her influential books, which include The Poisoner’s Handbook and The Poison Squad, explore the intersection of science, history, and societal impact. At the Knight Science Journalism program, Deborah leads efforts to train and support journalists worldwide, fostering a global community dedicated to improving the quality of science communication and addressing pressing challenges like misinformation and declining public trust in science.

The Role of Science Journalism

Science journalism plays an important role in making connections between scientific discoveries and people’s everyday lives. Deborah describes the role that science journalists play in helping to translate complex scientific ideas into stories that resonate with readers. Good storytelling can make even the most abstract research feel relevant and engaging.

In a world increasingly driven by scientific and technological progress, this connection is more important than ever. Deborah highlights that science journalism not only informs but also inspires public interest and action. By showing how science impacts issues like health, climate change, and technology, journalists help communities see the relevance of research in shaping our future. As she puts it, “We need to write about science and its impacts, right? We need to acknowledge that it has these social and cultural impacts. We need to illuminate those in all of their social justice issues.”

The Impact of the KSJ Program

At the heart of Deborah's work is her leadership of the Knight Science Journalism (KSJ) program at MIT, where she has been Director for a decade – a role she will step down from in July 2025. The KSJ program is something of a global engine for excellence in science communication. The program provides resources, fellowships, and a thriving community for journalists to deepen their expertise and broaden their perspectives.

But the KSJ program goes beyond training – it builds a community of professionals who share a commitment to thoughtful, accurate reporting. Deborah believes this community approach is critical in a world where misinformation spreads rapidly. When journalists feel supported and connected, they are better equipped to tackle tough stories and elevate public understanding. This global impact is reflected in the program’s alumni, who are shaping conversations about science across continents.

Challenges in Science Communication

Communicating science effectively isn't without its hurdles. Deborah points out that misinformation is casting an ever-growing shadow, compounded by public scepticism and limited access to scientific education and critical thinking skills. We now live in an age where everyone has a platform from which to share their thoughts, but not everyone values the accuracy of those thoughts. Deborah emphasises the need for journalists to cut through the noise with credible, engaging stories.

Another challenge is the disconnect between scientists and the public. Deborah argues that many researchers struggle to communicate their work in accessible ways, leaving gaps that can be filled by misunderstanding or fear. “The more people can see scientists as actual human beings next door, the better off we all are,” she insists. Building bridges between these groups is crucial to fostering a more informed and engaged society.

The Importance of Science Literacy

Science literacy is the foundation of informed decision-making, yet many people lack the tools to critically evaluate scientific information – with many even fearful of, and actively disengaged from, science. Deborah shares her thoughts about the shortcomings of educational systems. She believes that by not teaching people how to think critically about science, we are doing a disservice to society. Without this foundation, the public is more vulnerable to the potentially negative consequences of pseudoscience and misinformation.

Deborah believes that improving science literacy requires a collective effort. Journalists, educators, and policymakers must work together to ensure that everyone has access to clear and reliable information. It is not only about teaching facts but also about teaching people how to think, and how to evaluate those facts for any inherent bias. “We don’t want everyone to be a scientist, but we do want everyone to know something about science and how to make decisions about science and to recognize that every time you fry an egg or make a cup of tea or peel a banana, you’re engaging with chemistry, right, in everyday science. And it makes the world that much more interesting.” By developing people’s ability to understand and question scientific claims, society can make better choices for the future, and hold bad actors accountable for bad science.

Building Trust in Science

Trust in science has been eroded in recent years, but Deborah sees this as an opportunity for change. Trust isn’t automatically granted – it is something that must be earned. Scientists and journalists both have roles to play in this process. Deborah encourages researchers to embrace transparency and share not just their successes but also their uncertainties and failures, to humanise their motivations and actions, and to showcase the more realistic side of the scientific process.

She also highlights the importance of equity in building trust. Science needs to serve all communities, not just a select few, so addressing social justice issues in science such as unequal access to education and healthcare can help rebuild trust in science’s potential to improve lives. Through honest and inclusive communication, science can regain its role as a trusted guide for society.

The Future of Science Journalism

Looking ahead, Deborah envisions a future where science journalism is more valuable than ever. She sees the next generation of journalists as not only storytellers but also advocates for social justice. She believes that the future of science journalism lies in showcasing the human side of science, and how it impacts people and communities.

Deborah encourages young journalists to be fearless in tackling big issues, from climate change to misinformation. We need journalists who are smarter, braver, and more curious. By embracing innovation and collaboration, science journalism can continue to be a powerful force for good, shaping public understanding and inspiring meaningful action in an increasingly complex world.

Deborah’s thoughts are a powerful reminder of the critical role that science journalism plays in shaping a better-informed, more engaged society. From her leadership of the Knight Science Journalism program to her advocacy for transparency, equity, and science literacy, Deborah tangibly demonstrates how storytelling can drive meaningful change. As we face global challenges like misinformation, climate change, and declining trust in science, her call for collaboration and innovation in both journalism and science communication is more relevant than ever. By fostering a new generation of journalists who are fearless, thoughtful, and socially conscious, Deborah is helping to build a future where science journalism not only informs but also empowers us to build a better world.

You can watch the full interview with Deborah on our YouTube channel, and check out our Speaker Series playlist on YouTube which includes chats with some of our previous speakers, as well as our TL;DR Shorts playlist with short, snappy insights from a range of experts on the topics that matter to the research community. We’ve even joined the podcast universe! Catch our 2025 Speaker Series season and our chat with Deborah on Spotify or Apple Podcasts.

With thanks to Huw James from Science Story Lab for filming and co-producing this interview. Filmed at the Knight Science Journalism offices at MIT in Cambridge, Massachusetts, USA in April 2024.

The 12 Days of DSmas
https://www.digital-science.com/tldr/article/12-days-of-dsmas-2024/ (Mon, 23 Dec 2024)

Every Muppets fan knows that Christmas is all about being revisited by people you’ve previously encountered. So from 25th December to 5th January we’ll be sharing our 12 Days of DSmas. Check back daily as we share a Speaker Series chat each and every day. Happy Holidays from the Digital Science Thought Leadership Team!

And if you just can’t wait, you can catch up on our entire 2024 Speaker Series season on-demand:

Merry Dr Chris Van Tulleken-mas! We chatted with Chris online about research integrity, impact, openness, and investigative research. Catch his interview here, and don’t forget to watch his Xmas Lectures on BBC for The Royal Institution this year!

As a Nobel laureate and former president of The Royal Society, Professor Venki Ramakrishnan has long played a role in shaping a more innovative, inclusive and impactful research culture, which we chatted about during his live Speaker Series lecture at the Ri. We went to Cambridge, UK to hear his thoughts on curiosity, competition and collaboration.

As Chief Publishing Officer at PLOS, Niamh provides business leadership for the entire PLOS portfolio to advance PLOS’s vision and mission. In this episode Niamh talks about the evolving landscape of scientific research and the push towards open science, including her journey from the early days of advocating for public access to research, to tackling current challenges like making science more inclusive and accessible.

Building communities is hard, but Alice Meadows has worked hard to make it look effortless. Here she is in Boston, MA, USA, telling us about the power of persistent identifiers.

It’s New Year’s Eve, and a time to reflect on the past and make plans for the months ahead. When we visited the Max Planck Institute in Berlin, Germany, we added to the echoes of amazing research conversations resonating around their iconic library when we chatted about the history, philosophy and future of research with Dr Maria Avxentevskaya and Dr Ben Johnson.

Happy New Year! We caught up with pro-skater Rodney Mullen at his home in Los Angeles, USA to hear his thoughts on why we need diverse minds to innovate in all walks – and ollies – of life. And, since it’s the new year and you’re probably feeling a little “sleep deprived”, you can also follow this up with his live Speaker Series lecture at the Ri.

If you’ve been eating as much cheese as this author, dearest gentle reader, you too will be experiencing a fascinatingly slippery grasp on reality – which brings us to Day 9’s featured speaker. “Is Maths Real?” was the question that Dr Eugenia Cheng posed in her live Speaker Series lecture at the Ri. I caught up with her ahead of her lecture in the iconic Faraday lecture theatre in London, UK to talk about why we need to break down barriers of knowledge in research, and reunite STEM and the humanities for impactful change.

2024 was a wild ride for global politics, and research is not immune to its changes. I caught up with Professor Jenny Reardon in Cambridge, UK, to learn more about how we can work with politics, and not against it, to provide solutions for everyone across the world, and where red tape remains to be overcome.

Our final Speaker Series guest of 2024 was Dr Danny Hillis. We visited the Applied Invention offices in Cambridge, MA, USA, where innovator, inventor, and Imagineer Danny shared his thoughts on how we can use novel technology to combat novel challenges in mis- and disinformation and make the most meaningful impact from data.

Catch up on our entire 2024 Speaker Series season on-demand and watch this space for our 2025 series featuring more impactful innovators from across the research landscape. Happy Holidays, and Happy New Year!

Presenting: Research Transformation: Change in the era of AI, open and impact
https://www.digital-science.com/tldr/article/presenting-research-transformation-change-in-ai-open-and-impact/ (Mon, 28 Oct 2024)
Mark Hahnel and Simon Porter introduce Digital Science's new report as part of our ongoing investigation into Research Transformation.

Report graphic: Research Transformation: Change in the era of AI, open and impact.

As part of our ongoing investigation into Research Transformation, we are delighted to present a new report, Research Transformation: Change in the era of AI, open and impact.

Within the report, we sought to understand from our academic research community how research transformation is experienced across different roles and responsibilities. The report, which draws on surveys and interviews across libraries, research offices, leadership and faculty, reflects transformations in the way we collaborate, assess, communicate, and conduct research.

The positions that we hold towards these areas are not the same as those we held a decade or even five years ago. Each of these perspectives represents a shift in the way that we perceive ourselves and the roles that we play in the community. Although there is concern about the impact that AI will have on our community, our ability to adapt and change is reflected strongly across all areas of research, including open access, metrics, collaboration and research security. That such a diverse community is able to continually adapt to change reflects well on our ability to respond to future challenges.

Key findings from the report:

  • Open research is transforming research, but barriers remain
  • Research metrics are evolving to emphasize holistic impact and inclusivity
  • AI’s transformative potential is huge, but bureaucracy and skill gaps threaten progress
  • Collaboration is booming, but concerns over funding and security are increasing
  • Security and risk management need a strategic and cultural overhaul

We do these kinds of surveys to understand where the research community is moving and how we can tweak and adapt our approach as a company. We were very grateful to the great minds who helped us out with a deep dive into what has affected their roles and will affect their roles going forward. Metrics, Open Research and AI are very aligned with the tools that we provide for academics, and the strategy we have to make research more inclusive, transparent and trustworthy.

The TL;DR on… ERROR
https://www.digital-science.com/tldr/article/tldr-error/ (Wed, 25 Sep 2024)

We love a good deep dive into the awkward challenges and innovative solutions transforming the world of academia and industry. In this article and in the full video interview, we’re discussing an interesting new initiative that’s been making waves in the research community: ERROR.

Inspired by bug bounty programs in the tech industry, ERROR offers financial rewards to those who identify and report errors in academic research. ERROR has the potential to revolutionize how we approach, among other things, research integrity and open research by incentivizing the thorough scrutiny of published research information and enhancing transparency.

I sat down with two other members of the TL;DR team, VP of Research Integrity Leslie McIntosh and VP of Open Research Mark Hahnel, to shed light on how ERROR can bolster trust and credibility in scientific findings, and explore how this initiative aligns with the principles of open research – and how all these things can drive a culture of collaboration and accountability. We also discussed the impact that ERROR could have on the research community and beyond.

ERROR is a brand new initiative created to tackle errors in research publications through incentivized checking. The TL;DR team sat down for a chat about what this means for the research community through the lenses of research integrity and open research. Watch the whole conversation on our YouTube channel: https://youtu.be/du6pEulN85o

Leslie’s perspective on ERROR

Leslie’s initial thoughts about ERROR were cautious, recognizing its potential to strengthen research integrity but also raising concerns about unintended consequences.

She noted that errors are an inherent part of the scientific process, and over-standardization might risk losing the exploratory nature of discovery. Drawing parallels to the food industry’s pursuit of efficiency leading to uniformity and loss of nutrients, Leslie suggested that aiming for perfection in science could overlook the value of learning from mistakes. She warned that emphasizing error correction too rigidly might diminish the broader mission of science – discovery and understanding.

Leslie: “Errors are part of science and part of the discovery… are we going so deep into science and saying that everything has to be perfect, that we’re losing the greater meaning of what it is to search for truth or discovery [or] understand that there’s learning in the errors that we have?”

Leslie also linked this discussion to open research. While open science encourages interpretation and influence from diverse participants, the public’s misunderstanding of scientific errors could weaponize these mistakes, undermining trust in research. She stressed that errors are an integral, even exciting, part of the scientific method and should be embraced rather than hidden.

Mark’s perspective on ERROR

Mark’s initial thoughts were more optimistic, especially within the context of open research.

Mark: “…one of the benefits of open research is we can move further faster and remove any barriers to building on top of the research that’s gone beforehand. And the most important thing you need is trust, [which] is more important than speed of publication, or how open it is, [or] the cost-effectiveness of the dissemination of that research.”

Mark also shared his excitement about innovation in the way we do research. He was particularly excited about ERROR’s approach to addressing the problem of peer review, as the initiative offers a new way of tackling longstanding issues in academia by bringing in more participants to scrutinize research.

He thought the introduction of financial incentives to encourage error reporting could lead to a more reliable research landscape.

“I think the payment for the work is the most interesting part for me, because when we look at academia and perverse incentives in general, I’m excited that academics who are often not paid for their work are being paid for their work in academic publishing.”

However, Mark’s optimism was not entirely without wariness. He shared Leslie’s caution about the incentives, warning of potential unintended outcomes. Financial rewards might encourage individuals to prioritize finding errors for profit rather than for the advancement of science, raising ethical concerns.

Ethical concerns with incentivization

Leslie expressed reservations about the terminology of “bounty hunters”, which she felt criminalizes those who make honest mistakes in science. She emphasized that errors are often unintentional.

Leslie: “It just makes me cringe… People who make honest errors are not criminals. That is part of science. So I really think that ethically when we are using a term like bounty hunters, it connotes a feeling of criminalization. And I think there are some ethical concerns there with doing that.”

Leslie’s ethical concerns extended to the global research ecosystem, noting that ERROR could disproportionately benefit well-funded researchers from the Global North, leaving under-resourced researchers at a disadvantage. She urged for more inclusive oversight and diversity in the initiative’s leadership to prevent inequities.

She also agreed with Mark about the importance of rewarding researchers for their contributions. Many researchers do unpaid labor in academia, and compensating them for their efforts could be a significant positive change.

Challenges of integrating ERROR with open research

ERROR is a promising initiative, but I wanted to hear about the challenges in integrating a system like this alongside existing open research practices, especially when open research itself is such a broad, global and culturally diverse endeavor.

Both Leslie and Mark emphasized the importance of ensuring that the system includes various research approaches from around the world.

Mark: “I for one think all peer review should be paid and that’s something that is relatively controversial in the conversations I have. What does it mean for financial incentivization in countries where the economics is so disparate?”

Mark extended this concept of inclusion to the application of artificial intelligence (AI), machine learning (ML) and large language models (LLMs) in research, noting that training these technologies requires access to diverse and accurate data. He warned that if certain research communities are excluded, their knowledge may not be reflected in the datasets used to build future AI research tools.

“What about the people who do not have access to this and therefore their content doesn’t get included in the large language models, and doesn’t go on to form new knowledge?”

He also expressed excitement about the potential for ERROR to enhance research integrity in AI and ML development. He highlighted the need for robust and diverse data, emphasizing that machines need both accurate and erroneous data to learn effectively. This approach could ultimately improve the quality of research content, making it more trustworthy for both human and machine use.

Improving research tools and integrity

Given the challenges within research and the current limitations of tools like ERROR, I asked Leslie what she would like to see in the development of these and other research tools, especially within the area of research integrity. She took the opportunity to reflect on the joy of errors and failure in science.

Leslie: “If you go back to Alexander Fleming’s paper on penicillin and read that, it is a story. It is a story of the errors that he had… And those errors were part of or are part of that seminal paper. It’s incredible, so why not celebrate the errors and put those as part of the paper, talk about [how] ‘we tried this, and you know what, the refrigerator went out during this time, and what we learned from the refrigerator going out is that the bug still grew’, or whatever it was.

“You need those errors in order to learn from the errors, meaning you need those captured, so that you can learn what is and what is not contributing to that overall goal and why it isn’t. So we actually need more of the information of how things went wrong.”

I also asked Mark what improvements he would like to see from tools like ERROR from the open research perspective. He emphasized the need for better metadata in research publishing, especially in the context of open data. Drawing parallels to the open-source software world, where detailed documentation helps others build on existing work, he suggested that improving how researchers describe their data could enhance collaboration.
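
As a hedged illustration of what richer data description can look like in practice – a generic sketch, not something proposed in the interview – a shared dataset can carry a small machine-readable descriptor alongside its files. The example below builds a minimal record loosely modelled on schema.org's Dataset vocabulary; all values are invented placeholders:

```python
import json

# A minimal, machine-readable description of a shared dataset, loosely
# following schema.org's Dataset vocabulary. Values are placeholders.
dataset_metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example survey responses, 2024",
    "description": "Anonymised survey responses collected for an example study.",
    "creator": [{"@type": "Person", "name": "A. Researcher"}],
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "keywords": ["survey", "open data", "example"],
    "variableMeasured": ["response_time_ms", "agreement_score"],
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",
        "contentUrl": "https://example.org/data/survey-2024.csv",
    },
}

# Shipping a record like this alongside the data files lets both humans
# and machines discover what the dataset contains and how it may be reused.
print(json.dumps(dataset_metadata, indent=2))
```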

Mark also feels that the development of a tool like ERROR highlights other challenges in the way we are currently publishing research, such as deeper issues with peer review, or incentives for scholarly publishing.

Mark: “…the incentive structure of only publishing novel research in certain journals builds into that idea that you’re not going to publish your null data, because it’s not novel and the incentive structure isn’t there. So as I said, could talk for hours about why I’m excited about it, but I think the ERROR review team have a lot of things to unpack.”

Future of research integrity and open research

What do Leslie and Mark want the research community to take away from this discussion on error reporting and its impact on research integrity and open research?

Leslie wants to shine a light on science communication and its role in helping the public to understand what ERROR represents, and how it fits into the scientific ecosystem.

Leslie: “…one of the ways in which science is being weaponized is to say peer review is dead. You start breaking apart one of the scaffolds of trust that we have within science… So I think that the science communicators here are very important in the narrative of what this is, what it isn’t, and what science is.”

Both Leslie and Mark agreed that while ERROR presents exciting possibilities, scaling the initiative remains a challenge. Mark raised questions about how ERROR could expand beyond its current scope, with only 250 papers reviewed over four years and each successful error detection earning a financial reward. Considering the millions of papers published annually, it is unclear how ERROR can be scaled globally and become a sustainable solution.

Mark: “…my biggest concern about this is, how does it scale? A thousand francs a pop, it’s 250 papers. There [were] two million papers [published] last year. Who’s going to pay for that? How do you make this global? How do you make this all-encompassing?”
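
To put rough numbers on that scaling concern, here is a back-of-the-envelope sketch. The only inputs are the figures in Mark's quote above (roughly 1,000 Swiss francs per paper, 250 papers in the pilot, about two million papers published last year); the rest is simple arithmetic, not a statement about ERROR's actual budget:

```python
# Back-of-the-envelope scaling estimate for an ERROR-style reward scheme.
# Figures come from the quote above; this is illustrative arithmetic only.
reward_per_paper_chf = 1_000        # "a thousand francs a pop"
pilot_papers = 250                  # papers covered by the pilot
papers_per_year = 2_000_000         # approximate global annual output

pilot_cost = reward_per_paper_chf * pilot_papers
global_cost = reward_per_paper_chf * papers_per_year

print(f"Pilot cost:  {pilot_cost:,} CHF")          # 250,000 CHF
print(f"Global cost: {global_cost:,} CHF per year") # 2,000,000,000 CHF per year
print(f"Scale-up factor: {papers_per_year // pilot_papers:,}x")  # 8,000x
```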

Conclusion

It is clear from our discussion that ERROR represents a significant step forward in experimenting with ways to enhance both research integrity and open research through an incentivised bug-hunting system.

Leslie has highlighted how the initiative can act as a robust safeguard, ensuring that research findings are more thoroughly vetted and reliable, but she reminds us that we need to be inclusive in this approach. Mark has emphasized the potential for a tool like this to make publication processes more efficient – and even, finally, to reward researchers for all the additional work they do – but he wonders how it can scale up to foster a more transparent and collaborative research environment that aligns with the ethos of open research.

Leslie and Mark’s comments are certainly timely, given that the theme of Digital Science’s 2024 Catalyst Grant program is innovation for research integrity. You can find out more about how different segments of research can and should be contributing to this space by reading our TL;DR article on it here.

We look forward to exploring more innovations and initiatives that are going to shape – or shatter – the future of academia, so if you’d like to suggest a topic we should be discussing, please let us know.

Innovation and integrity across all research segments to safeguard the future of research
https://www.digital-science.com/tldr/article/innovation-and-integrity-to-safeguard-the-future-of-research/ (Wed, 18 Sep 2024)

Maintaining integrity and security is paramount in the ever-evolving landscape of research. While science, technology and medicine (STM) publishers have made significant strides in this area, the importance of innovative solutions in this space extends beyond publishing.

It is within this environment that Digital Science is now seeking technology-driven ideas to safeguard research integrity and support trust in science. This is the focus of our Catalyst Grant round for 2024.

Digital Science is also one of the prime organizations behind the push for a new field of research integrity forensics, known as Forensic Scientometrics (FoSci).

Ensuring research integrity is a collective responsibility that benefits all segments of the research ecosystem, from individual researchers to governments to industrial organizations. Here are the key stakeholders and the opportunities for each to bolster research integrity and security.

Researchers

Innovative frameworks and tools help researchers maintain high ethical standards by preventing misconduct such as fabrication, falsification, and plagiarism. However, much more innovation is needed to uphold integrity in the research ecosystem, including solutions for academic institutions, governments, funders, and more. These innovations ensure the authenticity and reliability of research findings. Additionally, advanced training programs on research ethics and security protocols can equip researchers with the necessary knowledge and skills to conduct their work responsibly, fostering a culture of integrity from the outset.

Academic Institutions

Strengthened policies and procedures are essential for ensuring compliance with ethical standards and security protocols, significantly reducing the risk of breaches and misconduct, safeguarding the institution’s integrity, and supporting research and researchers. A strong commitment to high standards of integrity and security enhances an institution’s reputation, attracting top talent and funding, and solidifying its standing in the academic community.

Investing in infrastructure and resources to support research integrity and security, such as secure data storage systems and comprehensive training programs, is crucial for fostering a culture of integrity. Additionally, improved frameworks for safe and ethical collaboration with external partners facilitate partnerships with other academic institutions and industry, ensuring that these collaborative efforts adhere to high standards of integrity and security.

Publishing

Adopting advanced technologies to enhance the peer review process is crucial for ensuring the integrity and quality of published research. These technologies help to maintain rigorous review standards, upholding the credibility of scientific literature. Additionally, sophisticated detection tools are vital for preventing the publication of unoriginal or unethical work, and safeguarding the originality and integrity of research publications.

Improved mechanisms and processes for transparently handling retractions and corrections are necessary to maintain the credibility of scientific literature, grants and patents, ensuring that errors are addressed promptly and openly. Ensuring the security of submitted manuscripts and associated data is also a top priority, as well as protecting intellectual property and sensitive information, maintaining the trust of authors and readers alike.

Governments and Funders

Creating and enforcing robust policies and regulations is essential for promoting research integrity and security, ensuring public trust in funded research, and providing a clear framework for ethical conduct. Prioritising funding for projects and institutions that demonstrate firm commitments to integrity and security ensures that resources are allocated to trustworthy and responsible research endeavors.

Enhanced mechanisms for monitoring and auditing funded research are crucial for ensuring accountability and transparency in public funds and building public confidence in the research process. Furthermore, establishing international standards and agreements promotes global research integrity and security, facilitating cross-border collaborations and driving scientific progress on a global scale.

Corporate Industrial Research Organizations

Advanced methods for protecting intellectual property and proprietary data are crucial for maintaining a competitive advantage and ensuring compliance with legal requirements, safeguarding valuable research assets. Secure and ethical frameworks for collaborating with academic researchers ensure mutual benefits and adherence to integrity standards, driving innovation while upholding high ethical standards.

Balancing innovation with compliance is essential to ensure that cutting-edge research aligns with ethical and security standards, fostering a responsible and forward-thinking research environment. Developing comprehensive risk management strategies is also vital for mitigating potential breaches in research integrity and security, protecting the organization’s reputation and research investment.

Impact on the Research Community as a Whole

Trust and Credibility

Improved trust and credibility of research outputs benefit the entire research ecosystem. Enhanced public confidence in scientific findings drives support for further research and innovation.

Efficiency and Productivity

Streamlined processes and tools for ensuring research integrity and security lead to more efficient and productive research environments. This efficiency accelerates scientific discovery and application, the impact of which can be felt by everyone.

Global Collaboration

Harmonized standards and practices facilitate international research collaborations. These collaborations drive global scientific progress, and address pressing challenges that transcend borders.

Innovative solutions in research integrity are crucial for all segments of the research ecosystem. While STM publishers have pioneered efforts in this domain, the broader research community must continue to build on these foundations. By fostering a culture of integrity, we can ensure that research remains a trusted and vital force for advancing knowledge and improving lives worldwide.


TL;DR Shorts: Mariette DiChristina on Trust and Civic Science
https://www.digital-science.com/tldr/article/tldr-shorts-mariette-dichristina-on-trust-and-civic-science/ (Tue, 16 Jul 2024)

This week’s TL;DR Shorts episode features Mariette DiChristina, Dean of the College of Communication at Boston University, and former Editor-in-Chief of Scientific American, talking about the role of civic science in building trust in research.

Mariette reminds us of the role of communications in trust in research, something she touched on in a previous TL;DR Shorts episode. Mariette extends the idea from dissemination into true engagement – a two-way listening and learning process where communities impacted by science can themselves be involved in the creation and development of novel research.

Mariette DiChristina, Dean of the College of Communication at Boston University, discusses the role that civic science can play in helping build trust in research. See the full video on Digital Science’s YouTube channel: https://youtu.be/eEuYk-xgoiE

Also known as public engagement, or patient and public involvement (PPI), civic science takes an interdisciplinary approach that integrates scientific research with community engagement and even participation, in order to overcome societal challenges more appropriately and impactfully. It democratises knowledge production by involving impacted communities in the scientific process, which ensures that research best reflects the public's actual needs and values. Civic science emphasises collaboration between scientists, policymakers, and the public to co-create solutions that are scientifically sound and socially relevant. This goes some way to enhancing the transparency, accountability, and impact of scientific research, while also scaffolding a more informed and engaged society.

If you’d like to suggest future contributors for our series or suggest some topics you’d like us to cover, drop Suze a message on one of our social media channels and use the hashtag #TLDRShorts. Subscribe now to be notified of each weekly release of the latest TL;DR Short, and catch up with the entire series here.

TL;DR Shorts: Dr Amy Brand on Trust in Research
https://www.digital-science.com/tldr/article/tldr-shorts-dr-amy-brand-on-trust-in-research/ (Tue, 21 May 2024)

Happy TL;DR Tuesday! This week’s TL;DR Shorts contributor is Dr Amy Brand, Director and Publisher at MIT Press. In this episode, Amy talks about the importance of knowing whether published research can be trusted, especially as we use this knowledge to build novel research technologies such as AI and knowledge graphs.

Amy believes that AI and other novel technologies can help us build and promote trust in science – something that is needed now more than ever before. Recent years have seen a backlash against experts, leading to a social and cultural shift in engagement with research, especially science. We also live in a culture riddled with inadvertent misinformation and the persistent poison of intentional campaigns of disinformation. However, as we continue to face ever-looming global challenges, we must utilise the most trustworthy knowledge to solve them.

Dr Amy Brand talks about how novel technologies like AI and knowledge graphs can help us better understand the integrity of published research information – watch this and other videos on Digital Science’s YouTube channel: https://youtu.be/2k2YxJXAMOU

Unfortunately, while a lot of new technology can help us determine the credibility of research information, there are many cases in which the application of novel tech such as AI has given us cause for doubt: from hallucinations in generative AI used in research and discovery, through to the limitations of an AI-driven peer-review process that doubles down on existing biases hidden within the training data used to build these programs.

Amy believes that with complete transparency over the provenance of research information, the architects of new technology and their users can better understand what underlies the information used to create it, and so build better, more appropriate tools that can be trusted – which can, in turn, help us develop new breakthroughs and discoveries in a trustworthy manner.

Subscribe now to be notified of each weekly release of the latest TL;DR Short, and watch the entire series here.

If you’d like to suggest future contributors for our series or suggest some topics you’d like us to cover, drop Suze a message on one of our social media channels and use the hashtag #TLDRShorts.

TL;DR Shorts: Chris DiBona on integrity and openness in research
https://www.digital-science.com/tldr/article/tldr-shorts-chris-dibona-on-integrity-and-openness-in-research/ (Tue, 14 May 2024)

Today’s TL;DR Short is brought to you by Chris DiBona, a world-renowned leader in Open Source Research, and covers many of the themes we are currently discussing across the research community, including open research, trust and integrity, and revamping research culture to make it more fit-for-purpose.

Chris has worked in Open Source Research for many years, including almost two decades at Google as Director of Open Source, during which he led the way on licensing, created many community engagement programmes, and co-organised Science Foo Camp (Sci Foo).

Chris DiBona talks about finding balance between a more open research culture and the integrity of published research information – check out this and other videos on Digital Science’s YouTube channel: https://youtu.be/JGzqcEunu_0

Chris believes that we are in what he calls a “very dangerous time” with the replication crisis challenging research and translation into a tangible impact on society. Chris goes on to say that there is a very legitimate push towards more open ways of doing research, some of which we heard about in our chat with Dr Niamh O’Connor, Chief Publishing Officer at PLOS, in last week’s Speaker Series interview.

However, Chris reminds us that by opening up research, we also make it more vulnerable to deliberate misuse of the system to propagate disinformation, and to the potential for research fraud. Chris ends today's TL;DR Shorts by touching on the research incentivisation system, which he feels places too little value on doing honest research – again a topic that Niamh touched on in her interview, and one which our very own Dr Leslie McIntosh is tackling as a pioneer in the new field of Forensic Scientometrics.

Subscribe now to be notified of each weekly release of the latest TL;DR Short, and watch the entire series here.

If you’d like to suggest future contributors for our series or suggest some topics you’d like us to cover, drop Suze a message on one of our social media channels and use the hashtag #TLDRShorts.

FoSci – The emerging field of Forensic Scientometrics
https://www.digital-science.com/tldr/article/forensic-scientometrics/ (Wed, 08 May 2024)

Major news stories have recently covered journal editors being bribed, university presidents resigning over questionable research integrity standards, and undisclosed conflicts of interest among faculty and researchers. Each day, it seems, a new story about lapses in research integrity is published, sometimes bringing to light the individuals and groups discovering the misconduct.

This work of inspecting and upholding the integrity of scientific research has long been conducted in the background of science and scholarly communication, carried out by passionate individuals driven by ethics and principles to ensure the veracity of the scientific method or record and the downstream impacts on policy, practice, and theory. 

Groups and individuals, such as Retraction Watch, have monitored, collected, classified, and written about retractions for over ten years. Many of those doing the verifying have their own detection specialisms – from nefarious networks to image manipulation to tortured phrases. Additionally, many organizations have developed specific offices and infrastructure to support research integrity – such as the US Office of Research Integrity within the Department of Health and Human Services, institutionally based research integrity and security offices, and the newly formed research integrity offices located in major publishing organizations.
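
To make one of those specialisms concrete: "tortured phrases" are improbable paraphrases (for example, "counterfeit consciousness" in place of "artificial intelligence") that can signal automated rewriting of text. A minimal screening pass might simply match a manuscript against a curated list of known tortured phrases; the tiny dictionary below is illustrative only, not any group's actual screening list:

```python
import re

# A tiny, illustrative dictionary of tortured phrases and the standard
# terms they tend to stand in for. Real screening lists are much larger.
TORTURED_PHRASES = {
    "counterfeit consciousness": "artificial intelligence",
    "irregular timberland": "random forest",
    "profound learning": "deep learning",
}

def flag_tortured_phrases(text: str):
    """Return (phrase, expected term) pairs found in the text."""
    hits = []
    lowered = text.lower()
    for phrase, expected in TORTURED_PHRASES.items():
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            hits.append((phrase, expected))
    return hits

sample = "We train a counterfeit consciousness model with an irregular timberland."
for phrase, expected in flag_tortured_phrases(sample):
    print(f"Suspicious phrase '{phrase}' (usually '{expected}')")
```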

Despite the growth of investigative research and methods on scientific misconduct, the discipline itself lacks a common definition and description of its field.  So, what do you call the collective work of analyzing publications,  data, images, and statistics to uncover errors, misinformation, or manipulations in scientific publications? We propose calling this emerging field forensic scientometrics – FoSci for short.


"By embracing FoSci as a specialized and necessary field, we can galvanize interest, foster the development of a community of practice, and signal the importance of this crucial work."

– Dr Leslie McIntosh | VP Research Integrity | Digital Science

Why use forensics? Forensics refers to applying scientific knowledge and methods to matters of law, particularly for investigating crimes and providing evidence in legal proceedings.

First, there is an investigative nature to the work we do, even if the results do not end up in a court of law. While fraud investigations and syndicate networks encompass legal realms, the intentional manipulation of anything scientific – whether of the process of scientific discovery or of its outputs – does not have a special place there. That doesn't mean it shouldn't.

Second, scientific publications are being used in courts of law as evidence, but the papers themselves and their veracity do not get scrutinized by expert scientometricians. A common belief is that a peer-reviewed academic paper indicates that the research has passed the scientific method's stress test. Yet peer reviewers vet (or should vet) the scholarly question within the paper, not the weight of its evidence in a societal context.

Scientometrics involves the quantitative analysis of scientific publications and research outputs in this larger context. It encompasses the measurement and evaluation related to scientific activities, such as the impact of research, patterns of collaboration among researchers, citation analysis, and the productivity assessment and influence of individuals, institutions, or scientific journals. Scientometrics employs statistical and mathematical methods to derive meaningful insights into the structure and dynamics of scientific knowledge, contributing to our understanding of the scientific community’s development and impact over time.
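
For a concrete flavour of the kind of indicator scientometrics works with, the sketch below computes an h-index – a researcher has index h if h of their papers have each been cited at least h times – from a list of citation counts. It is a generic textbook example, not a metric this article endorses:

```python
def h_index(citation_counts):
    """h-index: the largest h such that h papers have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 3,
# because three papers have at least three citations each.
print(h_index([10, 5, 3, 2, 0]))  # -> 3
```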

As we navigate the evolving landscape of scientific inquiry, the emergence of forensic scientometrics as a distinct field reflects a collective commitment to upholding research integrity. From the pioneers who have tirelessly exposed misconduct to the institutional changes now taking place, the journey towards a recognized field is well underway. In this era of increased scrutiny, defining and embracing forensic scientometrics – FoSci – becomes essential to strengthening trust in and around science.

Bio

Leslie D. McIntosh, PhD is the VP of Research Integrity at Digital Science and dedicates her work to improving research, reducing disinformation, and increasing trust in science.

As an academic turned entrepreneur, she founded Ripeta in 2017 to improve research quality and integrity. Now part of Digital Science, the Ripeta algorithms lead in detecting Trust Markers of research manuscripts. She works with governments, publishers, institutions, and companies around the globe to improve research and scientific decision-making. She has given hundreds of talks, including to the US-NIH, NASA, and World Congress on Research Integrity, and consulted with the US, Canadian, and European governments. Dr. McIntosh’s work was the most-read RetractionWatch post of 2022. In 2023, her influential ideas on achieving equity in research were highlighted in the Guardian and Science.

Publications and Preprints

McIntosh, L.D. and Hudson Vitale, C. (2024). Forensic Scientometrics — An emerging discipline to protect the scholarly record. arXiv. https://doi.org/10.48550/arXiv.2311.11344

Porter, S. and McIntosh, L.D. (2024). Identifying Fabricated Networks within Authorship-for-Sale Enterprises. arXiv. https://doi.org/10.48550/arXiv.2401.04022

McIntosh, L.D. and Hudson Vitale, C. (2023). Safeguarding scientific integrity: A case study in examining manipulation in the peer review process. Accountability in Research, pp. 1–19. https://doi.org/10.1080/08989621.2023.2292043

Blogs

McIntosh, L.D. (2024). FoSci – The Emerging Field of Forensic Scientometrics. The Scholarly Kitchen.

McIntosh, L.D. (2024). Science Misused in the Law.
