Welcome to the Library and Knowledge Services' digital services pages.

Books on artificial intelligence in medicine and healthcare

Machine learning and deep learning techniques for medical science by K. G. Gayathri Devi, Kishore Balasubramanian and Le Anh Ngoc [LINK TO BOOK]

Artificial intelligence in medicine by Niklas Lidströmer and Hutan Ashrafian [LINK TO BOOK]

AI in healthcare : how artificial intelligence is changing IT operations and infrastructure services by Robert Shimonski  [LINK TO BOOK]

Practical AI for healthcare professionals : machine learning with Numpy, Scikit-learn, and TensorFlow by Abhinav Suri [LINK TO BOOK]

Artificial intelligence in surgery : an AI primer for surgical practice by Daniel Hashimoto [LINK TO BOOK]

Deep medicine: how artificial intelligence can make healthcare human again by Eric J. Topol [LINK TO BOOK]

Live longer with AI : how artificial intelligence is helping us extend our healthspan and live better too by Tina Woods and Melissa Ream [LINK TO BOOK]

Artificial intelligence : a guide for thinking humans by Melanie Mitchell  - copies available in the library [LINK TO BOOK]

Predicting heart failure : invasive, non-invasive, machine learning, and artificial intelligence based methods by Kishor Kumar Sadasivuni [LINK TO BOOK]

 

Article: Robotics and robotic surgery: how to access better patient care by CHG Meridian UK, November 2020

Podcast: Is automation the future of stem cells? 27 November 2023, National Health Executive

 

AI Policies

Artificial Intelligence (AI) Governance Policy, October 2023, Humber and North Yorkshire [LINK]

Principles for trustworthy AI [LINK]

AI and Digital Regulations Service for health and social care [LINK]

The NHS AI Lab [LINK]

 

The best AI productivity tools in 2024, Zapier [LINK]

Top 30 Artificial Intelligence (AI) Tools List [LINK]

 


Accessibility - NHS Digital [LINK]

Digital accessibility standards [LINK]

Training eLearning [LINK]

ReachDeck [LINK]

The ReachDeck toolbar: Supports your web visitors to engage with your online content in a way that suits their needs. Features include text-to-speech, reading and translation support, helping you to create inclusive experiences for everyone.

The ReachDeck auditor: Makes WCAG and ADA compliance easier. The website checker identifies errors at WCAG Level A, AA and AAA quickly and at scale, and provides a readability overview of your content.

The ReachDeck editor: Improves the quality and accessibility of written content. Grammar, spelling and readability errors are quickly identified, helping everyone in your organisation to edit content in line with best practice.

ReachDeck Toolbar features include:

  • Text-to-Speech reads on-screen text out loud with read-along highlighting
  • Translation allows words to be translated into multiple languages
  • Picture Dictionary displays word meaning through illustration
  • MP3 maker converts online content into MP3 files for easy listening
  • Screen Mask with reading pane reduces visual stress and improves focus
  • Text Magnifier magnifies text and reads it out loud, increasing the accessibility of even the smallest web text
  • Webpage Simplifier creates a simplified view of a webpage and removes distracting content

PLEASE NOTE THE TRUST DOES NOT PERMIT THE USE OF AI TOOLS ON THE TRUST NETWORK. Many of these tools are blocked on the network. For more information, please contact the Information Governance team or the ICT Help Desk.

LIST OF AI TOOLS (without the live links)

Learning zone

Resources and tools to enable knowledge and library services staff to understand and use AI.

Imaging and video analytics

 

Predictive analytics

 

 

Robotic process automation

 

Bias and other risks

Academics, clinicians, and politicians have raised concerns about bias in AI tools. There is a risk of propagating and automating existing biases.

AI tools risk exacerbating bias against people at increased risk of harm: people who have been historically marginalised or subject to discrimination. Bias could cause harm to service users, for example by underdiagnosing, or indeed overdiagnosing, vulnerable people.

Some GPT detectors, tools which check academic work for generative AI output, have been found to be ineffective. This can result in false positives: accusing students or academics of using generative AI when they have not. One study indicated that some GPT detectors are biased against non-native English writers, putting them at increased risk of being unfairly accused of using generative AI to cheat.

Mitigating and reducing bias could involve questioning companies that offer AI products: ensuring that their development teams are adequately diverse and well treated, that they are training tools with diverse data, and that their products are thoroughly tested for quality.

Inadequate data collected by hospitals may not be compatible with some AI tools, leading to poor or inaccurate results. Good data integrity is vital for the safe use of AI tools.

Other risks have also been identified:

  • unintentional plagiarism and copyright infringement
  • environmental harm
  • data privacy
  • security concerns
  • incorrect, misleading or inaccurate information
  • citation and reference inaccuracy
  • transparency and trustworthiness issues
  • legal issues

Large Language Models (LLMs) and healthcare

Well-known and accessible products like ChatGPT, Bard, and Bing Chat are built on Large Language Models (LLMs), and there are numerous other LLMs too. Language models excel at language-based tasks, such as assisting with the creation of simple Boolean search strategies, drafting strategy documents and targeted marketing copy, and synthesising or summarising information.

While this list isn’t exhaustive, healthcare professionals may use LLMs for:

  • information retrieval
  • self-learning and further education
  • patient conversation simulations
  • patient information writing and health literacy support
  • article draft and evidence summary generation
  • research question generation
  • marketing
  • patient support, engagement and consultation
  • clinical decision support and point of care assistance
  • assisting with diagnosis
  • generation of clinical reports
  • explaining drug-drug interactions
  • policy drafting
  • streamlining repetitive administration tasks
  • drug lexicon/slang/synonym generation
  • guideline questions support

Hallucination and error

Hallucination is the presentation of information, usually by LLMs, which appears plausible but is erroneous (in other words, tools can make things up). LLM tools are generally programmed to respond to user input regardless of the prompt given, and sometimes this leads to generated responses which are incorrect.

Some LLMs cannot search the internet and may not be able to accurately provide answers to questions. Differentiating between tools which can provide answers to questions, and tools which cannot, can be important. Selecting the right tool for the right job is vital to ensuring proper and effective use of AI tools.

While hallucination is currently being investigated and mitigated by tech companies, developers are encouraging users to be vigilant and double-check responses.

Having a good understanding of LLM tools, and how to use them effectively, can cut down on hallucination. Asking LLM tools highly specific questions can still lead to erroneous responses, even with tools that have internet search capabilities.

Practical examples from Knowledge and Library Services

From making data more discoverable to rolling out LibKey Nomad to using NLP products in search, KLS professionals are already using AI tools to enhance their services.

Some Knowledge Specialists are also using Large Language Models to assist with building search strategies.

Others are training and advising healthcare professionals and other KLS colleagues about how to use generative AI safely and effectively.

Certain tools, like Claude and Perplexity, can be used to assist with the summarising of literature search results. Be careful not to include any personally identifying information, or paywalled information, in the documents you upload.

Prompting generative AI

Knowing how to prompt generative AI tools can help provide richer and more useful responses. Differentiating between the various tools and their uses is also important, making sure that we use the right tool for the right job.

Not all AI tools can search the internet and provide an accurate answer to a question. Most large language models are better suited to generating language, for language-based tasks, such as generating synonyms for literature searching, or summarising the information you give them.

Setting up a good prompt can take a little practice, time and patience. Everyone has their own unique prompting style, much like we have unique searching styles. The CLEAR framework can help you write new prompts.

Setting up dedicated threads for specific tasks can save you time. You can set up a prompt, and ‘dip into’ threads, without having to re-write prompts every single time.

Here are some examples:

Google search strategy generation

“Hello! I am a search expert. I would like you to help me generate search strategies for Google, using Boolean AND/OR terms only. If I give you the search queries, can you generate the search strategy, using UK English [and US English]? No need to explain the strategy itself, I am an expert and I already understand how the search works.”

Using this prompt in ChatGPT or Bard will allow you to simply paste in search queries in the future, and the tool will generate search strategies each time.

As Google uses Natural Language Processing algorithms, it will be more 'forgiving' of search strategies generated by LLMs.
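
If you have programmatic access to an LLM, a reusable prompt like the one above can also be scripted rather than pasted into a chat window each time. The following is a minimal illustrative sketch only, assuming the openai Python package and an API key are available; the model name and the example topic are placeholders, not a description of any Trust-approved setup.

    # Minimal sketch: reuse the Google search-strategy prompt via an LLM API.
    # Assumes the openai Python package is installed and OPENAI_API_KEY is set.
    from openai import OpenAI

    client = OpenAI()

    SYSTEM_PROMPT = (
        "I am a search expert. Please generate search strategies for Google, "
        "using Boolean AND/OR terms only, in UK and US English. "
        "No need to explain the strategy."
    )

    def google_search_strategy(query: str) -> str:
        """Send a search topic and return the generated Boolean strategy."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": query},
            ],
        )
        return response.choices[0].message.content

    # Illustrative topic only:
    print(google_search_strategy("social prescribing for older adults"))

Setting the instructions once, as a system prompt, works much like the dedicated threads described above: you write the instructions once and then simply supply new queries.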

Search block generation

“Hello! I am a Knowledge Specialist. I work in the UK NHS. My job is searching for evidence-based information. I would like to use this thread to generate search blocks for advanced search databases. Please do not use Medical Subject Headings (MeSH) or any other search operators, just 'OR' and 'AND'. When I paste in medical topics, please generate the search strategies using both UK and US spelling, and various relevant synonyms. No need to explain the strategy, as I am an expert and do not require this information.”

Use this prompt for search strategies for advanced search databases.
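
As an illustration only (the topic and synonyms below are examples rather than a validated strategy), a prompt like this might return a search block along these lines:

    ("heart failure" OR "cardiac failure" OR "cardiac insufficiency")
    AND
    (elderly OR aged OR geriatric OR "older adults" OR "older people")

As always, review and edit the suggested terms before running the search; generated synonyms can be incomplete or inappropriate for the database you are using.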

Summarising

Some tools, like Claude, will allow you to upload documents to summarise. Uploading anonymised literature search results, or RefWorks bibliographies, will make things easier. Always prompt the tools to use UK English in their responses.

Don’t ask the tool to simply ‘summarise’ the information. The response will lack detail and the important information you have found! Instead, ask it questions about your search. Here’s an example:

“Drawing purely from the information in this document, please list the benefits of having a Knowledge Management Service in healthcare organisations, using bullet points and UK English.”

Asking it relevant questions will draw richer responses and a more detailed summary. Remember to reference all the materials in your summary, and double-check the information before sending it to your users. It's your responsibility to use these tools appropriately!

For more detailed information about using AI tools to summarise information, check out this blog.
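
Purely as an illustrative sketch (the anthropic Python package, the model name and the file name are assumptions, not recommendations), the same question-based approach could be scripted rather than used through the Claude web interface:

    # Minimal sketch: ask a targeted question about an anonymised document.
    # Assumes the anthropic Python package is installed and ANTHROPIC_API_KEY is set.
    import anthropic

    client = anthropic.Anthropic()

    # Hypothetical file of anonymised literature search results. Never include
    # personally identifying information or paywalled content.
    with open("search_results_anonymised.txt", "r", encoding="utf-8") as f:
        document_text = f.read()

    question = (
        "Drawing purely from the information in this document, please list the "
        "benefits of having a Knowledge Management Service in healthcare "
        "organisations, using bullet points and UK English."
    )

    message = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder model name
        max_tokens=1024,
        messages=[{"role": "user", "content": document_text + "\n\n" + question}],
    )
    print(message.content[0].text)

Whichever route you use, the same responsibilities apply: check the output against the original documents and reference your sources.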

Networks, courses and online learning

AI for Healthcare: Equipping the Workforce for Digital Transformation

e-learning on how artificial intelligence is transforming healthcare and how it can be used to support change in the healthcare workforce.

PGCert Clinical Data Science

This Clinical Data Science course from the University of Manchester was co-created with end users and industry partners to develop a flexible programme suitable for busy health and social care practitioners.

Federation for Informatics Professionals (FEDIP)

Members can get access to an information hub, as well as other opportunities to network.

Current and Emerging Technology in Knowledge and Library Services

A community of practice for Knowledge and Library workers who are interested in emerging technologies and their practical uses. There is also a bank of resources for using AI and AI training materials for Knowledge and Library Service users.

Artificial Intelligence in Teaching and Learning

This subject guide will help students and staff explore a range of AI technologies and consider how these technologies might affect their teaching and learning practice.

Page last reviewed: 10 January 2024

Webinars and presentations

Other reading

Research

 

HEE: ChatGPT with Phil Bradley (2023)

 

HEE: Artificial Intelligence: What is Our Role? with Dr Andrew Cox (2022)

 

AI and Ethical Awareness with Dr Georgina Cosma (2023)

 

House of Commons: Artificial intelligence: A reading list. By Elizabeth Rough, Nikki Sutherland. 17 April 2024 [LINK TO REPORT]

Preparing the healthcare workforce to deliver the digital future. An independent report on behalf of the Secretary of State for Health and Social Care, February 2019 [LINK TO REPORT]

The National AI Strategy 2022 [LINK TO REPORT]

 

Assistive tools and technologies [LINK]

 

Web browser accessibility tools: 
Read text aloud - Each web browser can read on-screen text aloud, but the functionality and method of doing this depend on the software being used.

 

Access to speech recognition software - [LINK]
 

 

 

NHS England - Transformation Directorate [LINK]

NHS England - Cyber and data security [LINK]

HTN Health Tech News - Cyber Security [LINK]

NHS England - Cyber and data security services and resources [LINK]

NHS England - Cyber Alerts [LINK]

NHS England - Cyber security services [LINK]

 

Cyber security professionals [LINK]

 

 

Advice & guidance - [LINK]

Artificial intelligence [LINK]

https://www.ncsc.gov.uk/guidance/asset-management


Five things you really need to know about AI - BBC  [ VIDEO ]

What's the future for generative AI? - The Turing Lectures with Mike Wooldridge [ VIDEO ]

The Turing Lectures by the Alan Turing Institute  [ VIDEO ]

The UK's national showcase of data science and AI | AI UK  [ VIDEO ]

The Royal Institution [ VIDEO ]

3 principles for creating safer AI | Stuart Russell [ VIDEO ]