Citizens’ Jury 2018


17 members of the public took part in our Citizens' Jury in January 2018.

In January 2018, the National Data Guardian and Connected Health Cities commissioned a citizens’ jury to better understand which patient data sharing scenarios can be ‘reasonably expected’ in the eyes of the public when people are given the time and support to think through such complex matters. The jury was designed and run by Citizens Juries c.i.c. working in partnership with the Jefferson Center. This piece of work originated from conversations about the extent to which implied consent can be used to support data sharing, particularly in the context of new technology and models of care.

Over the course of three days, a jury of 17 people was asked to consider uses of patient data in 10 different scenarios relating to a fictional patient called Anita, who initially goes to the GP with an eye problem. The jury followed Anita and her data through various parts of the health and care system, and at each point was asked whether she would have reasonably expected privacy or sharing.

What the jury told us about their data sharing expectations

  • A majority of the jury said data sharing would be reasonably expected in all but one of the 10 scenarios (where Anita’s GP encounters her husband and discusses her case).
  • The jurors were very supportive of data sharing with implied consent for routine, direct care scenarios such as: a GP sending referral data to a hospital; a hospital doctor looking at the referral to triage; the patient’s case being discussed by a multi-disciplinary team to plan her care. The reasons they gave for supporting such information sharing were based on an understanding that better information sharing benefits those receiving care.
  • The jury also supported the use of data sharing for altruistic reasons other than direct care, such as developing new tools for health and care or helping with the diagnosis of others, citing the importance of contributing to future research, advancing the knowledge of health professionals, and doing something for the ‘greater good’.
  • Two scenarios created a higher level of discomfort, although a majority still considered them within reasonable expectations: both related to administrative tasks carried out by non-clinical staff to support the delivery of individual care. Jurors were concerned that the information shared with administrative staff might be disproportionate to the task; however, they recognised that the handling of patient information by non-clinical staff was necessary to run the system and make the most effective use of clinical resources.
  • The scenarios in which information was sent to assist the diagnosis of another patient, or data was shared to enable a university to develop artificial intelligence software, also had comparatively more jurors unsure or expecting privacy at the beginning of the jury. By the end of the jury, once the uses and safeguards had been explained, a majority were comfortable with both uses.

In conclusion

The jury provided further useful insights into what members of the public expect in relation to data sharing, and which factors matter to citizens where their confidential health information is concerned. It also underlined the importance of clear communication and transparency in gaining people’s trust and support: notably, especially where information is used in ways that go beyond traditional or well-known uses, jurors’ views changed during the three days as they had the chance to talk to ‘witnesses’ such as doctors, researchers, and administrators who use patient data day-to-day.

Jury design documentation

No | Name | Brief description | Reviewed by Oversight Panel? | File for download
A1 | Jury specification | A specification of the design for the two juries, including the jury questions and juror selection criteria. | Yes | A1 Reasonable expectations citizens jury specification v1
A2 | Overview of 3-day activities | An overview of the morning and afternoon activities for all three days of the citizens’ jury. | Yes | A2 Citizens jury overview of activities
A3 | Expert witnesses brief | A brief provided to all six expert witnesses to guide their presentations to the juries, including a specific brief for each named witness. | Yes | A3 Brief for expert witnesses Jan 2018 jury v1
A4 | Oversight panel brief | A brief to the three members of the oversight panel (who are identified in the document) describing their role to monitor bias. | Yes | A4 Reasonable expectations citizens jury oversight panel brief v1
A5 | Oversight panel signed questionnaires | The set of forms completed and signed by two of the three members of the oversight panel with their assessments and statements on bias. The third panel member did not return the completed form. | Yes | A5 Oversight panel signed questionnaires
A6 | Jury recruitment questionnaire | The electronic form completed by people applying to be jurors. | No | A6 Jury recruitment questionnaire
A7 | Start-of-jury questionnaire | The questionnaire that all jurors completed at the start of day 1 of the jury process. | Yes | A7 Start of jury questionnaire
A8 | End-of-jury questionnaire | The questionnaire that all jurors completed at the end of day 3 of the jury process. | Yes | A8 end jury questionnaire
A9 | Daily participant feedback form | A form designed and used by the Jefferson Center to capture feedback from the jurors, particularly about potential bias, at the end of day 1 and day 2. | No (standard Jefferson Center form) | A9 Jefferson Center bias questionnaire


Jury materials

No | Name | Brief description | Reviewed by Oversight Panel | File for download
B1 | Jurors ring binder contents | A folder of materials produced by Citizens Juries c.i.c. and the Jefferson Center, printed out and provided in a ring binder to each jury member. It has 15 sections, including a table of contents (section 0) and all the slides from the expert witnesses. | All expert witness slides (sections 12-14), the Day 3 Case Studies with Reflection Sheets (section 10) and the simulation exercise (section 7) were reviewed. | B1 Jurors ring binder
B2 | Juror handouts | Paper handouts provided to jury members during the course of the three-day event. | The two case study tally sheets were reviewed. | B2 juror handouts

Jury outputs

No | Name | Brief description | File for download
C1 | Jurors’ report final | A report of the jury conclusions produced by Kyle Bozentko of the Jefferson Center on day 3 of the Manchester jury with the 17 jurors. | C1 NDG CJ jurors report final
C2 | Reasonable expectations report | A report summarising the design and findings of the two citizens’ juries. | C2_Reasonable expectations jury report_May18
C3 | Citizens’ jury wordcloud | A word cloud diagram generated automatically from the answers given by the 17 jurors in the end-of-jury questionnaire to the question: “Overall, what was it like participating in the citizens’ jury over the three days? Please say 3 things in 3 words to sum up your experience.” The larger the word in the diagram, the more people said that word. | C3 citizens jury Jan 2018 wordcloud
C4 | Start of jury questionnaire | The results data from the start-of-jury questionnaire. | C4 Start of jury questionnaire
C5 | Day 1 and Day 2 bias questionnaires | The results data from the juror day 1 and day 2 bias questionnaires. | C5 Day 1 and Day 2 bias questionnaires
C6 | Detailed jury results | The results data from juror voting on the jury questions, with reasons. | C6 juror answers to jury questions with reasons
C7 | End of jury questionnaire | The results data from the end-of-jury questionnaires. | C7 end of jury questionnaire
C8 | Dataset of juror voting with reasons | A spreadsheet containing the detailed data of how the 17 members of the jury voted on the questions, along with the reasons for their answers. This includes minority views (where fewer than 6 people voted for a reason) not shown in the jury report (C2). | C8 Dataset of juror voting with reasons



  1. Except where specified, the main author of the documentation above was Dr. Malcolm Oswald, Director of Citizens Juries c.i.c. Reviewers varied depending on the nature of the document.
  2. The Oversight Panel brief was to review the main jury design documentation, but not the jury outputs.