HEDS is part of the School of Health and Related Research (ScHARR) at the University of Sheffield. We undertake research, teaching, training and consultancy on all aspects of health related decision science, with a particular emphasis on health economics, HTA and evidence synthesis.

Tuesday, 31 July 2018

Short course: Expert Knowledge Elicitation

[Photos: Dr Kate Ren and Dr John Stevens]


HEDS colleagues Dr John Stevens and Dr Kate Ren are helping to facilitate a short course on Expert Knowledge Elicitation in September. 

Elicitation with SHELF: 3-4 September 2018 
Advanced Facilitation: 5 September 2018, repeated on 6 September 2018 
 
Leopold Hotel, Sheffield, UK 
 
Course faculty
Professor Anthony O’Hagan, Professor Jeremy Oakley, Dr John Paul Gosling, Dr Kate Ren, Dr John Stevens 

Background
Decision analytic models, such as economic models submitted to NICE and similar reimbursement authorities around the world, often incorporate evidence in the form of expert opinion. In particular, when suitable data are lacking, analysts may be dependent on expert opinion to obtain appropriate values for model inputs/parameters. To quantify expert uncertainty about such quantities, a process of elicitation can be used to obtain suitable probability distributions. This practical course aims to provide participants with the skills required to elicit experts’ probability distributions about unknown quantities of interest. The course is based around the Sheffield Elicitation Framework (SHELF): a behavioural aggregation method for eliciting distributions from multiple experts. 
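As a rough illustration of what elicitation produces, the sketch below fits a beta distribution to a single expert's percentile judgements for an uncertain model input, using only base R. This is not the SHELF method, which is a structured, facilitated group process, and the judgements shown are hypothetical; it simply shows the basic step of turning elicited quantiles into a probability distribution by quantile matching.

## Minimal base-R sketch (not SHELF itself): fit a beta distribution to an
## expert's elicited percentiles for an uncertain probability, e.g. a
## response rate used as a model input. The judgements are hypothetical.

elicited_probs <- c(0.05, 0.50, 0.95)   # cumulative probabilities judged
elicited_vals  <- c(0.15, 0.30, 0.50)   # expert's 5th, 50th and 95th percentiles

## Sum of squared differences between beta quantiles and elicited values
fit_loss <- function(par) {
  a <- exp(par[1]); b <- exp(par[2])    # keep shape parameters positive
  sum((qbeta(elicited_probs, a, b) - elicited_vals)^2)
}

opt <- optim(c(log(2), log(5)), fit_loss)
shape <- exp(opt$par)

## Check the fitted quantiles against the expert's judgements
round(qbeta(elicited_probs, shape[1], shape[2]), 3)

## Sample from the fitted distribution for use as a model input
set.seed(1)
response_rate <- rbeta(10000, shape[1], shape[2])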

Who will benefit from the course?
The course is suitable for health economists, statisticians, systematic reviewers and decision-makers interested in the elicitation of experts' probability distributions about unknown quantities of interest to populate their models. The course is also suitable for researchers in other disciplines who wish to learn about expert elicitation. No previous knowledge of elicitation is assumed. 

Course structure
The first two days (3-4 September) will comprise a comprehensive short course, Elicitation with SHELF. The course will cover the principles of expert elicitation and the SHELF method, together with practical considerations in planning and running a SHELF elicitation workshop. 
Participants may choose to attend the first two days only.

The key role in the SHELF method is that of the facilitator. The Advanced Facilitation course will involve intensive, small group, hands-on sessions for participants who wish to be trained to act as facilitators. Each trainee will gain practical experience of facilitating a SHELF elicitation workshop, using carefully designed and realistic scenarios. Trainees will also gain experience in another important role, that of the recorder. Each day is open to a maximum of four trainees. To attend an Advanced Facilitation course, participants must have attended either the first two days, or a previous Elicitation with SHELF course.

Course fees are set out in the table. Early bird rates apply for registration up to July 31st.
Course                     Early bird    Regular
Elicitation with SHELF     £500          £575
Advanced Facilitation      £1450         £1600

Course fees include lunch and refreshments. Participants will need to arrange their own accommodation. 

For further details and registration, please go to
http://www.tonyohagan.co.uk/shelf/CourseSep18.html

Wednesday, 6 June 2018

New methods paper on evidence synthesis with limited studies

[Photo: Dr. Shijie (Kate) Ren]
Dr. Shijie (Kate) Ren has published a paper with colleagues from the School of Mathematics and Statistics and HEDS proposing an elicitation framework to capture external evidence about heterogeneity for use in evidence synthesis with limited studies.
Their proposed framework allows uncertainty to be represented by a genuine prior distribution, informed by empirical evidence and experts’ beliefs, and helps to avoid misleading inferences. The method is flexible with respect to the judgments an expert is able to provide. They have also provided R code for implementing their method.
[Image: Medical Decision Making journal, © Sage Journals]
Dr. Ren, who specialises in the application of Bayesian methods in health care evaluation, argues that, “Analysts often default to using a fixed effect model in evidence synthesis because there are too few studies to fit a random effects model. The choice of which model to use should depend on the objective of the analysis and knowledge of the included studies.”
In the case where heterogeneity is expected, the proposed elicitation framework can overcome the problem of imprecise estimates of the heterogeneity parameter in the absence of sufficient sample data.
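As a rough indication of the kind of reasoning involved, the base-R sketch below translates a hypothetical expert judgement about the plausible spread of true study effects into a prior distribution for the between-study standard deviation. It is not the three-stage framework set out in the paper (the article and its accompanying R code give the full method); the numbers are purely illustrative.

## Simplified illustration (not the paper's framework): turn a hypothetical
## judgement about the plausible spread of true study-specific log odds
## ratios into a prior for the between-study standard deviation (tau).

## Suppose the expert judges that 95% of true study effects would lie within
## +/- d of the average effect, and expresses uncertainty about d as a range
## of 0.1 to 0.8 (hypothetical values).
set.seed(2)
d   <- runif(100000, 0.1, 0.8)   # crude uniform uncertainty about the range d
tau <- d / qnorm(0.975)          # the range judgement implies tau ~ d / 1.96

## Summarise the induced prior for tau and approximate it with a lognormal
## distribution for use in a Bayesian random effects model.
quantile(tau, c(0.025, 0.5, 0.975))
c(meanlog = mean(log(tau)), sdlog = sd(log(tau)))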
The article, published in Medical Decision Making, is available open-access and can be found at http://dx.doi.org/10.1177/0272989X18759488.

Monday, 14 May 2018

Dr Kate Ren to deliver talk at the Promoting Statistical Insight (PSI) Conference - 'Breaking Boundaries in Drug Development'

[Photo: Dr Kate Ren]
Dr Kate Ren is to deliver a talk at the Promoting Statistical Insight (PSI) Conference - Breaking Boundaries in Drug Development - in Damrak, Amsterdam. Dr Ren's talk is titled: Evidence synthesis with limited studies: incorporating genuine prior information about between-study heterogeneity.

Abstract:
Background: Meta-analyses using fixed effect and random effects models are commonly applied to synthesise evidence from randomised controlled trials in health technology assessment. The models differ in their assumptions and in the interpretation of the results. Fixed effect models are often used because there are too few studies with which to estimate the between-study standard deviation from the data alone, not because heterogeneity is considered unlikely.

Objectives: The aim is to propose a framework for eliciting an informative prior distribution for the between-study standard deviation in a Bayesian random effects meta-analysis model to genuinely represent heterogeneity when data are sparse.

Methods: We developed an elicitation method that uses external information, such as empirical evidence and experts’ beliefs about the ‘range’ of treatment effects, to infer the prior distribution for the between-study standard deviation. We have also implemented the method in R.

Results: The three-stage elicitation approach allows uncertainty to be represented by a genuine prior distribution, avoiding misleading inferences. It is flexible with respect to the judgments an expert can provide, and is applicable to all common types of outcome measure.

Conclusions: The choice between using a fixed effect or random effects meta-analysis model depends on the inferences required and not on the number of available studies. Our elicitation framework captures external evidence about heterogeneity and overcomes the often implausible assumption that studies are estimating the same treatment effect, thereby improving the quality of inferences in decision making. 
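To make the abstract concrete, the sketch below shows one way an elicited prior for the between-study standard deviation might be used in a Bayesian random effects meta-analysis when only a few studies are available, using the standard normal-normal model and a simple grid approximation in base R. The study estimates and prior parameters are hypothetical, and this is not the implementation accompanying the paper.

## Minimal sketch: Bayesian random effects meta-analysis with few studies
## and an informative (elicited) prior for the between-study SD, tau.
## Study estimates and prior values are hypothetical.

y  <- c(-0.35, -0.10, -0.55)       # study log odds ratios
se <- c(0.25, 0.30, 0.40)          # their standard errors

## Hypothetical elicited lognormal prior for tau; vague normal prior for mu
log_prior <- function(mu, tau) {
  dlnorm(tau, meanlog = -1.6, sdlog = 0.5, log = TRUE) +
    dnorm(mu, 0, 10, log = TRUE)
}

## Marginal likelihood of the data: y_i ~ N(mu, se_i^2 + tau^2)
log_lik <- function(mu, tau) {
  sum(dnorm(y, mu, sqrt(se^2 + tau^2), log = TRUE))
}

## Posterior on a two-dimensional grid (adequate for two parameters)
mu_grid  <- seq(-1.5, 1.0, length.out = 200)
tau_grid <- seq(0.001, 1.0, length.out = 200)
log_post <- outer(mu_grid, tau_grid,
                  Vectorize(function(mu, tau) log_lik(mu, tau) + log_prior(mu, tau)))
post <- exp(log_post - max(log_post))
post <- post / sum(post)

## Posterior means of the pooled effect mu and the heterogeneity tau
sum(rowSums(post) * mu_grid)
sum(colSums(post) * tau_grid)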

The conference runs from 3 to 6 June and registration is now open. To register, please click here.

Sessions will include data transparency, the Asterix project in rare diseases, application and implementation of methodologies in statistics, missing data, study design role play, regulatory town hall and many more, with speakers from industry, academia and regulatory agencies. View the online draft scientific programme here.

Wednesday, 11 April 2018

New methods paper on Sampling Ordered Parameters in Probabilistic Sensitivity Analysis

[Photo: Dr. Shijie Ren]

Dr. Shijie Ren (also known as Kate Ren) has recently published a paper with colleagues from HEDS and the University of Glasgow proposing a new method of sampling ordered parameters for use in cost-effectiveness analysis. Their new approach, known as the difference method (DM), is designed to address the difficulties of sampling parameters subject to an ordering constraint in probabilistic sensitivity analysis, e.g. sampling the utilities associated with different severity levels of a disease.

Dr. Ren, who specialises in the application of Bayesian methods in health economics, argues that “Typical sampling approaches often lack either statistical or clinical validity. For example, sampling using a common random number results in extreme dependence, and independent sampling can lead to realisations with incorrect ordering.”

The DM approach uses a “difference parameter” to sample the parameters of interest, generating ordered parameters with greater validity for use in probabilistic sensitivity analysis.
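The general idea can be sketched in a few lines of base R. The distributions below are hypothetical and the code is only an illustration of sampling via a difference parameter, not the authors' implementation (the Excel workbook linked below provides that).

## Simplified sketch of sampling ordered parameters via a difference
## parameter. Example: utilities for moderate and severe disease, where the
## severe utility must not exceed the moderate utility in any PSA sample.

set.seed(3)
n <- 10000

## Sample the moderate-state utility directly
u_moderate <- rbeta(n, 60, 40)                 # mean about 0.60

## Sample the decrement (the "difference parameter") as a non-negative
## quantity, rather than sampling the severe utility independently
decrement <- rgamma(n, shape = 16, rate = 80)  # mean 0.20, always positive

u_severe <- u_moderate - decrement

## The ordering u_severe <= u_moderate now holds in every sample, without
## forcing the two utilities to be perfectly correlated
mean(u_severe <= u_moderate)                   # equals 1
cor(u_moderate, u_severe)                      # positive but well below 1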

The article, published in Pharmacoeconomics, is available open-access and can be found (along with a Microsoft Excel workbook to implement the method) at http://dx.doi.org/10.1007/s40273-017-0584-3