Optimising the analysis of stroke trials

Gray, Laura Jayne (2008) Optimising the analysis of stroke trials. PhD thesis, University of Nottingham.


Abstract

Most large acute stroke trials have shown no treatment effect. Functional outcome is routinely used as the primary outcome in stroke trials and is usually analysed as a binary outcome, e.g. death or dependency versus independence. This project assessed which statistical approaches are most efficient for analysing functional outcome data from stroke trials.
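As an illustration of the conventional binary approach (not code or data from the thesis), the sketch below simulates 7-level modified Rankin Scale (mRS) scores for two hypothetical trial arms, collapses them into independent (mRS 0-2) versus dead or dependent (mRS 3-6), and applies a chi-squared test. The outcome distributions and sample sizes are invented for the example.

```python
# Hedged illustration only (simulated data, not from the thesis): the
# conventional dichotomised analysis of functional outcome.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical mRS distributions (categories 0-6) for the two arms
p_control = [0.10, 0.15, 0.15, 0.20, 0.20, 0.10, 0.10]
p_treated = [0.14, 0.17, 0.16, 0.19, 0.17, 0.09, 0.08]
control = rng.choice(7, size=600, p=p_control)
treated = rng.choice(7, size=600, p=p_treated)

# Collapse to binary: mRS 0-2 = independent, mRS 3-6 = dead or dependent
table = np.array([[(treated <= 2).sum(), (treated > 2).sum()],
                  [(control <= 2).sum(), (control > 2).sum()]])
chi2, p_value, _, _ = stats.chi2_contingency(table)
print(f"Dichotomised analysis: chi2 = {chi2:.2f}, p = {p_value:.3f}")
```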

Fifty-five data sets from 47 completed randomised trials (54,173 patients) were assessed. Re-analysing these data with a variety of statistical approaches showed that methods which retained the ordinal nature of functional outcome data were statistically more efficient than those which collapsed the data into two or more groups. Ordinal logistic regression, the t-test, the robust rank test, bootstrapping the difference in mean rank, and the Wilcoxon test are recommended. When assessing sample size, analysing the data with ordinal logistic regression rather than as a binary outcome can reduce the sample size needed for a given power by 28%. Ordinal methods may not be appropriate for trials of treatments, such as thrombolytics, which not only increase the proportion of patients with a good outcome but also increase hazard.
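For comparison, the next sketch (again illustrative only, with simulated data) analyses the same kind of 7-level mRS outcome with two of the recommended ordinal approaches: the Wilcoxon (Mann-Whitney) test and a proportional-odds ordinal logistic regression, here fitted with statsmodels' OrderedModel.

```python
# Hedged illustration only (simulated data, not from the thesis): ordinal
# analyses of a 7-level mRS outcome.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
p_control = [0.10, 0.15, 0.15, 0.20, 0.20, 0.10, 0.10]  # hypothetical
p_treated = [0.14, 0.17, 0.16, 0.19, 0.17, 0.09, 0.08]  # hypothetical
control = rng.choice(7, size=600, p=p_control)
treated = rng.choice(7, size=600, p=p_treated)

# Wilcoxon (Mann-Whitney) test on the full ordinal scale
u_stat, p_wilcoxon = stats.mannwhitneyu(treated, control,
                                        alternative="two-sided")
print(f"Wilcoxon test: U = {u_stat:.0f}, p = {p_wilcoxon:.3f}")

# Proportional-odds (ordinal logistic) regression with treatment as the
# only covariate; exp(coefficient) is the common odds ratio, and values
# below 1 correspond to a shift toward lower (better) mRS in statsmodels'
# latent-variable parameterisation.
mrs = pd.Series(pd.Categorical(np.concatenate([treated, control]),
                               ordered=True))
exog = pd.DataFrame({"treated": np.r_[np.ones(600), np.zeros(600)]})
fit = OrderedModel(mrs, exog, distr="logit").fit(method="bfgs", disp=False)
print(f"Ordinal logistic regression: common OR = "
      f"{np.exp(fit.params['treated']):.2f}")
```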

Adjusting the analysis for prognostic factors can reduce the sample size further. Re-analysing data from 23 stroke trials (25,674 patients) for which covariate data were supplied showed that ordinal logistic regression adjusted for age, sex and baseline stroke severity reduced the sample size needed for a given statistical power by around 37%. Alternatively, trialists could keep the sample size fixed and instead gain statistical power to detect an effect, which matters because stroke trials have arguably been too small and therefore underpowered.
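A minimal sketch of a covariate-adjusted proportional-odds model is shown below, assuming simulated values for age, sex and baseline severity (NIHSS); the variable names and effect sizes are hypothetical stand-ins for the prognostic factors described above, not data or code from the thesis.

```python
# Hedged illustration only (all data simulated): a proportional-odds model
# of mRS adjusted for age, sex, and baseline severity (NIHSS). The effect
# sizes used to generate the data are arbitrary.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 1200

df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # 1 = active treatment
    "age": rng.normal(70, 10, n),       # years
    "male": rng.integers(0, 2, n),      # 1 = male
    "nihss": rng.integers(2, 25, n),    # baseline stroke severity
})

# Latent-variable construction: outcome worsens with age and baseline
# severity and improves modestly with treatment, then is cut into 7 levels.
latent = (0.04 * (df["age"] - 70) + 0.12 * df["nihss"]
          - 0.4 * df["treated"] + rng.logistic(0, 1, n))
df["mrs"] = pd.Categorical(pd.cut(latent, bins=7, labels=False),
                           ordered=True)

fit = OrderedModel(df["mrs"], df[["treated", "age", "male", "nihss"]],
                   distr="logit").fit(method="bfgs", disp=False)
print("Adjusted common OR for treatment:",
      round(float(np.exp(fit.params["treated"])), 2))
```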

Stroke prevention trials also routinely collect binary data, e.g. stroke/no stroke. Converting these data into ordinal outcomes, e.g. fatal stroke/non-fatal stroke/no stroke, and analysing them with a method which takes account of the ordered nature of the data also increases the statistical power to detect a treatment effect. This approach additionally provides information on the effect of treatment on the severity of events.
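The sketch below illustrates this recoding on simulated prevention-trial data: a binary stroke/no stroke outcome is expanded to fatal stroke/non-fatal stroke/no stroke and analysed with a Wilcoxon (Mann-Whitney) test alongside the conventional binary chi-squared test. The event rates are hypothetical.

```python
# Hedged illustration only (simulated data): binary versus ordinal analysis
# of a prevention-trial outcome.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_per_arm = 3000

# 0 = no stroke, 1 = non-fatal stroke, 2 = fatal stroke (hypothetical rates)
p_control = [0.90, 0.07, 0.03]
p_treated = [0.93, 0.05, 0.02]
control = rng.choice(3, size=n_per_arm, p=p_control)
treated = rng.choice(3, size=n_per_arm, p=p_treated)

# Conventional binary analysis: any stroke versus no stroke
table = np.array([[(treated > 0).sum(), (treated == 0).sum()],
                  [(control > 0).sum(), (control == 0).sum()]])
chi2, p_binary, _, _ = stats.chi2_contingency(table)
print(f"Binary analysis:  chi2 = {chi2:.2f}, p = {p_binary:.3f}")

# Ordinal analysis: keeps the distinction between fatal and non-fatal events
u_stat, p_ordinal = stats.mannwhitneyu(treated, control,
                                       alternative="two-sided")
print(f"Ordinal analysis: U = {u_stat:.0f}, p = {p_ordinal:.3f}")
```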

Using ordinal methods of analysis may improve the design and statistical analysis of both acute stroke trials and stroke prevention trials. The resulting smaller trials would help the development of stroke treatments by reducing time to completion, study complexity, and financial expense.

Item Type: Thesis (University of Nottingham only) (PhD)
Supervisors: Bath, P.M.W.
Keywords: Outcome data from stroke trials, Statistical analysis of data, Ordinal methods of analysis
Subjects: W Medicine and related subjects (NLM Classification) > WL Nervous system
Faculties/Schools: UK Campuses > Faculty of Medicine and Health Sciences > School of Clinical Sciences
Item ID: 13981
Depositing User: EP, Services
Date Deposited: 11 Feb 2014 08:41
Last Modified: 16 Oct 2017 14:37
URI: https://eprints.nottingham.ac.uk/id/eprint/13981
