Read by QxMD


Rene Schmidt, Andreas Faldum, Robert Kwiecien
Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows the survival curve of the patients under treatment to be compared to a prespecified reference survival curve. The reference curve usually represents the expected survival under standard of care...
September 22, 2017: Biometrics
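The one-sample log-rank test described in the abstract above compares the observed number of events to the number expected under the reference survival curve. A minimal sketch, assuming an exponential reference hazard; the reference median, times, and censoring indicators below are illustrative, not from the paper:

```python
import numpy as np

def one_sample_logrank(times, events, cum_hazard_ref):
    """One-sample log-rank statistic: observed events vs. events expected
    under a prespecified reference cumulative hazard Lambda_0."""
    expected = np.sum(cum_hazard_ref(times))  # E = sum_i Lambda_0(T_i)
    observed = np.sum(events)                 # O = number of observed events
    return (observed - expected) / np.sqrt(expected)  # approx. N(0,1) under H0

# hypothetical reference: exponential survival with median 12 months
lam = np.log(2) / 12.0
times = np.array([3.0, 8.0, 15.0, 20.0, 6.0])   # follow-up times
events = np.array([1, 1, 0, 1, 0])              # 1 = event, 0 = censored
z = one_sample_logrank(times, events, lambda t: lam * t)
```

A large negative z would indicate better-than-reference survival; here the toy data sit close to the reference curve.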
Iris Ivy M Gauran, Junyong Park, Johan Lim, DoHwan Park, John Zylstra, Thomas Peterson, Maricel Kann, John L Spouge
In recent mutation studies, analyses based on protein domain positions are gaining popularity over gene-centric approaches since the latter have limitations in considering the functional context that the position of the mutation provides. This presents a large-scale simultaneous inference problem, with hundreds of hypothesis tests to consider at the same time. This article aims to select significant mutation counts while controlling the Type I error at a given level via False Discovery Rate (FDR) procedures. One main assumption is that the mutation counts follow a zero-inflated model in order to account for the true zeros in the count model and the excess zeros...
September 22, 2017: Biometrics
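The large-scale testing step described above is usually handled by an FDR procedure. A minimal sketch of the classical Benjamini-Hochberg step-up rule (the standard baseline such zero-inflated-model procedures build on; the p-values below are made up for illustration):

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean mask of
    rejected hypotheses, controlling the FDR at level alpha."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m       # i/m * alpha
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])           # largest i with p_(i) <= i/m * alpha
        reject[order[:k + 1]] = True               # reject the k smallest p-values
    return reject

rej = benjamini_hochberg(np.array([0.001, 0.02, 0.03, 0.5]))
```

With these toy p-values the first three hypotheses are rejected at FDR level 0.05.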
Julie McIntyre, Brent A Johnson, Stephen M Rappaport
Nonparametric regression is a fundamental problem in statistics but challenging when the independent variable is measured with error. Among the first approaches was an extension of deconvoluting kernel density estimators for homoscedastic measurement error. The main contribution of this article is to propose a new simulation-based nonparametric regression estimator for the heteroscedastic measurement error case. Similar to some earlier proposals, our estimator is built on principles underlying deconvoluting kernel density estimators...
September 15, 2017: Biometrics
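For reference, the error-free baseline that deconvoluting kernel estimators generalize is ordinary kernel regression. A minimal Nadaraya-Watson sketch with a Gaussian kernel (this is the standard textbook estimator, not the authors' simulation-based proposal):

```python
import numpy as np

def nadaraya_watson(x_obs, y_obs, x_grid, h):
    """Nadaraya-Watson kernel regression estimate of E[Y | X = x] on a grid,
    with bandwidth h and a Gaussian kernel. Deconvolution approaches replace
    the kernel to correct for measurement error in x."""
    d = (x_grid[:, None] - x_obs[None, :]) / h   # scaled distances, grid x obs
    w = np.exp(-0.5 * d ** 2)                    # Gaussian kernel weights
    return (w @ y_obs) / w.sum(axis=1)           # weighted average per grid point

x = np.linspace(0.0, 1.0, 50)
y = np.ones(50)                                  # flat toy regression function
est = nadaraya_watson(x, y, x, h=0.1)
```

A constant response is recovered exactly, since the weights average values that are all equal.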
Liang Zhu, Ying Zhang, Yimei Li, Jianguo Sun, Leslie L Robison
Panel-count data arise when each study subject is observed only at discrete time points in a recurrent event study, and only the numbers of the event of interest between observation time points are recorded (Sun and Zhao, 2013). However, sometimes the exact number of events between some observation times is unknown and what we know is only whether the event of interest has occurred. In this article, we will refer to this type of data as mixed panel-count data and propose a likelihood-based semiparametric regression method for their analysis by using the nonhomogeneous Poisson process assumption...
September 15, 2017: Biometrics
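Under the Poisson process assumption mentioned above, the count in each observation interval is an independent Poisson variable with mean equal to the increment of the mean function over that interval. A minimal sketch of the panel-count log-likelihood for the special homogeneous case (a deliberate simplification of the authors' semiparametric model; times and counts below are invented):

```python
from math import lgamma, log

def panel_count_loglik(rate, obs_times, counts):
    """Log-likelihood of panel counts under a homogeneous Poisson process:
    the count in each interval (t_{j-1}, t_j] is Poisson(rate * interval)."""
    prev, ll = 0.0, 0.0
    for t, c in zip(obs_times, counts):
        mu = rate * (t - prev)                     # expected events in interval
        ll += c * log(mu) - mu - lgamma(c + 1)     # Poisson log-pmf
        prev = t
    return ll

# toy data: 6 events over 3 time units, so the MLE of the rate is 2.0
ll_mle = panel_count_loglik(2.0, [1.0, 2.0, 3.0], [2, 1, 3])
ll_lo = panel_count_loglik(1.5, [1.0, 2.0, 3.0], [2, 1, 3])
ll_hi = panel_count_loglik(2.5, [1.0, 2.0, 3.0], [2, 1, 3])
```

The likelihood is maximized at total count divided by total follow-up time, as expected for a homogeneous process.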
Lu Tian, Haoda Fu, Stephen J Ruberg, Hajime Uno, Lee-Jen Wei
In comparing two treatments with event time observations, the hazard ratio (HR) estimate is routinely used to quantify the treatment difference. However, this model-dependent estimate may be difficult to interpret clinically, especially when the proportional hazards (PH) assumption is violated. An alternative estimation procedure for treatment efficacy based on the restricted mean survival time or t-year mean survival time (t-MST) has been discussed extensively in the statistical and clinical literature...
September 12, 2017: Biometrics
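The t-year mean survival time discussed above is the area under the survival curve up to the truncation time t. A minimal sketch that computes it from a Kaplan-Meier estimate (assuming untied event times for simplicity; the data are illustrative):

```python
import numpy as np

def km_rmst(times, events, tau):
    """Restricted mean survival time: area under the Kaplan-Meier step
    function up to truncation time tau (no ties assumed)."""
    order = np.argsort(times)
    t, d = times[order], events[order]
    surv, rmst, prev, at_risk = 1.0, 0.0, 0.0, len(t)
    for i in range(len(t)):
        if t[i] > tau:
            break
        rmst += surv * (t[i] - prev)          # area of the current step
        if d[i] == 1:
            surv *= 1.0 - 1.0 / at_risk       # KM drop at an event time
        at_risk -= 1
        prev = t[i]
    rmst += surv * (tau - prev)               # final partial step up to tau
    return rmst

# three uncensored deaths at 1, 2, 3: S(t) = 1, 2/3, 1/3 on successive steps
r = km_rmst(np.array([1.0, 2.0, 3.0]), np.array([1, 1, 1]), tau=3.0)
```

For this toy sample the area under the curve is 1 + 2/3 + 1/3 = 2.0.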
Anup Amatya, Dulal K Bhaumik
A unified statistical methodology of sample size determination is developed for hierarchical designs that are frequently used in many areas, particularly in medical and health research studies. The solid foundation of the proposed methodology opens a new horizon for power analysis in the presence of various conditions. Important features such as joint significance testing, unequal allocations of clusters across intervention groups, and differential attrition rates over follow-up time points are integrated to address some useful questions that investigators often encounter while conducting such studies...
September 12, 2017: Biometrics
Matthew R Schofield, Richard J Barker, Nicholas Gelling
The standard approach to fitting capture-recapture data collected in continuous time involves arbitrarily forcing the data into a series of distinct discrete capture sessions. We show how continuous-time models can be fitted as easily as discrete-time alternatives. The likelihood is factored so that efficient Markov chain Monte Carlo algorithms can be implemented for Bayesian estimation, available online in the R package ctime. We consider goodness-of-fit tests for behavior and heterogeneity effects as well as implementing models that allow for such effects...
September 12, 2017: Biometrics
Chi Hyun Lee, Jing Ning, Yu Shen
In clinical studies with time-to-event outcomes, the restricted mean survival time (RMST) has attracted substantial attention as a summary measure because of its straightforward clinical interpretation. When the data are subject to length-biased sampling, which is frequently encountered in observational cohort studies, existing methods to estimate the RMST are not applicable. In this article, we consider nonparametric and semiparametric regression methods to estimate the RMST under the setting of length-biased sampling...
September 8, 2017: Biometrics
Micha Mandel, Jacobo de Uña-Álvarez, David K Simon, Rebecca A Betensky
Doubly truncated data arise when event times are observed only if they fall within subject-specific, possibly random, intervals. While non-parametric methods for survivor function estimation using doubly truncated data have been intensively studied, only a few methods for fitting regression models have been suggested, and only for a limited number of covariates. In this article, we present a method to fit the Cox regression model to doubly truncated data with multiple discrete and continuous covariates, and describe how to implement it using existing software...
September 8, 2017: Biometrics
Dehan Kong, Joseph G Ibrahim, Eunjee Lee, Hongtu Zhu
We consider a functional linear Cox regression model for characterizing the association between time-to-event data and a set of functional and scalar predictors. The functional linear Cox regression model incorporates a functional principal component analysis for modeling the functional predictors and a high-dimensional Cox regression model to characterize the joint effects of both functional and scalar predictors on the time-to-event data. We develop an algorithm to calculate the maximum approximate partial likelihood estimates of unknown finite and infinite dimensional parameters...
September 1, 2017: Biometrics
Fan Wu, Sehee Kim, Jing Qin, Rajiv Saran, Yi Li
Survival data collected from a prevalent cohort are subject to left truncation and the analysis is challenging. Conditional approaches for left-truncated data could be inefficient as they ignore the information in the marginal likelihood of the truncation times. Length-biased sampling methods may improve the estimation efficiency but only when the underlying truncation time is uniform; otherwise, they may generate biased estimates. We propose a semiparametric method for left-truncated data under the Cox model with no parametric distributional assumption about the truncation times...
August 29, 2017: Biometrics
Yi-Hui Zhou, J S Marron, Fred A Wright
Genotype eigenvectors are widely used as covariates for control of spurious stratification in genetic association. Significance testing for the accompanying eigenvalues has typically been based on a standard Tracy-Widom limiting distribution for the largest eigenvalue, derived under white-noise assumptions. It is known that even modest local correlation among markers inflates the largest eigenvalues, even in the absence of true stratification. In addition, a few sample eigenvalues may be extreme, creating further complications in accurate testing...
August 29, 2017: Biometrics
Murray G Efford, Christine M Hunter
Sightings of previously marked animals can extend a capture-recapture dataset without the added cost of capturing new animals for marking. Combined marking and resighting methods are therefore an attractive option in animal population studies, and there exist various likelihood-based non-spatial models, and some spatial versions fitted by Markov chain Monte Carlo sampling. As implemented to date, the focus has been on modeling sightings only, which requires that the spatial distribution of pre-marked animals is known...
August 23, 2017: Biometrics
Dan Jackson, Sylwia Bujkiewicz, Martin Law, Richard D Riley, Ian R White
Random-effects meta-analyses are very commonly used in medical statistics. Recent methodological developments include multivariate (multiple outcomes) and network (multiple treatments) meta-analysis. Here, we provide a new model and corresponding estimation procedure for multivariate network meta-analysis, so that multiple outcomes and treatments can be included in a single analysis. Our new multivariate model is a direct extension of a univariate model for network meta-analysis that has recently been proposed...
August 14, 2017: Biometrics
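The univariate random-effects building block that such multivariate network models extend can be illustrated with the classical DerSimonian-Laird estimator (my choice of a standard estimator for illustration; the paper's model is more general). The effect sizes and variances below are invented:

```python
import numpy as np

def dersimonian_laird(y, v):
    """Univariate random-effects meta-analysis: DerSimonian-Laird moment
    estimate of the between-study variance tau^2 and the pooled effect."""
    y, v = np.asarray(y), np.asarray(v)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - fixed) ** 2)              # Cochran's heterogeneity statistic
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)            # truncated moment estimator
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, tau2, se

pooled, tau2, se = dersimonian_laird([0.5, 0.5], [0.1, 0.1])
```

With identical study estimates the heterogeneity statistic is zero, so tau^2 is estimated as zero and the pooled effect equals the common value.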
Veronika Skrivankova, Patrick J Heagerty
Clinical practice may be enhanced by use of person-level information that could guide treatment choice and lead to better outcomes for both treated individuals and for the population. The scientific challenge is to identify and validate those factors that can reliably be used to target treatment, and to accurately quantify the expected treatment benefit as a function of candidate markers. Our proposal is to explicitly focus on smooth non-parametric evaluation of a canonical single index score that estimates the expected treatment benefit associated with patient characteristics...
August 7, 2017: Biometrics
Sehee Kim, Douglas E Schaubel, Keith P McCullough
We propose a C-index (index of concordance) applicable to recurrent event data. The present work addresses the dearth of measures for quantifying a regression model's ability to discriminate with respect to recurrent event risk. The data which motivated the methods arise from the Dialysis Outcomes and Practice Patterns Study (DOPPS), a long-running prospective international study of end-stage renal disease patients on hemodialysis. We derive the theoretical properties of the measure under the proportional rates model (Lin et al...
August 3, 2017: Biometrics
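The classical single-event C-index that such a recurrent-event measure generalizes can be sketched as Harrell's concordance: among usable pairs, the fraction in which the subject with the higher risk score fails earlier. This is only the familiar baseline, not the paper's proposal; the data below are invented:

```python
import numpy as np

def concordance_index(times, events, risk_scores):
    """Harrell's C-index for right-censored single-event data: a pair is
    usable if subject i is observed to fail before subject j's time, and
    concordant if i also has the higher risk score (ties count 1/2)."""
    n = len(times)
    concordant, usable = 0.0, 0
    for i in range(n):
        for j in range(n):
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / usable

# perfectly ranked toy data: earlier failures have higher risk scores
c = concordance_index(np.array([1.0, 2.0, 3.0, 4.0]),
                      np.array([1, 1, 1, 1]),
                      np.array([4.0, 3.0, 2.0, 1.0]))
```

Perfect ranking gives C = 1; a useless score gives C near 0.5.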
Qingning Zhou, Jianwen Cai, Haibo Zhou
Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable...
August 3, 2017: Biometrics
Yuping Zhang, Zhengqing Ouyang
We consider a research scenario motivated by integrating multiple sources of information for better knowledge discovery in diverse dynamic biological processes. Given two longitudinal high-dimensional datasets for a group of subjects, we want to extract shared latent trends and identify relevant features. To solve this problem, we present a new statistical method named joint principal trend analysis (JPTA). We demonstrate the utility of JPTA through simulations and applications to gene expression data of the mammalian cell cycle and longitudinal transcriptional profiling data in response to influenza viral infections...
July 31, 2017: Biometrics
You Wu, Jeremy Gaskins, Maiying Kong, Susmita Datta
Phosphorylated proteins provide insight into tumor etiology and are used as diagnostic, prognostic, and therapeutic markers of complex diseases. However, pre-analytic variations, such as freezing delay after biopsy acquisition, often occur in real hospital settings and potentially lead to inaccurate results. The objective of this work is to develop statistical methodology to assess the stability of phosphorylated proteins under short-time cold ischemia. We consider a hierarchical model to determine if phosphorylation abundance of a protein at a particular phosphorylation site remains constant or not during cold ischemia...
July 25, 2017: Biometrics