Lifetime Data Analysis

https://www.readbyqxmd.com/read/28608228/joint-analysis-of-interval-censored-failure-time-data-and-panel-count-data
#1
Da Xu, Hui Zhao, Jianguo Sun
Interval-censored failure time data and panel count data are two types of incomplete data that commonly occur in event history studies, and many methods have been developed for their separate analysis (Sun in The statistical analysis of interval-censored failure time data. Springer, New York, 2006; Sun and Zhao in The statistical analysis of panel count data. Springer, New York, 2013). Sometimes one may be interested in, or need to conduct, their joint analysis, as in clinical trials with composite endpoints, for which no established approach seems to exist in the literature...
June 12, 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/28550654/censored-cumulative-residual-independent-screening-for-ultrahigh-dimensional-survival-data
#2
Jing Zhang, Guosheng Yin, Yanyan Liu, Yuanshan Wu
For complete ultrahigh-dimensional data, sure independent screening methods can effectively reduce the dimensionality while retaining all the active variables with high probability. However, only a limited number of screening methods have been developed for ultrahigh-dimensional survival data subject to censoring. We propose a censored cumulative residual independent screening method that is model-free and enjoys the sure independent screening property. Active variables tend to be ranked above the inactive ones in terms of their association with the survival times...
May 26, 2017: Lifetime Data Analysis
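As a purely illustrative aside on entry #2: the generic sure-independence-screening idea is to rank covariates by a marginal association statistic and keep only the top-ranked ones. The hedged Python sketch below uses plain Pearson correlation on complete (uncensored) data; the function name, the simulated data, and the choice of correlation are illustrative assumptions, and the paper's censored cumulative residual statistic would replace the correlation for censored survival times.

import numpy as np

def marginal_screen(X, y, keep):
    """Generic sure-independence-style screening: rank covariates by the
    absolute marginal Pearson correlation with the response and keep the
    top `keep` columns.  (Illustration only; not the paper's censored
    cumulative residual statistic.)"""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = (Xc * yc[:, None]).sum(axis=0) / np.sqrt(
        (Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    return np.argsort(-np.abs(corr))[:keep]

# Illustrative use: p >> n with two truly active covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2000))
y = 2 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=100)
print(marginal_screen(X, y, keep=int(100 / np.log(100))))  # should usually contain 0 and 1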
https://www.readbyqxmd.com/read/28536818/bayesian-bivariate-survival-analysis-using-the-power-variance-function-copula
#3
Jose S Romeo, Renate Meyer, Diego I Gallardo
Copula models have become increasingly popular for modelling the dependence structure in multivariate survival data. The two-parameter Archimedean family of Power Variance Function (PVF) copulas includes the Clayton, Positive Stable (Gumbel) and Inverse Gaussian copulas as special or limiting cases, and thus offers a unified approach to fitting these important copulas. Two-stage frequentist procedures for estimating the marginal distributions and the PVF copula have been suggested by Andersen (Lifetime Data Anal 11:333-350, 2005), Massonnet et al...
May 23, 2017: Lifetime Data Analysis
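For readers unfamiliar with the copulas named in entry #3: the Clayton copula is one of the PVF special cases. The hedged Python sketch below simulates bivariate survival times with Clayton dependence via the conditional-distribution method; the unit-exponential margins, sample size, and function names are illustrative assumptions, and this is not the two-stage or Bayesian estimation discussed in the paper.

import numpy as np

def sample_clayton(n, theta, rng=None):
    """Draw n pairs (u, v) from a Clayton copula via the conditional method."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    # Invert the conditional copula C(v | u) = w for the Clayton family.
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

# Illustrative use: bivariate survival times with unit-exponential margins.
u, v = sample_clayton(n=1000, theta=2.0, rng=42)
t1, t2 = -np.log(u), -np.log(v)   # inverse-CDF transform of the margins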
https://www.readbyqxmd.com/read/27170333/mark-specific-additive-hazards-regression-with-continuous-marks
#4
Dongxiao Han, Liuquan Sun, Yanqing Sun, Li Qi
For survival data, mark variables are only observed at uncensored failure times, and it is of interest to investigate whether there is any relationship between the failure time and the mark variable. The additive hazards model, focusing on hazard differences rather than hazard ratios, has been widely used in practice. In this article, we propose a mark-specific additive hazards model in which both the regression coefficient functions and the baseline hazard function depend nonparametrically on a continuous mark...
July 2017: Lifetime Data Analysis
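For orientation on entry #4: the additive hazards model specifies the conditional hazard as a sum, rather than a product, of a baseline term and a covariate term. A plausible mark-specific extension, written here only as an assumption about how a continuous mark v might enter (the authors' exact specification may differ), is shown alongside it:

\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z(t), \qquad
\lambda(t \mid z, v) = \lambda_0(t, v) + \beta(t, v)^{\top} z .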
https://www.readbyqxmd.com/read/26995733/analysis-of-two-phase-sampling-data-with-semiparametric-additive-hazards-models
#5
Yanqing Sun, Xiyuan Qian, Qiong Shou, Peter B Gilbert
Under the case-cohort design introduced by Prentice (Biometrika 73:1-11, 1986), the covariate histories are ascertained only for the subjects who experience the event of interest (i.e., the cases) during the follow-up period and for a relatively small random sample from the original cohort (i.e., the subcohort). The case-cohort design has been widely used in clinical and epidemiological studies to assess the effects of covariates on failure times. Most statistical methods developed for the case-cohort design use the proportional hazards model, and few methods allow for time-varying regression coefficients...
July 2017: Lifetime Data Analysis
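To make the design in entry #5 concrete, the hedged Python sketch below only constructs a case-cohort sample (all cases plus a random subcohort); it does not implement the paper's semiparametric additive hazards estimators, and the function and variable names are illustrative assumptions.

import numpy as np

def case_cohort_indices(event, subcohort_frac, rng=None):
    """Return the indices retained under a case-cohort design: a random
    subcohort drawn from the full cohort plus all subjects who experienced
    the event of interest."""
    rng = np.random.default_rng(rng)
    event = np.asarray(event)
    in_subcohort = rng.uniform(size=event.shape[0]) < subcohort_frac
    keep = in_subcohort | (event == 1)
    return np.flatnonzero(keep), in_subcohort

# Illustrative use with simulated event indicators (about 20% events, 10% subcohort).
event = (np.random.default_rng(1).uniform(size=500) < 0.2).astype(int)
idx, in_sub = case_cohort_indices(event, subcohort_frac=0.10, rng=2)
print(len(idx), in_sub.sum())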
https://www.readbyqxmd.com/read/26993982/generalized-accelerated-failure-time-spatial-frailty-model-for-arbitrarily-censored-data
#6
Haiming Zhou, Timothy Hanson, Jiajia Zhang
Flexible incorporation of both geographical patterning and risk effects in cancer survival models is becoming increasingly important, due in part to the recent availability of large cancer registries. Most spatial survival models stochastically order survival curves from different subpopulations. However, it is common for survival curves from two subpopulations to cross in epidemiological cancer studies, and thus interpretable standard survival models cannot be used without some modification. Common fixes are the inclusion of time-varying regression effects in the proportional hazards model or fully nonparametric modeling, either of which destroys any easy interpretability from the fitted model...
July 2017: Lifetime Data Analysis
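As a hedged, textbook-style reference point for entry #6 (not necessarily the authors' generalized formulation): an accelerated failure time model with a region-level spatial frailty for subject i in region j can be written as

\log T_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j + \varepsilon_{ij},

where the frailties u_j would typically receive a spatially structured prior (for example, a conditional autoregressive prior) and the error distribution may be modelled flexibly.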
https://www.readbyqxmd.com/read/26880366/landmark-estimation-of-survival-and-treatment-effects-in-observational-studies
#7
Layla Parast, Beth Ann Griffin
Clinical studies aimed at identifying effective treatments to reduce the risk of disease or death often require long term follow-up of participants in order to observe a sufficient number of events to precisely estimate the treatment effect. In such studies, observing the outcome of interest during follow-up may be difficult, and high rates of censoring may be observed, which often leads to reduced power when applying straightforward statistical methods developed for time-to-event data. Alternative methods have been proposed to take advantage of auxiliary information that may potentially improve efficiency when estimating marginal survival and improve power when testing for a treatment effect...
April 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/26423302/nonparametric-inference-for-the-joint-distribution-of-recurrent-marked-variables-and-recurrent-survival-time
#8
Laura M Yee, Kwun Chuen Gary Chan
Time between recurrent medical events may be correlated with the cost incurred at each event. As a result, it may be of interest to describe the relationship between recurrent events and recurrent medical costs by estimating a joint distribution. In this paper, we propose a nonparametric estimator for the joint distribution of recurrent events and recurrent medical costs in right-censored data. We also derive the asymptotic variance of our estimator, a test for equality of recurrent marker distributions, and present simulation studies to demonstrate the performance of our point and variance estimators...
April 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/28349290/exponentiated-weibull-regression-for-time-to-event-data
#9
Shahedul A Khan
The Weibull, log-logistic and log-normal distributions are extensively used to model time-to-event data. The Weibull family accommodates only monotone hazard rates, whereas the log-logistic and log-normal are widely used to model unimodal hazard functions. The increasing availability of lifetime data with a wide range of characteristics motivates us to develop more flexible models that accommodate both monotone and nonmonotone hazard functions. One such model is the exponentiated Weibull distribution, which not only accommodates monotone hazard functions but also allows for unimodal and bathtub-shaped hazard rates...
March 27, 2017: Lifetime Data Analysis
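To illustrate the flexibility claimed in entry #9, the hedged Python sketch below evaluates the exponentiated Weibull hazard numerically from its closed-form CDF; the parameter values and function names are illustrative assumptions, not the paper's regression methodology.

import numpy as np

def ew_hazard(t, alpha, kappa, sigma):
    """Hazard of the exponentiated Weibull distribution, whose CDF is
    F(t) = [1 - exp(-(t/sigma)**kappa)]**alpha; computed as density/survival."""
    t = np.asarray(t, dtype=float)
    g = 1.0 - np.exp(-(t / sigma) ** kappa)                    # Weibull CDF
    dens = (alpha * kappa / sigma) * (t / sigma) ** (kappa - 1.0) \
        * np.exp(-(t / sigma) ** kappa) * g ** (alpha - 1.0)   # EW density
    surv = 1.0 - g ** alpha                                    # EW survival
    return dens / surv

# Shapes depend on (alpha, kappa): e.g. alpha=0.2, kappa=3 gives a
# bathtub-like hazard, while alpha=1 recovers the ordinary Weibull.
tgrid = np.linspace(0.05, 3.0, 5)
print(ew_hazard(tgrid, alpha=0.2, kappa=3.0, sigma=1.0))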
https://www.readbyqxmd.com/read/28238045/estimation-of-the-cumulative-incidence-function-under-multiple-dependent-and-independent-censoring-mechanisms
#10
Judith J Lok, Shu Yang, Brian Sharkey, Michael D Hughes
Competing risks occur in a time-to-event analysis in which a patient can experience one of several types of events. Traditional methods for handling competing risks data presuppose one censoring process, which is assumed to be independent. In a controlled clinical trial, censoring can occur for several reasons: some independent, others dependent. We propose an estimator of the cumulative incidence function in the presence of both independent and dependent censoring mechanisms. We rely on semi-parametric theory to derive an augmented inverse probability of censoring weighted (AIPCW) estimator...
February 25, 2017: Lifetime Data Analysis
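Background for entry #10: with a single independent censoring process having censoring survival function G(t) = P(C > t), a standard inverse-probability-of-censoring-weighted estimator of the cause-k cumulative incidence is

\widehat{F}_k(t) = \frac{1}{n}\sum_{i=1}^{n} \frac{I(X_i \le t,\ \Delta_i = k)}{\widehat{G}(X_i-)}, \qquad X_i = \min(T_i, C_i).

The AIPCW estimator proposed in the paper augments this kind of weighted estimator and additionally models the dependent censoring mechanisms; the display above is only the familiar building block, not the authors' final estimator.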
https://www.readbyqxmd.com/read/28224260/modeling-restricted-mean-survival-time-under-general-censoring-mechanisms
#11
Xin Wang, Douglas E Schaubel
Restricted mean survival time (RMST) is often of great clinical interest in practice. Several existing methods involve explicitly projecting out patient-specific survival curves using parameters estimated through Cox regression. However, it would often be preferable to directly model the restricted mean for convenience and to yield more directly interpretable covariate effects. We propose generalized estimating equation methods to model RMST as a function of baseline covariates. The proposed methods avoid potentially problematic distributional assumptions pertaining to restricted survival time...
February 21, 2017: Lifetime Data Analysis
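Entry #11 concerns the restricted mean survival time, RMST(tau) = the integral of S(t) from 0 to tau. The hedged Python sketch below simply evaluates that integral numerically from a (time, survival) grid and checks it against the exponential closed form; it is not the generalized estimating equation methodology proposed in the paper, and the function names are illustrative assumptions.

import numpy as np

def rmst(times, surv, tau):
    """Restricted mean survival time: the area under the survival curve S(t)
    from 0 to tau, approximated by the trapezoidal rule on a (t, S(t)) grid."""
    t = np.concatenate(([0.0], np.asarray(times, dtype=float)))
    s = np.concatenate(([1.0], np.asarray(surv, dtype=float)))
    keep = t <= tau
    t, s = np.append(t[keep], tau), np.append(s[keep], s[keep][-1])
    return float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(t)))

# Sanity check against the exponential closed form (1 - exp(-lam*tau)) / lam.
lam, tau = 0.5, 3.0
grid = np.linspace(0.01, tau, 300)
print(rmst(grid, np.exp(-lam * grid), tau), (1.0 - np.exp(-lam * tau)) / lam)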
https://www.readbyqxmd.com/read/28215038/variable-selection-and-prediction-in-biased-samples-with-censored-outcomes
#12
Ying Wu, Richard J Cook
With the increasing availability of large prospective disease registries, scientists studying the course of chronic conditions often have access to multiple data sources, with each source generated based on its own entry conditions. The different entry conditions of the various registries may be explicitly based on the response process of interest, in which case the statistical analysis must recognize the unique truncation schemes. Moreover, intermittent assessment of individuals in the registries can lead to interval-censored times of interest...
February 18, 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/28168333/conditional-maximum-likelihood-estimation-in-semiparametric-transformation-model-with-ltrc-data
#13
Chyong-Mei Chen, Pao-Sheng Shen
Left-truncated data often arise in epidemiology and individual follow-up studies due to a biased sampling plan, since subjects with shorter survival times tend to be excluded from the sample. Moreover, the survival times of recruited subjects are often subject to right censoring. In this article, a general class of semiparametric transformation models that includes the proportional hazards model and the proportional odds model as special cases is studied for the analysis of left-truncated and right-censored data. We propose a conditional likelihood approach and develop the conditional maximum likelihood estimators (cMLE) for the regression parameters and cumulative hazard function of these models...
February 6, 2017: Lifetime Data Analysis
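The class of linear transformation models referred to in entry #13 is commonly written as

H(T) = -\boldsymbol{\beta}^{\top}\mathbf{Z} + \varepsilon,

with H an unspecified, strictly increasing function; taking \varepsilon to follow the extreme value distribution yields the proportional hazards model, and the standard logistic distribution yields the proportional odds model.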
https://www.readbyqxmd.com/read/28132157/evaluation-of-the-treatment-time-lag-effect-for-survival-data
#14
Kayoung Park, Peihua Qiu
Medical treatments often take a period of time to reveal their impact on subjects, which is the so-called time-lag effect in the literature. In the survival data analysis literature, most existing methods compare two treatments over the entire study period. In cases when there is a substantial time-lag effect, these methods would not be effective in detecting the difference between the two treatments, because the similarity between the treatments during the time-lag period would diminish the effectiveness of these methods. In this paper, we develop a novel modeling approach for estimating the time-lag period and for comparing the two treatments properly after the time-lag effect is accommodated...
January 28, 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/28091785/editorial
#15
EDITORIAL
Mei-Cheng Wang, Chiung-Yu Huang
No abstract text is available yet for this article.
January 16, 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/28058569/regression-analysis-of-current-status-data-with-auxiliary-covariates-and-informative-observation-times
#16
Yanqin Feng, Yurong Chen
This paper discusses regression analysis of current status failure time data with informative observation times and continuous auxiliary covariates. Under the additive hazards model, we employ a frailty model to describe the relationship between the failure time of interest and the censoring time through some latent variables, and propose an estimated partial likelihood estimator of the regression parameters that makes use of the available auxiliary information. Asymptotic properties of the resulting estimators are established...
January 5, 2017: Lifetime Data Analysis
https://www.readbyqxmd.com/read/27388910/acceleration-of-expectation-maximization-algorithm-for-length-biased-right-censored-data
#17
Kwun Chuen Gary Chan
Vardi's Expectation-Maximization (EM) algorithm is frequently used for computing the nonparametric maximum likelihood estimator from length-biased right-censored data, which does not admit a closed-form representation. The EM algorithm may converge slowly, particularly for heavily censored data. We studied two algorithms for accelerating the convergence of the EM algorithm, based on the iterative convex minorant and Aitken's delta-squared process. Numerical simulations demonstrate that the acceleration algorithms converge more rapidly than the EM algorithm in terms of the number of iterations and actual computing time...
January 2017: Lifetime Data Analysis
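Entry #17 mentions Aitken's delta-squared process. The hedged Python sketch below applies the delta-squared extrapolation to a generic scalar fixed-point iteration (essentially Steffensen's scheme); it is only meant to show the acceleration idea, not the authors' EM implementation for length-biased right-censored data, and the function names and example map are illustrative assumptions.

import numpy as np

def aitken_accelerate(g, x0, n_iter=50, tol=1e-10):
    """Aitken's delta-squared acceleration of a scalar fixed-point iteration x = g(x)."""
    x = x0
    for _ in range(n_iter):
        x1, x2 = g(x), g(g(x))
        denom = x2 - 2.0 * x1 + x
        if abs(denom) < tol:          # iteration has effectively converged
            return x2
        x_acc = x - (x1 - x) ** 2 / denom   # Aitken extrapolation step
        if abs(x_acc - x) < tol:
            return x_acc
        x = x_acc
    return x

# Example: fixed point of g(x) = cos(x); plain iteration needs far more steps.
print(aitken_accelerate(np.cos, x0=1.0))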
https://www.readbyqxmd.com/read/27086362/nonparametric-and-semiparametric-regression-estimation-for-length-biased-survival-data
#18
Yu Shen, Jing Ning, Jing Qin
For the past several decades, nonparametric and semiparametric modeling for conventional right-censored survival data has been investigated intensively under a noninformative censoring mechanism. However, these methods may not be applicable for analyzing right-censored survival data that arise from prevalent cohorts when the failure times are subject to length-biased sampling. This review article is intended to provide a summary of some newly developed methods as well as established methods for analyzing length-biased data...
January 2017: Lifetime Data Analysis
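To make "length-biased sampling" in entry #18 concrete: under length bias the observed durations have density proportional to t f(t). The hedged Python sketch below simulates such data by weighted resampling from draws of an underlying distribution (here Weibull, as an illustrative assumption); it is a simulation device only, not any of the estimators covered in the review.

import numpy as np

def simulate_length_biased(sampler, n, rng=None, batch=10000):
    """Draw n observations from the length-biased version of a positive
    distribution, i.e. density g(t) proportional to t * f(t), by weighted
    resampling from a large batch of draws from f."""
    rng = np.random.default_rng(rng)
    pool = sampler(batch, rng)            # draws from the underlying density f
    probs = pool / pool.sum()             # resampling weights proportional to t
    return rng.choice(pool, size=n, replace=True, p=probs)

# Illustrative use with a Weibull underlying survival distribution.
draw_weibull = lambda m, rng: rng.weibull(2.0, size=m)
lb = simulate_length_biased(draw_weibull, n=1000, rng=7)
print(lb.mean(), np.random.default_rng(7).weibull(2.0, 10000).mean())
# The length-biased sample has a larger mean than the underlying distribution.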
https://www.readbyqxmd.com/read/27007859/joint-modeling-of-longitudinal-and-survival-data-with-the-cox-model-and-two-phase-sampling
#19
Rong Fu, Peter B Gilbert
A common objective of cohort studies and clinical trials is to assess time-varying longitudinal continuous biomarkers as correlates of the instantaneous hazard of a study endpoint. We consider the setting where the biomarkers are measured in a designed sub-sample (i.e., case-cohort or two-phase sampling design), as is normative for prevention trials. We address this problem via joint models, with underlying biomarker trajectories characterized by a random effects model and their relationship with instantaneous risk characterized by a Cox model...
January 2017: Lifetime Data Analysis
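For reference on entry #19, a generic shared-random-effects joint model links a longitudinal biomarker and a Cox-type hazard through the subject-specific trajectory m_i(t); the case-cohort measurement scheme and the exact specification are the paper's contribution, so the display below is only the familiar skeleton:

Y_i(t) = m_i(t) + \epsilon_i(t), \qquad m_i(t) = \mathbf{x}_i(t)^{\top}\boldsymbol{\beta} + \mathbf{z}_i(t)^{\top}\mathbf{b}_i, \quad \mathbf{b}_i \sim N(\mathbf{0}, D),

\lambda_i(t) = \lambda_0(t)\exp\{\boldsymbol{\gamma}^{\top}\mathbf{w}_i + \alpha\, m_i(t)\}.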
https://www.readbyqxmd.com/read/26759313/recent-progresses-in-outcome-dependent-sampling-with-failure-time-data
#20
Jieli Ding, Tsui-Shan Lu, Jianwen Cai, Haibo Zhou
An outcome-dependent sampling (ODS) design is a retrospective sampling scheme in which one observes the primary exposure variables with a probability that depends on the observed value of the outcome variable. When the outcome of interest is failure time, the observed data are often censored. By allowing the selection of the supplemental samples to depend on whether the event of interest happens or not, and by oversampling subjects from the most informative regions, an ODS design for time-to-event data can reduce the cost of the study and improve efficiency...
January 2017: Lifetime Data Analysis
