Lifetime Data Analysis
https://read.qxmd.com/read/38642215/regression-analysis-of-doubly-censored-failure-time-data-with-ancillary-information
#1
JOURNAL ARTICLE
Mingyue Du, Xiyuan Gao, Ling Chen
Doubly censored failure time data occur in many areas; in this situation, the failure time of interest usually represents the elapsed time between two related events, such as an infection and the resulting disease onset. Although many methods have been proposed for regression analysis of such data, most of them condition on the occurrence time of the initial event and ignore the relationship between the two events or the ancillary information contained in the initial event. To address this, a new sieve maximum likelihood approach is proposed that makes use of the ancillary information; in this method, the logistic model and the Cox proportional hazards model are employed for the initial event and the failure time of interest, respectively...
April 20, 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38625444/partial-linear-single-index-transformation-models-with-censored-data
#2
JOURNAL ARTICLE
Myeonggyun Lee, Andrea B Troxel, Mengling Liu
In studies with time-to-event outcomes, multiple, inter-correlated, and time-varying covariates are commonly observed. It is of great interest to model their joint effects by allowing a flexible functional form and to delineate their relative contributions to survival risk. A class of semiparametric transformation (ST) models offers flexible specifications of the intensity function and can be a general framework to accommodate nonlinear covariate effects. In this paper, we propose a partial-linear single-index (PLSI) transformation model that reduces the dimensionality of multiple covariates into a single index and provides interpretable estimates of the covariate effects...
April 16, 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38565754/cox-model-inference-for-relative-hazard-and-pure-risk-from-stratified-weight-calibrated-case-cohort-data
#3
JOURNAL ARTICLE
Lola Etievant, Mitchell H Gail
The case-cohort design obtains complete covariate data only on cases and on a random sample (the subcohort) of the entire cohort. Subsequent publications described the use of stratification and weight calibration to increase efficiency of estimates of Cox model log-relative hazards, and there has been some work estimating pure risk. Yet there are few examples of these options in the medical literature, and we could not find programs currently online to analyze these various options. We therefore present a unified approach and R software to facilitate such analyses...
April 2, 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38512595/on-the-role-of-volterra-integral-equations-in-self-consistent-product-limit-inverse-probability-of-censoring-weighted-and-redistribution-to-the-right-estimators-for-the-survival-function
#4
JOURNAL ARTICLE
Robert L Strawderman, Benjamin R Baer
This paper reconsiders several results of historical and current importance to nonparametric estimation of the survival distribution for failure in the presence of right-censored observation times, demonstrating in particular how Volterra integral equations help inter-connect the resulting estimators. The paper begins by considering Efron's self-consistency equation, introduced in a seminal 1967 Berkeley symposium paper. Novel insights provided in the current work include the observations that (i) the self-consistency equation leads directly to an anticipating Volterra integral equation whose solution is given by a product-limit estimator for the censoring survival function; (ii) a definition used in this argument immediately establishes the familiar product-limit estimator for the failure survival function; (iii) the usual Volterra integral equation for the product-limit estimator of the failure survival function leads to an immediate and simple proof that it can be represented as an inverse probability of censoring weighted estimator; (iv) a simple identity characterizes the relationship between natural inverse probability of censoring weighted estimators for the survival and distribution functions of failure; (v) the resulting inverse probability of censoring weighted estimators, attributed to a highly influential 1992 paper of Robins and Rotnitzky, were implicitly introduced in Efron's 1967 paper in its development of the redistribution-to-the-right algorithm...
March 21, 2024: Lifetime Data Analysis
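A quick numerical check of points (iii) and (iv) in entry #4, as a minimal NumPy sketch on assumed simulated data (not code from the paper): with continuous, untied data, the Kaplan-Meier estimate of the failure distribution coincides exactly with the inverse-probability-of-censoring-weighted average that weights each observed failure by the Kaplan-Meier censoring survival function evaluated just before the failure time.

```python
# Minimal sketch (illustrative data, not the paper's code): Kaplan-Meier estimate
# of the failure CDF versus its IPCW representation.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t_fail = rng.exponential(1.0, n)          # latent failure times
t_cens = rng.exponential(1.5, n)          # latent censoring times
x = np.minimum(t_fail, t_cens)            # observed times
delta = (t_fail <= t_cens).astype(float)  # 1 = failure observed

def km_survival(times, events, t_grid):
    """Kaplan-Meier survival estimate S(t) evaluated on t_grid."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    m = len(times)
    at_risk = m - np.arange(m)                 # risk-set size just before each time
    surv_at_obs = np.cumprod(1.0 - events / at_risk)
    idx = np.searchsorted(times, t_grid, side="right") - 1
    return np.where(idx >= 0, surv_at_obs[np.maximum(idx, 0)], 1.0)

t_grid = np.linspace(0.1, 2.0, 5)
F_km = 1.0 - km_survival(x, delta, t_grid)     # KM estimate of the failure CDF

# Censoring survival G(t-): Kaplan-Meier with the event indicator reversed,
# evaluated just before each observed time.
eps = 1e-9
G_minus = km_survival(x, 1.0 - delta, x - eps)
F_ipcw = np.array([np.mean(delta * (x <= t) / G_minus) for t in t_grid])

print(np.round(F_km, 6))
print(np.round(F_ipcw, 6))   # agrees with F_km; the identity is exact for untied data
```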
https://read.qxmd.com/read/38478314/model-averaging-for-right-censored-data-with-measurement-error
#5
JOURNAL ARTICLE
Zhongqi Liang, Caiya Zhang, Linjun Xu
This paper studies a novel model averaging estimation problem for linear regression models in which the responses are right censored and the covariates are measured with error. A weighted Mallows-type criterion is proposed for this problem by introducing multiple candidate models. The weight vector for model averaging is selected by minimizing the proposed criterion. Under some regularity conditions, the asymptotic optimality of the selected weight vector is established in the sense that it achieves the lowest squared loss asymptotically...
March 13, 2024: Lifetime Data Analysis
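For context on entry #5, the sketch below shows the generic Mallows model-averaging recipe that a weighted Mallows-type criterion builds on, for plain (uncensored, error-free) linear regression: fit nested candidate models, then choose simplex weights minimizing squared residuals plus a penalty on the weighted model dimensions. The toy data, the nested candidate set, and the variance estimate are illustrative assumptions, not the paper's censored, measurement-error criterion.

```python
# Generic Mallows model averaging for uncensored linear regression; background
# only, not the criterion proposed in the entry above.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.8, 0.5, 0.0, 0.0, 0.0])
y = X @ beta + rng.normal(scale=1.0, size=n)

# Nested candidate models: first k columns, k = 1..p.
fits, ks = [], []
for k in range(1, p + 1):
    Xk = X[:, :k]
    bk = np.linalg.lstsq(Xk, y, rcond=None)[0]
    fits.append(Xk @ bk)
    ks.append(k)
F = np.column_stack(fits)                     # n x M matrix of candidate fitted values
ks = np.array(ks, dtype=float)

# Error-variance estimate from the largest candidate model.
sigma2 = np.sum((y - F[:, -1]) ** 2) / (n - p)

def mallows(w):
    resid = y - F @ w
    return resid @ resid + 2.0 * sigma2 * (ks @ w)

M = F.shape[1]
w0 = np.full(M, 1.0 / M)
res = minimize(mallows, w0, method="SLSQP",
               bounds=[(0.0, 1.0)] * M,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
w_hat = res.x
y_avg = F @ w_hat                              # model-averaged fitted values
print(np.round(w_hat, 3))
```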
https://read.qxmd.com/read/38436831/on-variable-selection-in-a-semiparametric-aft-mixture-cure-model
#6
JOURNAL ARTICLE
Motahareh Parsa, Seyed Mahmood Taghavi-Shahri, Ingrid Van Keilegom
In clinical studies, one often encounters time-to-event data that are subject to right censoring and for which a fraction of the patients under study never experience the event of interest. Such data can be modeled using cure models in survival analysis. In the presence of a cure fraction, the mixture cure model is popular, since it allows one to model the probability of being cured (called the incidence) and the survival function of the uncured individuals (called the latency). In this paper, we develop a variable selection procedure for the incidence and latency parts of a mixture cure model, consisting of a logistic model for the incidence and a semiparametric accelerated failure time model for the latency...
March 4, 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38427151/a-bayesian-quantile-joint-modeling-of-multivariate-longitudinal-and-time-to-event-data
#7
JOURNAL ARTICLE
Damitri Kundu, Shekhar Krishnan, Manash Pratim Gogoi, Kiranmoy Das
Linear mixed models are traditionally used for jointly modeling (multivariate) longitudinal outcomes and event time(s). However, when the outcomes are non-Gaussian, a quantile regression model is more appropriate. In addition, in the presence of time-varying covariates, it might be of interest to see how the effects of different covariates vary from one quantile level (of the outcomes) to another, and consequently how the event time changes across quantiles. For such analyses linear quantile mixed models can be used, and an efficient computational algorithm can be developed...
March 1, 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38403840/pseudo-value-regression-trees
#8
JOURNAL ARTICLE
Alina Schenk, Moritz Berger, Matthias Schmid
This paper presents a semi-parametric modeling technique for estimating the survival function from a set of right-censored time-to-event data. Our method, named pseudo-value regression trees (PRT), is based on the pseudo-value regression framework, modeling individual-specific survival probabilities by computing pseudo-values and relating them to a set of covariates. The standard approach to pseudo-value regression is to fit a main-effects model using generalized estimating equations (GEE). PRT extend this approach by building a multivariate regression tree with pseudo-value outcome and by successively fitting a set of regularized additive models to the data in the nodes of the tree...
February 25, 2024: Lifetime Data Analysis
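The pseudo-value step that entry #8 builds on is standard: at a time point t0, individual i's pseudo-value is n * S_hat(t0) - (n - 1) * S_hat_{-i}(t0), with S_hat the Kaplan-Meier estimator and S_hat_{-i} its leave-one-out version; these pseudo-values then serve as the regression (or, in PRT, the tree) outcome. A minimal NumPy sketch of that step on toy data (not the authors' implementation):

```python
# Jackknife pseudo-values of the Kaplan-Meier survival probability at time t0,
# the quantity pseudo-value regression (and PRT) uses as its outcome. Toy data.
import numpy as np

rng = np.random.default_rng(2)
n = 300
t_fail = rng.exponential(1.0, n)
t_cens = rng.exponential(2.0, n)
time = np.minimum(t_fail, t_cens)
event = (t_fail <= t_cens).astype(float)

def km_at(times, events, t0):
    """Kaplan-Meier survival probability at t0."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    m = len(times)
    surv = np.cumprod(1.0 - events / (m - np.arange(m)))
    k = np.searchsorted(times, t0, side="right") - 1
    return 1.0 if k < 0 else surv[k]

t0 = 1.0
s_full = km_at(time, event, t0)

# Pseudo-value for subject i: n * S_hat(t0) - (n - 1) * S_hat_{-i}(t0).
pseudo = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    pseudo[i] = n * s_full - (n - 1) * km_at(time[mask], event[mask], t0)

# 'pseudo' can now be related to covariates, e.g., by a GEE main-effects model
# or, as in PRT, used as the outcome of a regression tree.
print(pseudo[:5].round(3))
```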
https://read.qxmd.com/read/38358572/the-built-in-selection-bias-of-hazard-ratios-formalized-using-structural-causal-models
#9
JOURNAL ARTICLE
Richard A J Post, Edwin R van den Heuvel, Hein Putter
It is known that the hazard ratio lacks a useful causal interpretation. Even for data from a randomized controlled trial, the hazard ratio suffers from so-called built-in selection bias as, over time, the individuals at risk among the exposed and unexposed are no longer exchangeable. In this paper, we formalize how the expectation of the observed hazard ratio evolves and deviates from the causal effect of interest in the presence of heterogeneity of the hazard rate of unexposed individuals (frailty) and heterogeneity in effect (individual modification)...
February 15, 2024: Lifetime Data Analysis
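A toy simulation (gamma frailty, constant individual hazards, and a randomized binary exposure; assumptions for illustration, not the paper's structural-causal-model formalization) makes the built-in selection bias in entry #9 visible: the conditional hazard ratio is held at 2, yet the interval-wise marginal hazard ratio shrinks toward 1 because frail individuals are depleted faster in the exposed arm, so the two risk sets stop being exchangeable.

```python
# Toy simulation of built-in selection bias: randomized exposure A, gamma frailty
# Z, constant frailty-specific hazard ratio of 2, no confounding and no censoring.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
A = rng.integers(0, 2, n)                      # randomized exposure
Z = rng.gamma(shape=1.0, scale=1.0, size=n)    # frailty, mean 1, variance 1
lam0, hr_true = 0.5, 2.0
rate = Z * lam0 * np.where(A == 1, hr_true, 1.0)
T = rng.exponential(1.0 / rate)                # event times

edges = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
for lo, hi in zip(edges[:-1], edges[1:]):
    hazards = []
    for a in (0, 1):
        t = T[A == a]
        at_risk = t > lo
        events = ((t > lo) & (t <= hi)).sum()
        # person-time contributed to (lo, hi] by those still at risk at lo
        ptime = np.clip(t[at_risk], lo, hi).sum() - lo * at_risk.sum()
        hazards.append(events / ptime)
    print(f"({lo:.1f}, {hi:.1f}]  empirical HR = {hazards[1] / hazards[0]:.2f}")
# The printed ratios decrease across intervals, moving away from the
# conditional value of 2 and toward 1.
```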
https://read.qxmd.com/read/38466520/bias-of-the-additive-hazard-model-in-the-presence-of-causal-effect-heterogeneity
#10
JOURNAL ARTICLE
Richard A J Post, Edwin R van den Heuvel, Hein Putter
Hazard ratios are prone to selection bias, compromising their use as causal estimands. On the other hand, if Aalen's additive hazard model applies, the hazard difference has been shown to remain unaffected by the selection of frailty factors over time. Then, in the absence of confounding, observed hazard differences are equal in expectation to the causal hazard differences. However, in the presence of effect (on the hazard) heterogeneity, the observed hazard difference is also affected by selection of survivors...
April 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38238637/quantile-difference-estimation-with-censoring-indicators-missing-at-random
#11
JOURNAL ARTICLE
Cui-Juan Kong, Han-Ying Liang
In this paper, we define estimators of distribution functions when the data are right censored and the censoring indicators are missing at random, and establish their strong representations and asymptotic normality. In addition, based on the empirical likelihood method, we define maximum empirical likelihood estimators and smoothed log-empirical likelihood ratios of the two-sample quantile difference in the presence and absence of auxiliary information, respectively, and derive their asymptotic distributions. A simulation study and a real data analysis are conducted to investigate the finite-sample behavior of the proposed methods...
January 18, 2024: Lifetime Data Analysis
https://read.qxmd.com/read/38150170/preface
#12
EDITORIAL
Jialiang Li, Stijn Vansteelandt
No abstract text is available yet for this article.
December 27, 2023: Lifetime Data Analysis
https://read.qxmd.com/read/38015378/a-bayesian-proportional-hazards-mixture-cure-model-for-interval-censored-data
#13
JOURNAL ARTICLE
Chun Pan, Bo Cai, Xuemei Sui
The proportional hazards mixture cure model is a popular analysis method for survival data where a subgroup of patients are cured. When the data are interval-censored, the estimation of this model is challenging due to its complex data structure. In this article, we propose a computationally efficient semiparametric Bayesian approach, facilitated by spline approximation and Poisson data augmentation, for model estimation and inference with interval-censored data and a cure rate. The spline approximation and Poisson data augmentation greatly simplify the MCMC algorithm and enhance the convergence of the MCMC chains...
November 28, 2023: Lifetime Data Analysis
https://read.qxmd.com/read/38007694/efficiency-of-the-breslow-estimator-in-semiparametric-transformation-models
#14
JOURNAL ARTICLE
Theresa P Devasia, Alexander Tsodikov
Semiparametric transformation models for failure time data consist of a parametric regression component and an unspecified cumulative baseline hazard. The nonparametric maximum likelihood estimator (NPMLE) of the cumulative baseline hazard can be summarized in terms of weights introduced into a Breslow-type estimator (Weighted Breslow). At any given time point, the weights invoke an integral over the future of the cumulative baseline hazard, which presents theoretical and computational challenges. A simpler non-MLE Breslow-type estimator (Breslow) was derived earlier from a martingale estimating equation (MEE) setting observed and expected counts of failures equal, conditional on the past history...
November 26, 2023: Lifetime Data Analysis
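For reference on entry #14, the simple (non-MLE) Breslow estimator has the closed form Lambda_0_hat(t) = sum over failure times t_i <= t of 1 / sum_{j in R(t_i)} exp(x_j' beta_hat). A minimal NumPy sketch with beta_hat treated as given and toy data (the weighted NPMLE version discussed in the entry modifies the denominator and is not shown):

```python
# Plain (non-MLE) Breslow estimator of the cumulative baseline hazard in a Cox
# model, with the regression coefficient treated as known. Toy data only.
import numpy as np

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
beta_hat = 0.7                                  # pretend this came from a Cox fit
t_fail = rng.exponential(1.0 / np.exp(beta_hat * x))   # baseline hazard = 1
t_cens = rng.exponential(2.0, n)
time = np.minimum(t_fail, t_cens)
event = (t_fail <= t_cens)

order = np.argsort(time)
time, event, x = time[order], event[order], x[order]
risk_scores = np.exp(beta_hat * x)

# Risk-set sums sum_{j in R(t_i)} exp(x_j' beta): cumulative sum from the end.
risk_sums = np.cumsum(risk_scores[::-1])[::-1]

# Breslow increments 1 / risk_sum at each observed failure time.
event_times = time[event]
cum_baseline = np.cumsum(1.0 / risk_sums[event])

def Lambda0(t):
    """Step-function value of the Breslow estimator at time t."""
    k = np.searchsorted(event_times, t, side="right")
    return 0.0 if k == 0 else cum_baseline[k - 1]

print(round(Lambda0(1.0), 3))                   # roughly the true value 1.0
```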
https://read.qxmd.com/read/37975951/assessing-model-prediction-performance-for-the-expected-cumulative-number-of-recurrent-events
#15
JOURNAL ARTICLE
Olivier Bouaziz
In a recurrent event setting, we introduce a new score designed to evaluate, for a given model, the ability to predict the expected cumulative number of recurrent events. This score can be seen as an extension of the Brier score for single time-to-event data but works for recurrent events with or without a terminal event. Theoretical results show that, under standard assumptions in a recurrent event context, our score can be asymptotically decomposed as the sum of the theoretical mean squared error between the model and the true expected cumulative number of recurrent events and an inseparability term that does not depend on the model...
November 17, 2023: Lifetime Data Analysis
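The single-event Brier score that entry #15 extends is commonly computed with inverse-probability-of-censoring weights: subjects who fail by the horizon t0 contribute (0 - S_hat(t0|x))^2 / G_hat(T-), subjects still at risk at t0 contribute (1 - S_hat(t0|x))^2 / G_hat(t0), and subjects censored before t0 contribute nothing. A minimal sketch of that single-event version, using the marginal Kaplan-Meier curve as the "model" and toy data (the recurrent-event score in the paper is a different, more general quantity):

```python
# IPCW Brier score at one horizon t0 for right-censored data; the single-event
# score that the recurrent-event score above extends. Toy data and "model".
import numpy as np

rng = np.random.default_rng(5)
n = 1000
t_fail = rng.exponential(1.0, n)
t_cens = rng.exponential(1.5, n)
time = np.minimum(t_fail, t_cens)
event = (t_fail <= t_cens).astype(float)

def km(times, events, t):
    """Kaplan-Meier survival estimate at t (scalar or array)."""
    order = np.argsort(times)
    ts, ev = times[order], events[order]
    m = len(ts)
    surv = np.cumprod(1.0 - ev / (m - np.arange(m)))
    k = np.searchsorted(ts, t, side="right") - 1
    return np.where(k >= 0, surv[np.maximum(k, 0)], 1.0)

t0 = 1.0
s_pred = km(time, event, t0)                    # "model" prediction S_hat(t0 | x_i)
G = lambda t: km(time, 1.0 - event, t)          # censoring survival estimate

died_by_t0 = (time <= t0) & (event == 1.0)
still_at_risk = time > t0
bs = np.mean(
    died_by_t0 * (0.0 - s_pred) ** 2 / np.maximum(G(time - 1e-9), 1e-12)
    + still_at_risk * (1.0 - s_pred) ** 2 / np.maximum(G(t0), 1e-12)
)
print(round(float(bs), 4))
```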
https://read.qxmd.com/read/37955788/bias-reduction-for-semi-competing-risks-frailty-model-with-rare-events-application-to-a-chronic-kidney-disease-cohort-study-in-south-korea
#16
JOURNAL ARTICLE
Jayoun Kim, Boram Jeong, Il Do Ha, Kook-Hwan Oh, Ji Yong Jung, Jong Cheol Jeong, Donghwan Lee
In a semi-competing risks model, in which a terminal event censors a non-terminal event but not vice versa, the conventional approach predicts clinical outcomes via maximum likelihood estimation. However, this method can produce unreliable or biased estimators when the number of events in the dataset is small. Specifically, parameter estimates may converge to infinity, or their standard errors can be very large. Moreover, terminal and non-terminal event times may be correlated, a dependence that can be accounted for by a frailty term...
November 13, 2023: Lifetime Data Analysis
https://read.qxmd.com/read/37713017/cox-1972-recollections-and-reflections
#17
JOURNAL ARTICLE
David Oakes
I present some personal memories and thoughts on Cox's 1972 paper "Regression Models and Life-Tables".
September 15, 2023: Lifetime Data Analysis
https://read.qxmd.com/read/37659991/dynamic-treatment-regimes-using-bayesian-additive-regression-trees-for-censored-outcomes
#18
JOURNAL ARTICLE
Xiao Li, Brent R Logan, S M Ferdous Hossain, Erica E M Moodie
To provide the best possible care to each individual under their care, physicians need to customize treatments for individuals with the same health state, especially when treating diseases that can progress further and require additional treatments, such as cancer. Making decisions at multiple stages as a disease progresses can be formalized as a dynamic treatment regime (DTR). Most existing optimization approaches for estimating dynamic treatment regimes, including the popular method of Q-learning, were developed in a frequentist context...
September 2, 2023: Lifetime Data Analysis
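As background for the Q-learning baseline mentioned in entry #18, a two-stage Q-learning sketch with plain linear Q-functions on simulated, uncensored data is given below; the paper itself replaces these regressions with Bayesian additive regression trees and accommodates censored outcomes. All variable names and the toy data-generating model are illustrative assumptions.

```python
# Two-stage Q-learning with linear Q-functions on simulated, uncensored data:
# the frequentist baseline referred to in the entry above. Toy setup only.
import numpy as np

rng = np.random.default_rng(6)
n = 5000
# Stage 1: state s1, randomized binary treatment a1; stage 2: state s2 and a2.
s1 = rng.normal(size=n)
a1 = rng.integers(0, 2, n)
s2 = 0.5 * s1 + 0.4 * (2 * a1 - 1) + rng.normal(scale=0.5, size=n)
a2 = rng.integers(0, 2, n)
# Final outcome (e.g., a transformed event time); larger is better.
y = s1 + s2 + (2 * a2 - 1) * (1.0 - s2) + 0.3 * (2 * a1 - 1) + rng.normal(size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage-2 Q-function: regress y on (1, s2, a2, a2*s2).
X2 = np.column_stack([np.ones(n), s2, a2, a2 * s2])
b2 = ols(X2, y)
q2 = lambda s, a: b2[0] + b2[1] * s + b2[2] * a + b2[3] * a * s
# Pseudo-outcome: value under the optimal stage-2 decision.
v2 = np.maximum(q2(s2, 0), q2(s2, 1))

# Stage-1 Q-function: regress the pseudo-outcome on (1, s1, a1, a1*s1).
X1 = np.column_stack([np.ones(n), s1, a1, a1 * s1])
b1 = ols(X1, v2)
q1 = lambda s, a: b1[0] + b1[1] * s + b1[2] * a + b1[3] * a * s

# Estimated DTR: treat at each stage iff the estimated treatment effect is positive.
rule2 = lambda s: (q2(s, 1) > q2(s, 0)).astype(int)
rule1 = lambda s: (q1(s, 1) > q1(s, 0)).astype(int)
print(rule2(np.array([0.0, 2.0])), rule1(np.array([-1.0, 1.0])))
```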
https://read.qxmd.com/read/37620504/causal-survival-analysis-under-competing-risks-using-longitudinal-modified-treatment-policies
#19
JOURNAL ARTICLE
Iván Díaz, Katherine L Hoffman, Nima S Hejazi
Longitudinal modified treatment policies (LMTP) have been recently developed as a novel method to define and estimate causal parameters that depend on the natural value of treatment. LMTPs represent an important advancement in causal inference for longitudinal studies as they allow the non-parametric definition and estimation of the joint effect of multiple categorical, ordinal, or continuous treatments measured at several time points. We extend the LMTP methodology to problems in which the outcome is a time-to-event variable subject to a competing event that precludes observation of the event of interest...
August 24, 2023: Lifetime Data Analysis
https://read.qxmd.com/read/37581774/bayesian-semiparametric-joint-model-of-multivariate-longitudinal-and-survival-data-with-dependent-censoring
#20
JOURNAL ARTICLE
An-Min Tang, Nian-Sheng Tang, Dalei Yu
We consider a novel class of semiparametric joint models for multivariate longitudinal and survival data with dependent censoring. In these models, the unspecified cumulative baseline hazard functions are fitted by a novel class of penalized splines (P-splines) with linear constraints. The dependence between the failure time of interest and the censoring time is accommodated by a normal transformation model, in which both the nonparametric marginal survival function and the censoring function are transformed to standard normal random variables with a bivariate normal joint distribution...
August 15, 2023: Lifetime Data Analysis