I've yet to do a post on IPTW regressions, although I have been doing some applied work with them. In those applied examples I have found broadly similar results when comparing neural network, decision tree, logistic regression, and gradient boosting propensity score methods. The paper below provides more rigorous evidence from a simulation study.
Lee BK, Lessler J, Stuart EA (2011) Weight Trimming and Propensity Score Weighting. PLoS ONE 6(3): e18174. doi:10.1371/journal.pone.0018174
“Propensity score weighting is sensitive to model misspecification and outlying weights that can unduly influence results. The authors investigated whether trimming large weights downward can improve the performance of propensity score weighting and whether the benefits of trimming differ by propensity score estimation method. In a simulation study, the authors examined the performance of weight trimming following logistic regression, classification and regression trees (CART), boosted CART, and random forests to estimate propensity score weights. Results indicate that although misspecified logistic regression propensity score models yield increased bias and standard errors, weight trimming following logistic regression can improve the accuracy and precision of final parameter estimates. In contrast, weight trimming did not improve the performance of boosted CART and random forests. The performance of boosted CART and random forests without weight trimming was similar to the best performance obtainable by weight trimmed logistic regression estimated propensity scores. While trimming may be used to optimize propensity score weights estimated using logistic regression, the optimal level of trimming is difficult to determine. These results indicate that although trimming can improve inferences in some settings, in order to consistently improve the performance of propensity score weighting, analysts should focus on the procedures leading to the generation of weights (i.e., proper specification of the propensity score model) rather than relying on ad-hoc methods such as weight trimming.”
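For context, here is a minimal Python sketch of the kind of IPTW comparison being described: it estimates propensity scores with logistic regression and gradient boosting on simulated data, forms inverse-probability-of-treatment weights, and optionally caps large weights at a percentile as a crude form of trimming. The data-generating process, the 99th-percentile cap, and the estimators are my own illustrative choices, not the simulation design or trimming rules used in the paper.

    # Minimal IPTW sketch on hypothetical simulated data (not the paper's design):
    # estimate propensity scores two ways, build inverse-probability weights,
    # optionally trim large weights, and compare weighted treatment effect estimates.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)
    n = 5000
    X = rng.normal(size=(n, 4))
    # Treatment assignment depends nonlinearly on covariates,
    # so a plain logistic propensity model is misspecified here
    p_true = 1 / (1 + np.exp(-(0.5 * X[:, 0] + 0.25 * X[:, 1] ** 2 - 0.5 * X[:, 2] * X[:, 3])))
    T = rng.binomial(1, p_true)
    # Outcome with a true treatment effect of 2.0
    y = 2.0 * T + X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)

    def iptw_ate(ps, T, y, trim_pct=None):
        """Weighted mean difference using IPTW; optionally cap weights
        above the given percentile (a simple ad-hoc trimming rule)."""
        w = np.where(T == 1, 1 / ps, 1 / (1 - ps))
        if trim_pct is not None:
            w = np.minimum(w, np.percentile(w, trim_pct))
        treated = np.average(y[T == 1], weights=w[T == 1])
        control = np.average(y[T == 0], weights=w[T == 0])
        return treated - control

    for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                        ("gradient boosting", GradientBoostingClassifier())]:
        ps = model.fit(X, T).predict_proba(X)[:, 1]
        ps = np.clip(ps, 1e-3, 1 - 1e-3)  # guard against extreme scores
        print(name,
              "untrimmed ATE:", round(iptw_ate(ps, T, y), 3),
              "trimmed at 99th pct:", round(iptw_ate(ps, T, y, trim_pct=99), 3))

In this toy setup the true effect is 2.0, so you can eyeball how much trimming moves each estimator; in real applications, as the abstract notes, getting the propensity score model right matters more than the trimming rule.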