Browsing by Author "Ensor, Katherine B."
Now showing 1 - 20 of 45
Item: A comprehensive approach to spatial and spatiotemporal dependence modeling (2000). Baggett, Larry Scott; Ensor, Katherine B.
One of the most difficult tasks in modeling spatial and spatiotemporal random fields is deriving an accurate representation of the dependence structure. In practice, the researcher must select the best empirical representation of the data, the proper family of parametric models, and the most efficient method of parameter estimation once the model is selected. Each of these decisions has direct consequences for the prediction accuracy of the modeled random field. To facilitate the process of spatial dependence modeling, a general class of covariogram estimators is introduced, derived by direct application of Bochner's theorem to the Fourier-Bessel series representation of the covariogram. Extensions are derived for one, two, and three spatial dimensions, along with spatiotemporal extensions for each. A spatial application is demonstrated for predicting the distribution of sediment contaminants in the Galveston Bay estuary, Texas. Also included is a spatiotemporal application generating predictions of sea surface temperatures, adjusted for periodic climatic effects, for a long-term study region off southern California.

Item: A spatiotemporal case-crossover model of asthma exacerbation in the City of Houston (Wiley, 2021). Schedler, Julia C.; Ensor, Katherine B.
The case-crossover design is a popular construction for analyzing the impact of a transient effect, such as ambient pollution levels, on an acute outcome, such as an asthma exacerbation. It avoids the need to model individual, time-varying risk factors by using cases as their own "controls": time periods for which individual risk factors can be assumed constant and so need not be modeled. Many studies have examined the complex effects of the control-period structure on model performance, but these discussions were simplified once the case-crossover design was shown to be equivalent to various specifications of Poisson regression when exposure is considered constant across study participants. While reasonable for some applications, that assumption does not hold when exposure varies spatially, which may affect parameter estimation. This work presents a spatiotemporal model combining a temporal case-crossover component with a geometrically aware spatial random effect based on the Hausdorff distance. The construction incorporates residual spatial structure in cases where the constant-exposure assumption is not reasonable and where spatial regions are irregular.
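As a minimal sketch of the time-stratified case-crossover design used in this and several other entries below: the event dates, the synthetic exposure series, and the referent rule (same weekday and month as the case day) are all hypothetical, and the within-stratum fit uses statsmodels' ConditionalLogit.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)
days = pd.date_range("2020-01-01", "2020-12-31", freq="D")
pm25 = pd.Series(10 + 5 * rng.standard_normal(len(days)), index=days)  # synthetic exposure

case_days = pd.to_datetime(["2020-03-10", "2020-07-22"])  # hypothetical events
rows = []
for i, day in enumerate(case_days):
    # Referent days: same weekday within the same calendar month as the case day,
    # so slow-moving individual risk factors cancel out within the stratum.
    stratum = days[(days.month == day.month) & (days.weekday == day.weekday)]
    for d in stratum:
        rows.append({"stratum": i, "case": int(d == day), "pm25": pm25.loc[d]})
df = pd.DataFrame(rows)

# Conditional logistic regression: cases act as their own controls within strata.
fit = ConditionalLogit(df["case"], df[["pm25"]], groups=df["stratum"]).fit()
print(fit.summary())
```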
Item: A time series approach to quality control (1991). Dittrich, Gayle Lynn; Ensor, Katherine B.
One way a process may be said to be "out of control" is when a cyclical pattern exists in the observations over time. An accurate control chart is needed to signal when a cycle is present in the process. Two control charts have recently been developed to deal with this problem. One, based on the periodogram, provides a test based on a finite number of frequencies; the other estimates a statistic that covers all frequency values. However, both methods fail to estimate the frequency of the cycle and are computationally difficult. A new control chart is proposed that not only covers a continuous range of frequency values but also estimates the frequency of the cycle. In addition, it is easier to understand and compute than the other two methods.

Item: An approach to modeling a multivariate spatial-temporal process (2000). Calizzi, Mary Anne; Ensor, Katherine B.
Although modeling of spatial-temporal stochastic processes is a growing area of research, the multivariate space-time setting remains underdeveloped. The motivation for this research originates in air quality studies. By treating each air pollutant as a separate variable, the multivariate approach enables modeling not only of the behavior of the individual pollutants but also of the interaction between pollutants over space and time. Studying both the spatial and the temporal aspects of the process gives a more accurate picture of its behavior. A bivariate state-space model is developed that includes a covariance function accounting for the different cross-covariances across space and time. The Kalman filter is used for parameter estimation and prediction. The model is evaluated through its predictive performance in an air quality application.
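To illustrate the state-space machinery such models rest on, here is a minimal univariate Kalman filter in plain NumPy: a sketch assuming a scalar local-level model with made-up noise variances, not the bivariate cross-covariance specification developed in the thesis.

```python
import numpy as np

# Local-level model: x_t = x_{t-1} + w_t,  y_t = x_t + v_t (illustrative parameters).
q, r = 0.1, 1.0                                  # state and observation noise variances
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(0, np.sqrt(q), 200))    # latent state
y = x + rng.normal(0, np.sqrt(r), 200)           # noisy observations

m, p = 0.0, 1.0                                  # filtered mean and variance
filtered = []
for obs in y:
    p_pred = p + q                               # predict step
    k = p_pred / (p_pred + r)                    # Kalman gain
    m = m + k * (obs - m)                        # update step
    p = (1 - k) * p_pred
    filtered.append(m)
```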
Item: An examination of some open problems in time series analysis (2005). Davis, Ginger Michelle; Ensor, Katherine B.
We investigate two open problems in time series analysis. The first is developing a methodology for multivariate time series analysis when the series has components that are both continuous and categorical. Our specific contribution is a logistic smooth transition regression (LSTR) model whose transition variable is related to a categorical variable. This methodology is necessary for series that exhibit nonlinear behavior dependent on a categorical variable. The estimation procedure is investigated both with simulation and with an economic example. The second contribution is examining the evolving structure in multivariate time series, concentrating on financial applications. Many models exist for the joint analysis of several financial instruments, such as securities, because the instruments are not independent, and these models often assume some type of constant behavior between the instruments over the period of analysis. Instead of imposing this assumption, we are interested in understanding the dynamic covariance structure of the multivariate financial series, which provides an understanding of changing market conditions. To achieve this, we first develop a multivariate model for the conditional covariance and then examine that estimate for changing structure using multivariate techniques. Specifically, we simultaneously model individual stock data belonging to one of three market sectors and examine the behavior of the market as a whole as well as the behavior of the sectors. Our aims are detecting and forecasting unusual changes in the system, such as market collapses and outliers, and understanding portfolio diversification in multivariate financial series from different industry sectors. We do not make the false assumption that investments in different industry sectors are uncorrelated; instead, we assume that the comovement of stocks within and between sectors changes with market conditions, including market crashes or collapses and common external influences.

Item: An Old Dog Learns New Tricks: Novel Applications of Kernel Density Estimators on Two Financial Datasets (2017-12-01). Ginley, Matthew Cline; Ensor, Katherine B.; Scott, David W.
In our first application, we contribute two nonparametric simulation methods for analyzing Leveraged Exchange Traded Fund (LETF) return volatility and how this dynamic is related to the underlying index. LETFs are constructed to provide the indicated leverage multiple of the daily total return on an underlying index. LETFs may perform as expected on a daily basis; however, fund issuers state there is no guarantee of achieving the multiple of the index return over longer time horizons. Most, if not all, LETF return data are difficult to model because of the extreme volatility present and the limited availability of data. First, to isolate the effects of daily leveraged compounding on LETF volatility, we propose an innovative method for simulating daily index returns with a chosen constraint on the multi-day period return. By controlling for the performance of the underlying index, the range of volatilities observed in a simulated sample can be attributed to compounding with leverage and the presence of tracking errors. Second, to overcome the limited history of LETF returns data, we propose a method for simulating implied LETF tracking errors while still accounting for their dependence on underlying index returns. This allows the incorporation of the complete history of index returns in an LETF returns model. Our nonparametric methods are flexible, easily incorporating any chosen number of days, leverage ratios, or period return constraints, and can be used in combination or separately to model any quantity of interest derived from daily LETF returns. For our second application, we tackle binary classification problems with extremely low class 1 proportions. These "rare events" problems are a considerable challenge, which is magnified when dealing with large datasets. Having a minuscule count of class 1 observations motivates more sophisticated methods to minimize forecasting bias toward the majority class. We propose an alternative to established up-sampling or down-sampling algorithms, driven by kernel density estimators, that transforms the class labels to continuous targets. Having effectively transformed the problem from classification to regression, we argue that under the assumption of a monotonic relationship between predictors and the target, approximations of the majority class are possible in a rare-events setting with the use of simple heuristics. By significantly reducing the burden posed by the majority class, the complexities of minority-class membership can be modeled more effectively using monotonically constrained nonparametric regression methods. Our approach is demonstrated on a large financial dataset with an extremely low class 1 proportion, and novel feature engineering is introduced to assist in the application of the density estimator used for class-label transformation.
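To see why the daily leverage multiple need not hold over longer horizons, a small simulation suffices; this is a sketch with arbitrary i.i.d. normal daily returns, not the constrained-period-return simulation method of the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
leverage, days, trials = 3.0, 20, 10_000
r = rng.normal(0.0, 0.01, size=(trials, days))       # hypothetical daily index returns

index_period = np.prod(1 + r, axis=1) - 1            # index 20-day return
letf_period = np.prod(1 + leverage * r, axis=1) - 1  # daily-compounded 3x LETF return
gap = letf_period - leverage * index_period          # deviation from a naive 3x multiple

# The gap has nonzero spread purely from compounding, before any tracking error.
print(f"mean gap {gap.mean():+.4f}, sd {gap.std():.4f}")
```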
Item: Association of Out-of-Hospital Cardiac Arrest with Exposure to Fine Particulate and Ozone Ambient Air Pollution from Case-Crossover Analysis Results: Are the Standards Protective? (James A. Baker III Institute for Public Policy). Raun, Loren; Ensor, Katherine B.; James A. Baker III Institute for Public Policy
About 300,000 cardiac arrests occur outside of hospitals in the United States each year; most are fatal. Studies have shown that a small but significant percentage of cardiac arrests appear to be triggered by exposure to increased levels of one of two air pollutants: fine particulate matter and ozone. We analyzed seven key studies to determine whether Environmental Protection Agency (EPA) standards protect the public from out-of-hospital cardiac arrests (OHCA) triggered by exposure to fine particulate matter and ozone. Using Houston, Texas, data, we found evidence of an increased risk of cardiac arrest on the order of 2% to 9% per increase in fine particulate levels (a daily average increase of 10 µg/m3) on the day of, or the day before, the event. The EPA fine particulate standard of 35 µg/m3 (35 micrograms per cubic meter of air) therefore does not effectively protect the public from OHCA triggered by exposure to fine particulates. However, the EPA's ozone standard does appear to adequately protect public health from OHCA triggered by exposure to ozone.

Item: Autocorrelated data in quality control charts (1994). Hood, Terri Frantom; Ensor, Katherine B.
Control charts are regularly developed under the assumption that the process observations are independent. However, autocorrelated data are a common occurrence in certain industries. Two approaches that deal with this issue are investigated. The time series approach models the data with an appropriate time series model to remove the autocorrelative structure; the EWMA approach models the observations as a weighted average of previous data. The residuals from the two approaches are plotted on control charts, and the average run lengths are compared. Both methods are applied to simulations that generate in-control data and data with strategically located nonstandard conditions. The nonstandard conditions simulated are process change, linear drift, mean shift, and variance shift. It is proposed that the time series approach tends to perform better in these situations.
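A minimal sketch of the time series approach described above, assuming an AR(1) process with an injected mean shift: fit the autoregression, then chart the residuals against conventional three-sigma limits.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = np.zeros(n)
for t in range(1, n):                        # AR(1) process, phi = 0.7
    x[t] = 0.7 * x[t - 1] + rng.normal()
x[200:] += 4.0                               # injected mean shift (out-of-control)

# Fit AR(1) via the lag-1 sample autocorrelation, then chart the residuals.
xc = x - x.mean()
phi_hat = np.dot(xc[1:], xc[:-1]) / np.dot(xc[:-1], xc[:-1])
resid = xc[1:] - phi_hat * xc[:-1]
sigma = resid[:150].std()                    # baseline scale from in-control stretch
out = np.flatnonzero(np.abs(resid) > 3 * sigma)
print(f"phi_hat={phi_hat:.2f}, signals at t={out + 1}")  # flags near the shift onset
```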
Item: Bayesian graphical models for biological network inference (2013-11-20). Peterson, Christine; Vannucci, Marina; Ensor, Katherine B.; Kavraki, Lydia E.; Maletic-Savatic, Mirjana; Stingo, Francesco C.
In this work, we propose approaches for the inference of graphical models in the Bayesian framework. Graphical models, which use a network structure to represent conditional dependencies among random variables, provide a valuable tool for visualizing and understanding the relationships among many variables. However, since these networks are complex systems, they can be difficult to infer from a limited number of observations. Our research focuses on the development of methods that allow the incorporation of prior information, on particular edges or on the model structure, to improve the reliability of inference given small to moderate sample sizes. First, we propose an approach to graphical model inference using the Bayesian graphical lasso with informative priors on the shrinkage parameters specific to each edge. We demonstrate through simulations that this method improves learning of the network structure when relevant prior information is available, and we illustrate the approach on inference of the cellular metabolic network under neuroinflammation. This application highlights the strength of the method: although the number of available samples is fairly small, we can draw on rich reference information from publicly available databases of known metabolic interactions to construct informative priors. Next, we propose a modeling approach for settings where we would like to estimate networks for a collection of possibly related sample groups, where the sample size for each subgroup may be limited. We use a Markov random field prior to link the graphs within each group and a selection prior to infer which groups share network structure. This encourages common edges across sample groups when supported by the data. We provide simulation studies to illustrate the properties of the method and compare its performance to competing approaches. We conclude by demonstrating use of the proposed method to infer protein networks for various subtypes of acute myeloid leukemia and to infer signaling networks under different experimental perturbations.
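For orientation, the non-Bayesian graphical lasso, a frequentist baseline for the same sparse precision-matrix task, runs in a few lines with scikit-learn; this is only a point of comparison, not the edge-specific-prior machinery proposed in the work, and the data here are synthetic.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(4)
X = rng.standard_normal((40, 10))       # small-n regime, as in the abstract
X[:, 1] += 0.8 * X[:, 0]                # induce one conditional dependency

model = GraphicalLasso(alpha=0.2).fit(X)
# Nonzero off-diagonal entries of the estimated precision matrix are inferred edges.
edges = np.argwhere(np.triu(np.abs(model.precision_) > 1e-6, k=1))
print(edges)                            # expected to include the (0, 1) edge
```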
Item: City-Specific Air Quality Warnings for Improved Asthma Self-Management (Elsevier, 2019). Raun, Loren H.; Ensor, Katherine B.; Pederson, John E.; Campos, Laura A.; Persse, David E.
INTRODUCTION: This study presents a framework for identifying "high-risk" days for asthma attacks associated with elevated concentrations of criteria pollutants, using local information to warn citizens on days when concentrations differ from Environmental Protection Agency Air Quality Index (AQI) warnings. Studies that consider the unique mixture of pollutants and the health data specific to a city provide additional information for asthma self-management. The framework is applied to air pollution and asthma data to identify supplemental warning days in Houston, Texas. METHODS: A four-step framework was established to identify days with pollutant levels that pose meaningfully increased risk of asthma attacks compared with baseline. Historical associations between 18,542 ambulance-treated asthma attacks and air pollutant concentrations in Houston, Texas (2004-2016; analyzed in 2018) were analyzed using a case-crossover study design with conditional logistic regression. Days with historically high associations between pollution and asthma attacks were identified as supplemental warning days. RESULTS: Days with 8-hour maximum ozone >66.6 parts per billion over the 3 previous days, or with same-day 24-hour nitrogen dioxide >19.3 parts per billion, pose meaningfully increased risk above baseline: concentrations above these levels carry an increased risk of 15% (RR=1.15, 95% CI=1.14, 1.16) and 30% (RR=1.30, 95% CI=1.29, 1.32), respectively. These warnings add 12% more days per year over the AQI warnings. CONCLUSIONS: Houston uses this framework to issue supplemental air quality warnings to improve asthma self-management. Supplemental warning days reflect risk at pollutant levels below the National Ambient Air Quality Standards and on consecutive poor air quality days, differing from the AQI.
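A sketch of what such a supplemental warning rule looks like operationally, using the thresholds quoted above; the data frame, column names, and the reading of the ozone condition (above threshold on each of the three previous days) are assumptions for illustration.

```python
import pandas as pd

# Hypothetical daily monitor summaries for a city (ppb).
df = pd.DataFrame({
    "ozone_8h_max": [68.1, 70.4, 68.9, 67.2, 55.0],
    "no2_24h":      [12.3, 21.7, 18.0, 19.5,  9.8],
})

ozone_high = df["ozone_8h_max"] > 66.6
# Ozone trigger: above threshold on each of the 3 previous days.
ozone_trigger = ozone_high.rolling(3).min().shift(1).eq(1)
no2_trigger = df["no2_24h"] > 19.3          # same-day NO2 trigger

df["supplemental_warning"] = ozone_trigger | no2_trigger
print(df)
```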
Item: Cross-Disciplinary Consultancy to Enhance Predictions of Asthma Exacerbation Risk in Boston (Health Policy and Administration Division, UIC School of Public Health, 2016). Reid, Margaret; Gunn, Julia; Shah, Snehal; Donovan, Michael; Eggo, Rosalind; Babin, Steven; Stajner, Ivanka; Rogers, Eric; Ensor, Katherine B.; Raun, Loren; Levy, Jonathan I.; Painter, Ian; Phipatanakul, Wanda; Yip, Fuyuen; Nath, Anjali; Streichert, Laura; Tong, Catherine; Burkom, Howard
This paper continues an initiative conducted by the International Society for Disease Surveillance, with funding from the Defense Threat Reduction Agency, to connect near-term analytical needs of public health practice with technical expertise from the global research community. The goal is to enhance the investigation capabilities of day-to-day population health monitors. A prior paper described the formation of consultancies for requirements analysis and dialogue regarding the costs and benefits of sustainable analytic tools. Each funded consultancy targets a use case of near-term concern to practitioners. The consultancy featured here focused on improving predictions of asthma exacerbation risk in demographic and geographic subdivisions of the city of Boston, Massachusetts, USA, based on the combination of known risk factors for which evidence is routinely available. A cross-disciplinary group of 28 stakeholders attended the consultancy on March 30-31, 2016 at the Boston Public Health Commission (BPHC). Known asthma exacerbation risk factors are upper respiratory virus transmission, particularly in school-age children; harsh or extreme weather conditions; and poor air quality. Meteorological subject matter experts described the availability and usage of data sources representing these risk factors. Modelers presented multiple analytic approaches, including mechanistic models, machine learning approaches, simulation techniques, and hybrids. Health department staff and local partners discussed surveillance operations, constraints, and operational system requirements. Attendees valued the direct exchange of information among public health practitioners, system designers, and modelers. Discussion finalized the design of an 8-year de-identified dataset of Boston emergency department (ED) patient records for modeling partners who sign a standard data use agreement.

Item: Denoising Non-stationary Signals by Dynamic Multivariate Complex Wavelet Thresholding (SSRN, 2020). Raath, Kim; Ensor, Katherine B.; Scott, David W.; Crivello, Alena
Over the past few years, we have seen an increased need for analyzing the dynamically changing behaviors of economic and financial time series. These needs have led to significant demand for methods that denoise non-stationary time series across time and for specific investment horizons (scales) and localized windows (blocks) of time. Wavelets have long been known to decompose non-stationary time series into their different components or scale pieces. Recent methods satisfying this demand first decompose the non-stationary time series using wavelet techniques and then apply a thresholding method to separate and capture the signal and noise components of the series. Traditionally, wavelet thresholding methods rely on the discrete wavelet transform (DWT), a static thresholding technique that may not capture the time series of the estimated variance in the additive noise process. We introduce a novel continuous wavelet transform (CWT) dynamically optimized multivariate thresholding method. Applying this method, we are simultaneously able to separate and capture the signal and noise components while estimating the dynamic noise variance. Our method shows improved results when compared to well-known methods, especially for high-frequency, signal-rich time series typically observed in finance.

Item: Denoising Non-Stationary Signals via Dynamic Multivariate Complex Wavelet Thresholding (MDPI, 2023). Raath, Kim C.; Ensor, Katherine B.; Crivello, Alena; Scott, David W.
Over the past few years, we have seen an increased need to analyze the dynamically changing behaviors of economic and financial time series. These needs have led to significant demand for methods that denoise non-stationary time series across time and for specific investment horizons (scales) and localized windows (blocks) of time. Wavelets have long been known to decompose non-stationary time series into their different components or scale pieces. Recent methods satisfying this demand first decompose the non-stationary time series using wavelet techniques and then apply a thresholding method to separate and capture the signal and noise components of the series. Traditionally, wavelet thresholding methods rely on the discrete wavelet transform (DWT), which is a static thresholding technique that may not capture the time series of the estimated variance in the additive noise process. We introduce a novel continuous wavelet transform (CWT) dynamically optimized multivariate thresholding method (WaveL2E). Applying this method, we are simultaneously able to separate and capture the signal and noise components while estimating the dynamic noise variance. Our method shows improved results when compared to well-known methods, especially for high-frequency, signal-rich time series typically observed in finance.
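For context, the conventional DWT soft-thresholding baseline that the WaveL2E method improves upon can be written with PyWavelets; the signal, wavelet choice, and threshold rule here are illustrative, not the paper's CWT method.

```python
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) * np.exp(-2 * t)   # synthetic non-stationary signal
noisy = signal + 0.3 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale from finest detail level
thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # static "universal" threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")
```

The threshold here is a single static value for the whole series; the papers above replace this with a dynamically estimated, time-varying noise variance.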
Item: Discussion on an approach for identifying and predicting economic recessions in real-time using time-frequency functional models (Wiley, 2012). Ensor, Katherine B.; Center for Computational Finance and Economic Systems

Item: Dynamic jump intensities and news arrival in oil futures markets (Springer Nature, 2020). Ensor, Katherine B.; Han, Yu; Ostdiek, Barbara; Turnbull, Stuart M.; Center for Computational Finance and Economic Systems
We introduce a new class of discrete-time models that explicitly recognize the impact of news arrival. The distribution of returns is governed by three factors: dynamic volatility and two compound Poisson processes, one for negative news and one for positive news. We show in a model-free environment that the arrival of negative and positive news has an asymmetric effect on oil futures returns and volatility. Using the first 12 futures contracts, our empirical results confirm that the effects of negative and positive news are described by different processes, that a significant proportion of volatility is explained by news arrival, and that the impact of negative news is larger than that of positive news.
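A minimal simulation of the return structure this model class describes: diffusive volatility plus separate compound Poisson jumps for negative and positive news. The intensities and jump-size distributions are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
n, sigma = 1000, 0.01                     # days, diffusive daily volatility
lam_neg, lam_pos = 0.05, 0.03             # jump intensities (per day)
mu_neg, mu_pos = -0.03, 0.02              # mean jump sizes

diffusive = sigma * rng.standard_normal(n)
jumps_neg = np.array([rng.normal(mu_neg, 0.01, k).sum() for k in rng.poisson(lam_neg, n)])
jumps_pos = np.array([rng.normal(mu_pos, 0.01, k).sum() for k in rng.poisson(lam_pos, n)])
returns = diffusive + jumps_neg + jumps_pos

# Asymmetric news arrival shows up as negative skew in the simulated returns.
skew = ((returns - returns.mean()) ** 3).mean() / returns.std() ** 3
print(f"skewness ~ {skew:.2f}")
```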
Item: Enabling accurate and early detection of recently emerged SARS-CoV-2 variants of concern in wastewater (Springer Nature, 2023). Sapoval, Nicolae; Liu, Yunxi; Lou, Esther G.; Hopkins, Loren; Ensor, Katherine B.; Schneider, Rebecca; Stadler, Lauren B.; Treangen, Todd J.
As clinical testing declines, wastewater monitoring can provide crucial surveillance on the emergence of SARS-CoV-2 variants of concern (VoCs) in communities. In this paper we present QuaID, a novel bioinformatics tool for VoC detection based on quasi-unique mutations. The benefits of QuaID are three-fold: (i) VoC detection up to 3 weeks earlier, (ii) accurate VoC detection (>95% precision on simulated benchmarks), and (iii) use of all mutational signatures (including insertions and deletions).

Item: Enterprise and Political Risk Management in Complex Systems (International Research Center for Energy and Economic Development, 2007). Ensor, Katherine B.; Kyj, Lada; Marfin, Gary C.; Center for Computational Finance and Economic Systems

Item: Essays on the Use of Duality, Robust Empirical Methods, Panel Treatments, and Model Averaging with Applications to Housing Price Index Construction and World Productivity Growth (2015-04-16). Shang, Chenjun; Sickles, Robin C.; El-Gamal, Mahmoud; Ensor, Katherine B.
This dissertation focuses on analyzing the production side of the economy and aims to provide robust estimates of the parameters of interest. In a production process, the output level is mainly determined by two parts: inputs and productivity. Compared with the inputs, which are concrete and measurable, productivity is an unobservable factor that relies on economic models for estimation. An appropriate and robust modeling method is essential to accurately capture the productivity term.

Chapter 1 reviews the research on productivity with a focus on stochastic frontier analysis, a classic framework in the productivity literature. The chapter starts with the definition and decomposition of productivity. Measured as a ratio of outputs to inputs, productivity can be divided into two main parts: innovation and technical efficiency. The growth of technologies and innovations depends heavily on education and research, while the technical efficiencies of firms vary with their administration, management skills, allocation of inputs, and so on. In studies analyzing these two components, stochastic frontier models have gradually become the standard method. The chapter briefly introduces the development of stochastic frontier models, with an emphasis on the panel data setting. Twelve specifications, together with their implementation methods, are then discussed in detail. These representative models make different assumptions about the efficiency term, aiming to approximate the underlying data-generating process without adding too many constraints. Comparing these models, we expect different estimates of productivity from different specifications, so the evaluation and selection of a suitable model for empirical analysis become a problem. Standard information criteria measure the performance of each candidate model, but multiple criteria can lead to contradictory conclusions about which model is best; moreover, the model selection approach itself ignores the risk of model uncertainty. The issue of dealing with multiple competing models is addressed in Chapter 3.

While Chapter 1 concentrates on methods of estimating productivity, Chapter 2 focuses on the proper specification of the inputs used in generating the output. Though the inputs of a production process are usually observable, their effects on the outputs are often not clear or straightforward; the allocation of different inputs is affected by both the production technology and market prices. Chapter 2 utilizes the duality between the production maximization problem and the cost minimization problem to uncover the shadow prices of inputs, and constructs corresponding price indexes for further analysis. The chapter is motivated by recent housing bubbles and considers the housing market for the empirical application. The housing market is an important component of the economy and constantly attracts the interest of researchers. Diewert (2010), for example, compared various methods of constructing property price indexes using index-number and hedonic regression methods, illustrated with data over a number of quarters from a small Dutch town. Chapter 2 provides an alternative approach based on Shephard's dual lemma, which I apply to the same data used by Diewert. This method avoids the multicollinearity problem associated with traditional hedonic regression, and the resulting prices of property characteristics show smoother trends than Diewert's results. The chapter also revisits the Diewert and Shimizu (2013) study, which employed hedonic regressions to decompose the price of residential property in Tokyo into land and structure components and constructed constant-quality indexes for land and structure prices. I use three models from Diewert and Shimizu (2013) to fit the real estate data from town 'A' in the Netherlands and construct price indexes for land and structure, which are compared with the results derived using duality theory. Again, we have multiple models in the study of the housing market. As with productivity, the shadow prices of property characteristics are unobservable (due to the nature of the input or intermediate good, an explicit market may not exist), so we rely on estimation methods, and there is a set of candidate models.

Chapters 1 and 2 leave us in a dilemma. Which model is correct? Which model do we choose? Is any model actually correct, or are we choosing among misspecified models? Do we simply choose one model and ignore the results of the others? These issues are addressed in Chapter 3, where a model averaging approach is explored to provide estimates that are robust to various model specifications. Model averaging methods combine a set of competing models through an optimization mechanism. Chapter 3 pursues robust estimates of world productivity levels and growth rates. Various structural and reduced-form models of productivity growth have been proposed in the literature, and in either class reduced-form measurements of productivity and efficiency are obtained. As the true data-generating process of productivity cannot be observed, the chapter examines model averaging approaches that provide a vehicle for weighting predictions (in the form of productivity and efficiency measurements) from different reduced-form methods, typically the stochastic frontier specifications discussed in Chapter 1. The chapter considers the jackknife model averaging estimator proposed by Hansen and Racine (2012) and illustrates how to apply the technique to a set of competing stochastic frontier estimators. The derived method is employed to analyze productivity and efficiency development in three country groups worldwide. The empirical results show that the model averaging method provides more stable estimates, whereas model selection tends to pick a model with superficially high goodness of fit arising from the match between a specific model setting and the data set. A brief discussion of alternative structural models from which reduced-form forecasts can be derived offers a different perspective on productivity analysis.
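A small sketch of jackknife (leave-one-out) model averaging in the spirit of Hansen and Racine (2012): synthetic data, two toy candidate regressions rather than stochastic frontier estimators, and a constrained weight search via scipy. All specifics are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n = 60
x = rng.uniform(0, 2, n)
y = 1.0 + 0.5 * x + 0.3 * x**2 + rng.normal(0, 0.2, n)

# Leave-one-out predictions from two candidate linear models.
designs = [np.column_stack([np.ones(n), x]),            # model 1: linear
           np.column_stack([np.ones(n), x, x**2])]      # model 2: quadratic
loo = np.empty((n, len(designs)))
for m, X in enumerate(designs):
    for i in range(n):
        keep = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        loo[i, m] = X[i] @ beta

# Weights on the simplex minimizing the cross-validated squared error.
obj = lambda w: np.mean((y - loo @ w) ** 2)
res = minimize(obj, x0=np.full(2, 0.5), bounds=[(0, 1)] * 2,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
print("JMA weights:", res.x.round(3))
```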
Item: Estimating marginal survival in the presence of dependent and independent censoring: With applications to dividend initiation policy (2005). Fix, Gretchen Abigail; Ensor, Katherine B.; Huang, Xuelin
In many survival analysis settings, the assumption of non-informative (i.e., independent) censoring is not valid. Zheng and Klein (1995, 1996) developed a copula-based method for estimating the marginal survival functions of bivariate dependent competing risks data. We expand upon this earlier work and adapt their method to data with three competing risks representing both dependent and independent censoring. Specifically, our extension allows estimation of the survival functions of dependent competing risks X and Y in the presence of a third, independent competing risk Z. An application to dividend initiation data is presented.
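To illustrate why dependent censoring matters (the problem the copula approach addresses, not the thesis's estimator itself), the following sketch simulates failure and censoring times made dependent through a Gaussian copula and shows the naive Kaplan-Meier estimate drifting from the true marginal survival.

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(8)
n, rho = 5000, 0.7
# Dependent (X, C): Gaussian copula with correlation rho, exponential(1) margins.
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
x_time = expon.ppf(norm.cdf(z[:, 0]))     # failure times
c_time = expon.ppf(norm.cdf(z[:, 1]))     # dependent censoring times
t_obs = np.minimum(x_time, c_time)
event = x_time <= c_time

def km_survival(t0, t, d):
    """Kaplan-Meier survival estimate at time t0 (continuous times, no ties)."""
    order = np.argsort(t)
    t, d = t[order], d[order]
    at_risk = len(t) - np.arange(len(t))
    keep = (t <= t0) & d
    return np.prod(1 - 1 / at_risk[keep])

print("true S(1) =", np.exp(-1))                      # exponential(1) marginal
print("KM   S(1) =", km_survival(1.0, t_obs, event))  # biased upward when rho > 0
```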