Center for Computational Finance and Economic Systems (CoFES)
Browsing Center for Computational Finance and Economic Systems (CoFES) by Title
Now showing 1 - 20 of 81
Item: A Bayesian approach for capturing daily heterogeneity in intra-daily durations time series (De Gruyter, 2013)
Brownlees, Christian T.; Vannucci, Marina
Intra-daily financial durations time series typically exhibit evidence of long range dependence. This has motivated the introduction of models able to reproduce this stylized fact, like the Fractionally Integrated Autoregressive Conditional Duration Model. In this work we introduce a novel specification able to capture long range dependence. We propose a three-component model that consists of an autoregressive daily random effect, a semiparametric time-of-day effect and an intra-daily dynamic component: the Mixed Autoregressive Conditional Duration (Mixed ACD) Model. The random effect component allows for heterogeneity in mean reversal within a day and captures low frequency dynamics in the duration time series. The joint estimation of the model parameters is carried out using MCMC techniques based on the Bayesian formulation of the model. The empirical application to a set of widely traded US tickers shows that the model is able to capture low frequency dependence in duration time series. We also find that the degree of dependence and dispersion of low frequency dynamics is higher in periods of higher financial distress.

Item: A Bayesian Multivariate Functional Dynamic Linear Model (Taylor & Francis, 2017)
Kowal, Daniel R.; Matteson, David S.; Ruppert, David
We present a Bayesian approach for modeling multivariate, dependent functional data. To account for the three dominant structural features in the data (functional, time dependent, and multivariate components), we extend hierarchical dynamic linear models for multivariate time series to the functional data setting. We also develop Bayesian spline theory in a more general constrained optimization framework. The proposed methods identify a time-invariant functional basis for the functional observations, which is smooth and interpretable, and can be made common across multivariate observations for additional information sharing. The Bayesian framework permits joint estimation of the model parameters, provides exact inference (up to MCMC error) on specific parameters, and allows generalized dependence structures. Sampling from the posterior distribution is accomplished with an efficient Gibbs sampling algorithm. We illustrate the proposed framework with two applications: (1) multi-economy yield curve data from the recent global recession, and (2) local field potential brain signals in rats, for which we develop a multivariate functional time series approach for multivariate time-frequency analysis. Supplementary materials, including R code and the multi-economy yield curve data, are available online.

Item: A comprehensive approach to spatial and spatiotemporal dependence modeling (2000)
Baggett, Larry Scott; Ensor, Katherine B.
One of the most difficult tasks of modeling spatial and spatiotemporal random fields is deriving an accurate representation of the dependence structure. In practice, the researcher is faced with selecting the best empirical representation of the data, the proper family of parametric models, and the most efficient method of parameter estimation once the model is selected. Each of these decisions has a direct consequence for the prediction accuracy of the modeled random field. In order to facilitate the process of spatial dependence modeling, a general class of covariogram estimators is introduced. They are derived by direct application of Bochner's theorem on the Fourier-Bessel series representation of the covariogram. Extensions are derived for one, two and three dimensions, and spatiotemporal extensions for one, two and three spatial dimensions as well. A spatial application is demonstrated for prediction of the distribution of sediment contaminants in Galveston Bay estuary, Texas. Also included is a spatiotemporal application to generate predictions for sea surface temperatures, adjusted for periodic climatic effects, from a long-term study region off southern California.

Item: A stochastic approach to prepayment modeling (1996)
Overley, Mark S.; Thompson, James R.
A new type of prepayment model for use in the valuation of mortgage-backed securities is presented. The model is based on a simple axiomatic characterization of the prepayment decision by the individual in terms of a continuous time, discrete state stochastic process. One advantage of the stochastic approach compared to a traditional regression model is that information on the variability of prepayments is retained. This information is shown to have a significant effect on the value of mortgage-backed derivative securities. Furthermore, the model explains important path dependent properties of prepayments such as seasoning and burnout in a natural way, which improves fit accuracy for mean prepayment rates. This is demonstrated by comparing the stochastic mean to a nonlinear regression model based on time and mortgage rate information for generic Ginnie Mae collateral.

Item: A time series approach to quality control (1991)
Dittrich, Gayle Lynn; Ensor, Katherine B.
One way that a process may be said to be "out-of-control" is when a cyclical pattern exists in the observations over time. An accurate control chart is needed to signal when a cycle is present in the process. Two control charts have recently been developed to deal with this problem. One, based on the periodogram, provides a test based on a finite number of frequencies. The other uses a test statistic that covers all frequency values. However, both methods fail to estimate the frequency of the cycle and are computationally difficult. A new control chart is proposed which not only covers a continuous range of frequency values, but also estimates the frequency of the cycle. In addition, it is easier to understand and compute than the other two methods.
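The quality control entry above builds a chart around cyclical behavior detected through the periodogram. As background only (this is not the chart proposed in that thesis), the sketch below computes a periodogram and Fisher's g statistic, a classical test for a single hidden cycle; the simulated series, cycle period, and the first-order p-value approximation are all illustrative choices.

```python
import numpy as np

def fisher_g_test(x):
    """Periodogram-based check for a single dominant cycle (Fisher's g test).

    Returns the estimated cycle frequency (cycles per observation), the g
    statistic, and an approximate p-value. Illustrative baseline only; the
    control chart in the thesis entry above is a different construction.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    # Periodogram ordinates at the Fourier frequencies, excluding 0 and Nyquist.
    fft_vals = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n)
    periodogram = (np.abs(fft_vals) ** 2) / n
    keep = (freqs > 0) & (freqs < 0.5)
    periodogram, freqs = periodogram[keep], freqs[keep]
    m = len(periodogram)
    g = periodogram.max() / periodogram.sum()
    # First-order approximation to the null distribution of Fisher's g.
    p_value = min(1.0, m * (1.0 - g) ** (m - 1))
    return freqs[np.argmax(periodogram)], g, p_value

# Example: a weak cycle of period 12 buried in noise.
rng = np.random.default_rng(0)
t = np.arange(200)
y = 0.8 * np.sin(2 * np.pi * t / 12) + rng.normal(size=t.size)
freq, g, p = fisher_g_test(y)
print(f"dominant frequency ~ {freq:.3f} (period ~ {1/freq:.1f}), g = {g:.3f}, p ~ {p:.4f}")
```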
Item: An Approach for the Adaptive Solution of Optimization Problems Governed by Partial Differential Equations with Uncertain Coefficients (2012-09-05)
Kouri, Drew; Heinkenschloss, Matthias; Sorensen, Danny C.; Riviere, Beatrice M.; Cox, Dennis D.
Using derivative based numerical optimization routines to solve optimization problems governed by partial differential equations (PDEs) with uncertain coefficients is computationally expensive due to the large number of PDE solves required at each iteration. In this thesis, I present an adaptive stochastic collocation framework for the discretization and numerical solution of these PDE constrained optimization problems. This adaptive approach is based on dimension adaptive sparse grid interpolation and employs trust regions to manage the adapted stochastic collocation models. Furthermore, I prove the convergence of sparse grid collocation methods applied to these optimization problems as well as the global convergence of the retrospective trust region algorithm under weakened assumptions on gradient inexactness. In fact, if one can bound the error between actual and modeled gradients using reliable and efficient a posteriori error estimators, then the global convergence of the proposed algorithm follows. Moreover, I describe a high performance implementation of my adaptive collocation and trust region framework using the C++ programming language with the Message Passing Interface (MPI). Many PDE solves are required to accurately quantify the uncertainty in such optimization problems; it is therefore essential to appropriately choose inexpensive approximate models and large-scale nonlinear programming techniques throughout the optimization routine. Numerical results for the adaptive solution of these optimization problems are presented.

Item: An approach to modeling a multivariate spatial-temporal process (2000)
Calizzi, Mary Anne; Ensor, Katherine B.
Although modeling of spatial-temporal stochastic processes is a growing area of research, one underdeveloped area in this field is the multivariate space-time setting. The motivation for this research originates from air quality studies. By treating each air pollutant as a separate variable, the multivariate approach enables modeling of not only the behavior of the individual pollutants but also the interaction between pollutants over space and time. Studying both the spatial and the temporal aspects of the process gives a more accurate picture of the behavior of the process. A bivariate state-space model is developed and includes a covariance function which can account for the different cross-covariances across space and time. The Kalman filter is used for parameter estimation and prediction. The model is evaluated through its predictions in an air-quality application.

Item: An Examination of Some Open Problems in Time Series Analysis (Rice University, 2005)
Davis, Ginger Michelle
We investigate two open problems in the area of time series analysis. The first is developing a methodology for multivariate time series analysis when our time series has components that are both continuous and categorical. Our specific contribution is a logistic smooth transition regression (LSTR) model whose transition variable is related to a categorical variable. This methodology is necessary for series that exhibit nonlinear behavior dependent on a categorical variable. The estimation procedure is investigated both with simulation and an economic example. The second contribution to time series analysis is examining the evolving structure in multivariate time series. The application area we concentrate on is financial time series. Many models exist for the joint analysis of several financial instruments such as securities due to the fact that they are not independent. These models often assume some type of constant behavior between the instruments over the time period of analysis. Instead of imposing this assumption, we are interested in understanding the dynamic covariance structure in our multivariate financial time series, which will provide us with an understanding of changing market conditions. In order to achieve this understanding, we first develop a multivariate model for the conditional covariance and then examine that estimate for changing structure using multivariate techniques. Specifically, we simultaneously model individual stock data that belong to one of three market sectors and examine the behavior of the market as a whole as well as the behavior of the sectors. Our aims are detecting and forecasting unusual changes in the system, such as market collapses and outliers, and understanding the issue of portfolio diversification in multivariate financial series from different industry sectors. The motivation for this research concerns portfolio diversification. The false assumption that investment in different industry sectors is uncorrelated is not made. Instead, we assume that the comovement of stocks within and between sectors changes with market conditions. Some of these market conditions include market crashes or collapses and common external influences.

Item: An examination of some open problems in time series analysis (2005)
Davis, Ginger Michelle; Ensor, Katherine B.
We investigate two open problems in the area of time series analysis. The first is developing a methodology for multivariate time series analysis when our time series has components that are both continuous and categorical. Our specific contribution is a logistic smooth transition regression (LSTR) model whose transition variable is related to a categorical variable. This methodology is necessary for series that exhibit nonlinear behavior dependent on a categorical variable. The estimation procedure is investigated both with simulation and an economic example. The second contribution to time series analysis is examining the evolving structure in multivariate time series. The application area we concentrate on is financial time series. Many models exist for the joint analysis of several financial instruments such as securities due to the fact that they are not independent. These models often assume some type of constant behavior between the instruments over the time period of analysis. Instead of imposing this assumption, we are interested in understanding the dynamic covariance structure in our multivariate financial time series, which will provide us with an understanding of changing market conditions. In order to achieve this understanding, we first develop a multivariate model for the conditional covariance and then examine that estimate for changing structure using multivariate techniques. Specifically, we simultaneously model individual stock data that belong to one of three market sectors and examine the behavior of the market as a whole as well as the behavior of the sectors. Our aims are detecting and forecasting unusual changes in the system, such as market collapses and outliers, and understanding the issue of portfolio diversification in multivariate financial series from different industry sectors. The motivation for this research concerns portfolio diversification. The false assumption that investment in different industry sectors is uncorrelated is not made. Instead, we assume that the comovement of stocks within and between sectors changes with market conditions. Some of these market conditions include market crashes or collapses and common external influences.
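Both versions of the Davis work above center on tracking a changing covariance structure across stocks and sectors. As a simple point of reference, and not the conditional covariance model developed there, the sketch below computes an exponentially weighted (RiskMetrics-style) covariance path for a panel of returns; the decay parameter 0.94 and the simulated data are illustrative.

```python
import numpy as np

def ewma_covariance(returns, lam=0.94):
    """Exponentially weighted covariance matrices for a T x k return panel.

    A RiskMetrics-style baseline for time-varying comovement; not the
    multivariate conditional covariance model described in the entries above.
    """
    returns = np.asarray(returns, dtype=float)
    T, k = returns.shape
    covs = np.empty((T, k, k))
    cov = np.cov(returns, rowvar=False)  # initialize with the sample covariance
    for t in range(T):
        r = returns[t][:, None]
        cov = lam * cov + (1.0 - lam) * (r @ r.T)
        covs[t] = cov
    return covs

# Example: two correlated "sector" return series with true correlation 0.6.
rng = np.random.default_rng(1)
z = rng.normal(size=(500, 2))
rets = z @ np.array([[1.0, 0.0], [0.6, 0.8]]).T * 0.01
cov_path = ewma_covariance(rets)
corr_last = cov_path[-1, 0, 1] / np.sqrt(cov_path[-1, 0, 0] * cov_path[-1, 1, 1])
print(f"final EWMA correlation estimate: {corr_last:.2f}")
```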
Item: An Old Dog Learns New Tricks: Novel Applications of Kernel Density Estimators on Two Financial Datasets (2017-12-01)
Ginley, Matthew Cline; Ensor, Katherine B.; Scott, David W.
In our first application, we contribute two nonparametric simulation methods for analyzing Leveraged Exchange Traded Fund (LETF) return volatility and how this dynamic is related to the underlying index. LETFs are constructed to provide the indicated leverage multiple of the daily total return on an underlying index. LETFs may perform as expected on a daily basis; however, fund issuers state there is no guarantee of achieving the multiple of the index return over longer time horizons. Most, if not all, LETF returns data are difficult to model because of the extreme volatility present and limited availability of data. First, to isolate the effects of daily, leveraged compounding on LETF volatility, we propose an innovative method for simulating daily index returns with a chosen constraint on the multi-day period return. By controlling for the performance of the underlying index, the range of volatilities observed in a simulated sample can be attributed to compounding with leverage and the presence of tracking errors. Second, to overcome the limited history of LETF returns data, we propose a method for simulating implied LETF tracking errors while still accounting for their dependence on underlying index returns. This allows for the incorporation of the complete history of index returns in an LETF returns model. Our nonparametric methods are flexible: they easily incorporate any chosen number of days, leverage ratios, or period return constraints, and can be used in combination or separately to model any quantity of interest derived from daily LETF returns. For our second application, we tackle binary classification problems with extremely low class 1 proportions. These "rare events" problems are a considerable challenge, which is magnified when dealing with large datasets. Having a minuscule count of class 1 observations motivates the implementation of more sophisticated methods to minimize forecasting bias towards the majority class. We propose an alternative approach to established up-sampling or down-sampling algorithms, driven by kernel density estimators, that transforms the class labels into continuous targets. Having effectively transformed the problem from classification to regression, we argue that under the assumption of a monotonic relationship between predictors and the target, approximations of the majority class are possible in a rare events setting with the use of simple heuristics. By significantly reducing the burden posed by the majority class, the complexities of minority class membership can be modeled more effectively using monotonically constrained nonparametric regression methods. Our approach is demonstrated on a large financial dataset with an extremely low class 1 proportion. Additionally, novel feature engineering is introduced to assist in the application of the density estimator used for class label transformation.

Item: Approximate dynamic factor models for mixed frequency data (2015-10-15)
Zhao, Xin; Ensor, Katherine; Kimmel, Marek; Sizova, Natalia
Time series observed at different temporal scales cannot be simultaneously analyzed by traditional multivariate time series methods. Adjustments must be made to address issues of asynchronous observations. For example, many macroeconomic time series are published quarterly and other price series are published monthly or daily. Common solutions to the analysis of asynchronous time series include data aggregation, mixed frequency vector autoregressive models, and factor models. In this research, I set up a systematic approach to the analysis of asynchronous multivariate time series based on an approximate dynamic factor model. The methodology treats observations of various temporal frequencies as contemporaneous series. A two-step model estimation and identification scheme is proposed. This method allows explicit structural restrictions that account for appropriate temporal ordering of the mixed frequency data. The methodology consistently estimates the dynamic factors even though no prior knowledge of the factors is required. To ensure a computationally efficient and robust algorithm and model specification, I make use of modern penalized likelihood methodologies. The fitted model captures the effects of temporal relationships across the asynchronous time series in an interpretable manner. The methodology is studied through simulation and applied to several examples. The simulations and examples demonstrate good performance in model specification, estimation and out-of-sample forecasting.

Item: Autocorrelated data in quality control charts (1994)
Hood, Terri Frantom; Ensor, Katherine B.
Control charts are regularly developed with the assumption that the process observations are independent. However, a common occurrence in certain industries is the collection of autocorrelated data. Two approaches are investigated that deal with this issue. The time series approach is based on modeling the data with an appropriate time series model to remove the autocorrelative structure. The EWMA approach is based on modeling the observations as a weighted average of previous data. The residuals from the two approaches are plotted on control charts and the average run lengths are compared. Both methods are applied to simulations that generate in-control data and data that have strategically located nonstandard conditions. The nonstandard conditions simulated are process change, linear drift, mean shift, and variance shift. It is proposed that the time series approach tends to perform better in these situations.

Item: Beating the House: Identifying Inefficiencies in Sports Betting Markets (arXiv, 2019)
Ramesh, Sathya; Mostofa, Ragib; Bornstein, Marco; Dobelman, John; Center for Computational Finance and Economic Systems
Inefficient markets allow investors to consistently outperform the market. To demonstrate that inefficiencies exist in sports betting markets, we created a betting algorithm that generates above-market returns for the NFL, NBA, NCAAF, NCAAB, and WNBA betting markets. To formulate our betting strategy, we collected and examined a novel dataset of bets, and created a non-parametric win probability model to find positive expected value situations. As the United States Supreme Court has recently repealed the federal ban on sports betting, research on sports betting markets is increasingly relevant for the growing sports betting industry.
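The sports betting entry above hinges on flagging positive expected value situations from a win probability model. A minimal illustration of that expected-value check is sketched below, assuming American-style odds; the probability, odds, and stake are made-up numbers, and the paper's win probability model is not reproduced here.

```python
def expected_value(win_prob, american_odds, stake=1.0):
    """Expected profit of a bet given a modeled win probability.

    Positive values indicate a bet the model considers +EV. Odds are in
    American format; win_prob would come from whatever probability model is
    used (the paper's nonparametric model is not reproduced here).
    """
    if american_odds > 0:
        payout = stake * american_odds / 100.0
    else:
        payout = stake * 100.0 / abs(american_odds)
    return win_prob * payout - (1.0 - win_prob) * stake

# Example: a model that assigns a 55% win probability to a bet offered at -110.
print(f"EV per $1 staked: {expected_value(0.55, -110):+.3f}")
```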
Item: Characterizing Production in the Barnett Shale Resource: Essays on Efficiency, Operator Effects and Well Decline (2016-04-21)
Seitlheko, Likeleli; Hartley, Peter R
This dissertation is composed of three papers in the field of energy economics. The first paper estimates revenue and technical efficiency for more than 11,000 wells that were drilled in the Barnett between 2000 and 2010, and also examines how the efficiency estimates differ among operators. To achieve this objective, we use stochastic frontier analysis and a two-stage semi-parametric approach that consists of data envelopment analysis in the first stage and a truncated linear regression in the second stage. The stochastic frontier analysis (SFA) and data envelopment analysis (DEA) commonly identify only two operators as more revenue and technically efficient than Devon, the largest operator in the Barnett. We further find that operators have generally been effective at responding to market incentives and producing the revenue-maximizing mix of gas and oil given the prevailing prices. Coupled with this last result is the insight that most of the revenue inefficiency derives from technical inefficiency rather than allocative inefficiency. The second paper uses multilevel modeling to examine relative operator effects on revenue generation and natural gas output during the 2000-2010 period. The estimated operator effects are used to determine which operators were more effective at producing natural gas or generating revenue from oil and gas. The operators clump together into three groups (average, below average, and above average), and the effects of individual operators within each group are largely indistinguishable from one another. Among the operators estimated to have above-average effects in both the gas model and the revenue model are Chesapeake, Devon, EOG and XTO, the four largest operators in the Barnett. The results also reveal that between-operator differences account for a non-trivial portion of the residual variation in gas or revenue output that remains after controlling for well-level characteristics, and prices in the case of the revenue model. In the third paper, we estimate an econometric model describing the decline of a "typical" well in the Barnett shale. The data cover more than 15,000 wells drilled in the Barnett between 1990 and mid-2011. The analysis is directed at testing the hypothesis proposed by Patzek, Male and Marder (2014) that linear flow, rather than radial flow (the latter being consistent with the Arps (1945) system of equations), governs natural gas production within hydraulically fractured wells in extremely low permeability shale formations. To test the hypothesis, we use a fixed effects linear model with Driscoll-Kraay standard errors, which are robust to autocorrelation and cross-sectional correlation, and estimate the model separately for horizontal and vertical wells. For both horizontal and vertical shale gas wells in the Barnett, we cannot reject the hypothesis of a linear flow regime. This implies that the production profile of a Barnett well can be projected, within some reasonable margin of error, using the decline curve equation of Patzek, Male and Marder (2014) once initial production is known. We then estimate productivity tiers by sampling from the distribution of the length-normalized initial production of horizontal wells and generate type curves using the decline curve equation of Patzek, Male and Marder (2014). Finally, we calculate the drilling cost per EUR (expected ultimate recovery) and the breakeven price of natural gas for all the tiers.

Item: Computational and Statistical Methodology for Highly Structured Data (2020-09-15)
Weylandt, Michael; Ensor, Katherine B
Modern data-intensive research is typically characterized by large scale data and the impressive computational and modeling tools necessary to analyze it. Equally important, though less remarked upon, is the important structure present in large data sets. Statistical approaches that incorporate knowledge of this structure, whether spatio-temporal dependence or sparsity in a suitable basis, are essential to accurately capture the richness of modern large scale data sets. This thesis presents four novel methodologies for dealing with various types of highly structured data in a statistically rich and computationally efficient manner. The first project considers sparse regression and sparse covariance selection for complex valued data. While complex valued data is ubiquitous in spectral analysis and neuroimaging, typical machine learning techniques discard the rich structure of complex numbers, losing valuable phase information in the process. A major contribution of this project is the development of convex analysis for a class of non-smooth "Wirtinger" functions, which allows high-dimensional statistical theory to be applied in the complex domain. The second project considers clustering of large scale multi-way array ("tensor") data. Efficient clustering algorithms for convex bi-clustering and co-clustering are derived and shown to achieve an order-of-magnitude speed improvement over previous approaches. The third project considers principal component analysis for data with smooth and/or sparse structure. An efficient manifold optimization technique is proposed which can flexibly adapt to a wide variety of regularization schemes, while efficiently estimating multiple principal components. Despite the non-convexity of the manifold constraints used, it is possible to establish convergence to a stationary point. Additionally, a new family of "deflation" schemes is proposed to allow iterative estimation of nested principal components while maintaining weaker forms of orthogonality. The fourth and final project develops a multivariate volatility model for US natural gas markets. This model flexibly incorporates differing market dynamics across time scales and different spatial locations. A rigorous evaluation shows significantly improved forecasting performance both in- and out-of-sample. All four methodologies are able to flexibly incorporate prior knowledge in a statistically rigorous fashion while maintaining a high degree of computational performance.
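The first project in the Weylandt thesis above works with non-smooth penalties on complex-valued coefficients. As a small, self-contained illustration of the kind of operation involved, and not the thesis's estimators, the sketch below applies the proximal operator of the l1 norm extended to complex numbers: the modulus is soft-thresholded while the phase is preserved.

```python
import numpy as np

def complex_soft_threshold(z, lam):
    """Proximal operator of lam * ||z||_1 for complex-valued z.

    Shrinks each entry's modulus by lam (to zero if the modulus is below lam)
    while leaving its phase untouched. Illustrative building block only.
    """
    z = np.asarray(z, dtype=complex)
    modulus = np.abs(z)
    scale = np.maximum(1.0 - lam / np.maximum(modulus, 1e-300), 0.0)
    return scale * z

# Example: the small coefficient is zeroed, the large one keeps its phase.
z = np.array([0.05 + 0.05j, 2.0 - 1.0j])
print(complex_soft_threshold(z, lam=0.5))
```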
Item: Computational finance: correlation, volatility, and markets (Wiley, 2014)
Ensor, Katherine Bennett; Koev, Ginger M.
Financial data by nature are inter-related and should be analyzed using multivariate methods. Many models exist for the joint analysis of multiple financial instruments. Early models often assumed some type of constant behavior between the instruments over the time period of analysis. But today, time-varying covariance models are a key component of financial time series analysis, leading to a deeper understanding of changing market conditions. Models for covolatility of financial data quickly grow in their complexity and parameters, and 20 years of research offers a variety of solutions to this complexity. After a short introduction of univariate volatility models, this article begins with the basic multivariate formulation for time series covariance modeling and moves to leading time series tools that address this complexity. Coupling these models with regime switching via a Markov process extends the features that can be understood from market behavior. We ground this review in an example of modeling the covariance of securities within sectors and sectors within markets, with dynamics that allow for two different market regimes. Specifically, we simultaneously model individual daily stock data that belong to one of three market sectors and examine the behavior of the market as a whole as well as the behavior of the market sectors over time. A motivation for this characterization concerns portfolio diversification and stock anomalies, and we capture the changing comovement of stocks within and between sectors as market conditions change. For example, some of these market conditions include market crashes or collapses and common external influences.

Item: Covariance Estimation in Dynamic Portfolio Optimization: A Realized Single Factor Model (SSRN, 2009)
Kyj, Lada; Ostdiek, Barbara; Ensor, Katherine
Realized covariance estimation for large dimension problems is little explored and poses challenges in terms of computational burden and estimation error. In a global minimum volatility setting, we investigate the performance of covariance conditioning techniques applied to the realized covariance matrices of the 30 DJIA stocks. We find that not only is matrix conditioning necessary to deliver the benefits of high frequency data, but a single factor model, with a smoothed covariance estimate, outperforms the fully estimated realized covariance in one-step ahead forecasts. Furthermore, a mixed-frequency single-factor model, with factor coefficients estimated using low-frequency data and variances estimated using high-frequency data, performs better than the realized single-factor estimator. The mixed-frequency model is not only parsimonious but also avoids estimation of high-frequency covariances, an attractive feature for less frequently traded assets. Volatility dimension curves reveal that it is difficult to distinguish among estimators at low portfolio dimensions, but for well-conditioned estimators the performance gain relative to the benchmark 1/N portfolio increases with N.

Item: Denoising by wavelet thresholding using multivariate minimum distance partial density estimation (2006)
Scott, Alena I.; Scott, David W.
In this thesis, we consider wavelet-based denoising of signals and images contaminated with white Gaussian noise. Existing wavelet-based denoising methods are limited because they make at least one of the following three unrealistic assumptions: (1) the wavelet coefficients are independent, (2) the signal component of the wavelet coefficient distribution follows a specified parametric model, and (3) the wavelet representations of all signals of interest have the same level of sparsity. We develop an adaptive wavelet thresholding algorithm that addresses each of these issues. We model the wavelet coefficients with a two-component mixture in which the noise component is Gaussian but the signal component need not be specified. We use a new technique in density estimation that minimizes an L2 distance criterion (L2E) to estimate the parameters of the partial density that represents the noise component. The L2E estimate for the weight of the noise component, ŵ_L2E, determines the fraction of wavelet coefficients that the algorithm considers noise; we show that ŵ_L2E corresponds to the level of complexity of the signal. We also incorporate information on inter-scale dependencies by modeling across-scale (parent/child) groups of adjacent coefficients with multivariate densities estimated by L2E. To assess the performance of our method, we compare it to several standard wavelet-based denoising algorithms on a number of benchmark signals and images. We find that our method incorporating inter-scale dependencies gives results that are an improvement over most of the standard methods and are comparable to the rest. The L2E thresholding algorithm performed very well for 1-D signals, especially those with a considerable amount of high frequency content. Our method worked reasonably well for images, with some apparent advantage in denoising smaller images. In addition to providing a standalone denoising method, L2E can be used to estimate the variance of the noise in the signal for use in other thresholding methods. We also find that the L2E estimate for the noise variance is always comparable to, and sometimes better than, the conventional median absolute deviation estimator.

Item: Denoising Non-stationary Signals by Dynamic Multivariate Complex Wavelet Thresholding (SSRN, 2020)
Raath, Kim; Ensor, Katherine B.; Scott, David W.; Crivello, Alena
Over the past few years, we have seen an increased need for analyzing the dynamically changing behaviors of economic and financial time series. These needs have led to significant demand for methods that denoise non-stationary time series across time and for specific investment horizons (scales) and localized windows (blocks) of time. Wavelets have long been known to decompose non-stationary time series into their different components or scale pieces. Recent methods satisfying this demand first decompose the non-stationary time series using wavelet techniques and then apply a thresholding method to separate and capture the signal and noise components of the series. Traditionally, wavelet thresholding methods rely on the discrete wavelet transform (DWT), a static thresholding technique that may not capture the time series of the estimated variance in the additive noise process. We introduce a novel continuous wavelet transform (CWT) dynamically optimized, multivariate thresholding method. Applying this method, we are simultaneously able to separate and capture the signal and noise components while estimating the dynamic noise variance. Our method shows improved results when compared to well-known methods, especially for high-frequency, signal-rich time series, typically observed in finance. Supplementary materials for this article are available online.

Item: Denoising Non-Stationary Signals via Dynamic Multivariate Complex Wavelet Thresholding (MDPI, 2023)
Raath, Kim C.; Ensor, Katherine B.; Crivello, Alena; Scott, David W.
Over the past few years, we have seen an increased need to analyze the dynamically changing behaviors of economic and financial time series. These needs have led to significant demand for methods that denoise non-stationary time series across time and for specific investment horizons (scales) and localized windows (blocks) of time. Wavelets have long been known to decompose non-stationary time series into their different components or scale pieces. Recent methods satisfying this demand first decompose the non-stationary time series using wavelet techniques and then apply a thresholding method to separate and capture the signal and noise components of the series. Traditionally, wavelet thresholding methods rely on the discrete wavelet transform (DWT), which is a static thresholding technique that may not capture the time series of the estimated variance in the additive noise process. We introduce a novel continuous wavelet transform (CWT) dynamically optimized multivariate thresholding method (WaveL2E). Applying this method, we are simultaneously able to separate and capture the signal and noise components while estimating the dynamic noise variance. Our method shows improved results when compared to well-known methods, especially for high-frequency, signal-rich time series, typically observed in finance.
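The two denoising entries above contrast their dynamic CWT thresholding with traditional static DWT thresholding. For orientation, the sketch below implements the textbook static baseline (a universal soft threshold with a MAD noise estimate), not the WaveL2E method itself; it assumes the PyWavelets package and an arbitrary choice of wavelet and decomposition level.

```python
import numpy as np
import pywt  # PyWavelets; assumed available

def dwt_soft_denoise(signal, wavelet="db4", level=4):
    """Static DWT denoising with a universal soft threshold.

    A standard baseline: estimate the noise scale from the finest detail
    coefficients via the median absolute deviation, then soft-threshold every
    detail level with a single global threshold. The dynamic CWT thresholding
    in the entries above replaces this fixed threshold with a time-varying one.
    """
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    finest_detail = coeffs[-1]
    sigma = np.median(np.abs(finest_detail - np.median(finest_detail))) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

# Example: recover a slow sine wave from additive Gaussian noise.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)
clean = dwt_soft_denoise(noisy)
```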