Marimoutou

Publications

Federal minimum wage hikes do reduce teenage employment. A replication study of Bazen & Marimoutou (Oxford Bulletin of Economics and Statistics, 2002), Journal article, Stephen Bazen and Vêlayoudom Marimoutou, International Journal for Re-Views in Empirical Economics (IREE), Volume 2, Issue 2018-5, pp. 1-13, 2018

In 2002 we published a paper in which we used state space time series methods to analyse the teenage employment-federal minimum wage relationship in the US (Bazen and Marimoutou, 2002). The study used quarterly data for the 46-year period running from 1954 to 1999. We detected a small, negative but statistically significant effect of the federal minimum wage on teenage employment, at a time when some studies were casting doubt on the existence of such an effect. In this note we re-estimate the original model with a further 16 years of data (up to 2015). We find that the model satisfactorily tracks the path of the teenage employment-population ratio over this 60-year period, and yields a consistently negative and statistically significant effect of minimum wages on teenage employment. The conclusion reached is the same as in the original paper, and the elasticity estimates are very similar: federal minimum wage hikes lead to a reduction in teenage employment with a short-run elasticity of around -0.13. The estimated long-run elasticity of between -0.37 and -0.47 is less stable, but is nevertheless negative and statistically significant.
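
A minimal sketch of the kind of state space regression described above (my own illustration, not the authors' exact specification or data): a local linear trend component tracks the log employment-population ratio while the log minimum wage enters as an exogenous regressor, whose coefficient can be read as the short-run elasticity. The series are simulated placeholders.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical quarterly series in logs (placeholders for the real data).
rng = np.random.default_rng(0)
n = 248
log_minwage = np.cumsum(rng.normal(0, 0.01, n)) + 1.0
log_emp = (0.5 - 0.13 * log_minwage              # "true" short-run elasticity
           + np.cumsum(rng.normal(0, 0.005, n))  # slowly moving trend
           + rng.normal(0, 0.01, n))             # noise

# Local linear trend plus an exogenous minimum wage regressor.
model = sm.tsa.UnobservedComponents(
    log_emp,
    level="local linear trend",
    exog=log_minwage,
)
res = model.fit(disp=False)
print(res.params)  # the exog coefficient approximates the short-run elasticity
```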

The memory of ENSO revisited by a 2-factor Gegenbauer process, Journal article, Audrey Lustig, Philippe Charlot and Vêlayoudom Marimoutou, International Journal of Climatology, Volume 37, Issue 5, pp. 2295-2303, 2017

This article specifies a multi-factor long memory process, namely a Gegenbauer process, particularly adapted to data with slowly damping correlations and cyclical patterns, and explores the use of this representation to capture the range of inter-annual climate variability in indices of the El Niño-Southern Oscillation (ENSO). The empirical results suggest that sea surface temperature (SST) indices are stationary long memory processes. It is found that the indices in the eastern and central Pacific exhibit different dynamics. The variability of the eastern equatorial Pacific SST indices (Niño 1+2 and Niño 3) is characterized by a large component of long-memory behaviour associated with the quasi-biennial and the semi-annual frequency. In contrast, the variability of the central Pacific SST indices (Niño 3.4 and Niño 4) is characterized by a large component of long-memory behaviour associated with the annual and the semi-annual frequency. These results are consistent with recent studies suggesting that ENSO SST anomalies in the equatorial Pacific can be considered to consist of two processes. Gegenbauer processes can therefore be considered a competitive alternative procedure for the analysis of cyclical long memory climatological time series, offering a different time series perspective.
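
A minimal sketch of a one-factor Gegenbauer simulation (my own illustration, not the paper's code): the process (1 - 2uB + B^2)^(-d) eps_t is generated by truncating its infinite moving-average expansion, whose weights follow the Gegenbauer polynomial recursion; the two-factor version used in the paper would apply two such filters in sequence.

```python
import numpy as np

def gegenbauer_weights(d, u, m):
    """First m coefficients c_j of (1 - 2*u*z + z**2)**(-d) = sum_j c_j * z**j."""
    c = np.zeros(m)
    c[0] = 1.0
    if m > 1:
        c[1] = 2.0 * d * u
    for j in range(2, m):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

def simulate_gegenbauer(n, d, freq, m=2000, seed=0):
    """Simulate n draws of a one-factor Gegenbauer process with frequency freq (radians)."""
    u = np.cos(freq)
    eps = np.random.default_rng(seed).standard_normal(n + m)
    c = gegenbauer_weights(d, u, m)
    # truncated MA(infinity) representation: x_t = sum_j c_j * eps_{t-j}
    return np.convolve(eps, c, mode="full")[m - 1:m - 1 + n]

x = simulate_gegenbauer(2000, d=0.3, freq=2 * np.pi / 12)  # e.g. an annual cycle in monthly data
```

For |u| < 1 the process is stationary with long memory at the frequency freq whenever 0 < d < 1/2, which is the persistent cyclical behaviour the abstract describes.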

Energy markets and CO2 emissions: Analysis by stochastic copula autoregressive model, Journal article, Vêlayoudom Marimoutou and Manel Soury, Energy, Volume 88, pp. 417-429, 2015

We examine the dependence between the volatility of carbon dioxide (CO2) emission prices and the volatility of one of their fundamental components, energy prices. The dependence between the returns is modelled by a particular class of copulas, the SCAR (stochastic copula autoregressive) copulas, a time-varying copula first introduced by Hafner and Manner (2012), in which the parameter driving the dynamics of the copula follows a stochastic autoregressive process. The standard maximum likelihood method is used together with the EIS (efficient importance sampling) method to evaluate the high-dimensional integral in the expression of the likelihood function. The main result suggests that the dependence between the volatility of CO2 emission prices and the volatility of energy returns (coal, natural gas and Brent oil prices) does vary over time: it changes little in stable periods but rises noticeably during periods of crisis and turmoil.
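
A minimal sketch of the SCAR idea (my own simplified simulation, not Hafner and Manner's implementation; the function name and parameter values are illustrative): a Gaussian copula whose correlation is driven by a latent Gaussian AR(1) state mapped into (-1, 1).

```python
import numpy as np
from scipy.stats import norm

def simulate_scar_gaussian(n, alpha=0.02, beta=0.97, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    lam = np.zeros(n)          # latent Gaussian AR(1) state
    u = np.zeros((n, 2))       # copula observations on [0, 1]^2
    for t in range(n):
        prev = lam[t - 1] if t > 0 else 0.0
        lam[t] = alpha + beta * prev + sigma * rng.standard_normal()
        rho = np.tanh(lam[t])  # map the state into (-1, 1)
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal()
        u[t] = norm.cdf([z1, z2])   # uniforms tied by a Gaussian copula with rho_t
    return u, np.tanh(lam)
```

Estimation is the hard part: the likelihood integrates over the whole latent path, which the paper evaluates with efficient importance sampling; the sketch above only simulates from the model.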

On the relationship between the prices of oil and the precious metals: Revisiting with a multivariate regime-switching decision tree, Journal article, Philippe Charlot and Vêlayoudom Marimoutou, Energy Economics, Volume 44, pp. 456-467, 2014

This study examines the volatility and correlation, and the relationships between them, among the euro/US dollar exchange rate, the S&P500 equity indices, and the prices of WTI crude oil and the precious metals (gold, silver, and platinum) over the period 2005 to 2012. Our model links the univariate volatilities with the correlations via a hidden stochastic decision tree. The ensuing Hidden Markov Decision Tree (HMDT) model, introduced by Jordan et al. (1997), is in fact an extension of the Hidden Markov Model (HMM). The architecture of this model is the opposite of that of the classical deterministic approach based on a binary decision tree, and it allows a probabilistic vision of the relationship between univariate volatility and correlation. Our results are categorized into three groups, namely (1) exchange rates and oil, (2) S&P500 indices, and (3) precious metals. Switching dynamics are seen to characterize the volatilities, while, in the case of the correlations, the series switch from one regime to another, with this movement peaking during the subprime crisis in the US and again in the days following the Tohoku earthquake in Japan. Our findings show that the relationships between volatility and correlation depend on the nature of the series considered, sometimes corresponding to those found in econometric studies, according to which correlation increases in bear markets, and at other times differing from them.
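
As a drastically simplified stand-in for the HMDT (my own illustration of the regime-switching flavour only, not the model in the paper): a two-state hidden Markov chain switches the correlation between two simulated return series between a calm and a turbulent regime.

```python
import numpy as np

def simulate_switching_corr(n, rho=(0.2, 0.8), p_stay=0.98, seed=0):
    rng = np.random.default_rng(seed)
    state = 0
    states = np.zeros(n, dtype=int)
    returns = np.zeros((n, 2))
    for t in range(n):
        if rng.random() > p_stay:        # leave the current regime
            state = 1 - state
        r = rho[state]                   # regime-specific correlation
        z1 = rng.standard_normal()
        z2 = r * z1 + np.sqrt(1.0 - r ** 2) * rng.standard_normal()
        states[t], returns[t] = state, (z1, z2)
    return states, returns
```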

Finite sample properties of tests for STGARCH models and application to the US stock returns, Book chapter, Gilles Dufrénot, Vêlayoudom Marimoutou and Anne Péguin-Feissolle, In: Progress in Financial Markets Research, C. Kyrtsou (Eds.), pp. 83-101, Nova Science Publishers, New York, 2012

Fiscal Federalism, State Lobbying and Discretionary Finance: Evidence from India, Journal article, Rongili Biswas, Sugata Marjit and Vêlayoudom Marimoutou, Economics & Politics, Volume 22, Issue 1, pp. 68-91, 2010

In the quasi-federal democratic polity that India has, lobbying for central funds by the states is often done in a subliminal fashion. Hence, it is difficult to get an account of how much lobbying has been directed to a particular end. Our paper constructs political proxy variables to quantify the extent of such lobbying in India. We quantify lobbying through ministerial representation in the council of ministers. We also use several time and state dummies to account for the constituent states' political alignment with the center, as well as for the coalition and reform-period breaks in the Indian system. Using panel data covering 29 years and 14 major states, we show that our constructed variables explain disparity in central fiscal disbursements under the non-formulaic "discretionary" head in a robust way. Our findings remain true even after we take into account the endogeneity of net state income in the determination of transfers. Additionally, our exercise brings to the fore the fact that coalition governments and economic reform measures significantly affect state lobbying at the center.

Extreme Value Theory and Value at Risk: Application to oil market, Journal article, Vêlayoudom Marimoutou, Bechir Raggad and Abdelwahed Trabelsi, Energy Economics, Volume 31, Issue 4, pp. 519-530, 2009

Recent increases in energy prices, especially oil prices, have become a principal concern for consumers, corporations, and governments. Most analysts believe that oil price fluctuations have considerable consequences for economic activity. Oil markets have become relatively free, resulting in a high degree of oil-price volatility and generating radical changes in world energy and oil industries. Consequently, oil markets are naturally vulnerable to significant high price shifts; an example is the oil embargo crisis of 1973. In this newly created climate, protection against market risk has become a necessity. Value at Risk (VaR) measures risk exposure at a given probability level and is very important for risk management. Appealing aspects of Extreme Value Theory (EVT) have made convincing arguments for its use in managing energy price risks. In this paper, we model VaR for long and short trading positions in the oil market by applying both unconditional and conditional EVT models to forecast Value at Risk. These models are compared with the performance of other well-known modelling techniques, such as GARCH, Historical Simulation and Filtered Historical Simulation. Both the conditional EVT and Filtered Historical Simulation procedures offer a major improvement over the conventional methods. Furthermore, the GARCH(1,1)-t model may provide equally good results, comparable to those of the two combined procedures. Finally, our results confirm the importance of the filtering process for the success of the standard approaches.
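
A minimal sketch of the unconditional peaks-over-threshold step mentioned above (my own illustration with simulated stand-in data, not the paper's code): a generalized Pareto distribution is fitted to losses above a high threshold and inverted into a tail quantile, i.e. the VaR.

```python
import numpy as np
from scipy.stats import genpareto

def evt_var(losses, q=0.99, threshold_quantile=0.95):
    losses = np.asarray(losses)
    u = np.quantile(losses, threshold_quantile)      # POT threshold
    excess = losses[losses > u] - u
    xi, _, beta = genpareto.fit(excess, floc=0.0)    # GPD shape and scale
    p_u = excess.size / losses.size                  # exceedance probability
    return u + (beta / xi) * (((1 - q) / p_u) ** (-xi) - 1)

# Example on simulated heavy-tailed losses (hypothetical oil-return data).
rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=5000) * 0.02
print(evt_var(losses, q=0.99))
```

The conditional EVT variant first filters the returns with a GARCH model, applies the same GPD step to the standardized residuals, and then scales the quantile back by the forecast volatility.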

The "distance-varying" gravity model in international economics: is the distance an obstacle to trade?Journal articleVêlayoudom Marimoutou, Denis Péguin et Anne Péguin-Feissolle, Economics Bulletin, Volume 29, Issue 2, pp. 1139-1155, 2009

In this paper, we address the role of the distance between trading partners by allowing the coefficients of a standard gravity model to vary. Distance can be interpreted as an indicator of the cost of entry into a market (a fixed cost): the greater the distance, the higher the entry cost, and the larger the market needs to be to cover that entry cost. To explore this idea, the paper uses a method called Flexible Least Squares. By allowing the parameters of the gravity model to vary over the observations, our main result is that the larger the partner's GDP, the less distance is an obstacle to trade.
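
A minimal sketch of flexible least squares as I understand it (the Kalaba-Tesfatsion formulation; not the authors' code, and the function name is mine): the coefficients are allowed to drift from one observation to the next, the drift is penalised by a smoothness weight mu, and the whole problem is solved as one stacked least squares.

```python
import numpy as np

def flexible_least_squares(y, X, mu=10.0):
    """Return a (T, k) array of coefficient vectors, one per observation."""
    T, k = X.shape
    A = np.zeros((T + (T - 1) * k, T * k))
    b = np.zeros(T + (T - 1) * k)
    for t in range(T):                       # measurement equations y_t = x_t' b_t
        A[t, t * k:(t + 1) * k] = X[t]
        b[t] = y[t]
    s = np.sqrt(mu)
    for t in range(T - 1):                   # smoothness penalties b_{t+1} ~ b_t
        for j in range(k):
            row = T + t * k + j
            A[row, t * k + j] = -s
            A[row, (t + 1) * k + j] = s
    beta = np.linalg.lstsq(A, b, rcond=None)[0]
    return beta.reshape(T, k)
```

In the gravity application one would order the country-pair observations by partner GDP and track how the distance coefficient shrinks in absolute value as GDP grows, which is the result reported above.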

Econometric Modeling and Inference, Book, Jean-Pierre Florens, Vêlayoudom Marimoutou and Anne Péguin-Feissolle, Themes in Modern Econometrics, 496 pages, Cambridge University Press, 2007

This book presents the main statistical tools of econometrics, focusing specifically on modern econometric methodology. The authors unify the approach by using a small number of estimation techniques, mainly generalized method of moments (GMM) estimation and kernel smoothing. The choice of GMM is explained by its relevance in structural econometrics and its preeminent position in econometrics overall. The book is split into four parts: Part I explains general methods; Part II studies statistical models that are best suited to microeconomic data; Part III deals with dynamic models designed for macroeconomic and financial applications; and Part IV synthesizes a set of problems that are specific to statistical methods in structural econometrics, namely identification and over-identification, simultaneity, and unobservability. Many theoretical examples illustrate the discussion and can be treated as application exercises. Nobel Laureate James J. Heckman offers a foreword to the work.
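
To illustrate the GMM methodology the book is organised around, here is a generic two-step GMM estimator for a linear model with instruments (my own sketch, not taken from the book; names and data are illustrative).

```python
import numpy as np

def gmm_linear_iv(y, X, Z):
    """Two-step GMM for y = X beta + e using moment conditions E[Z'(y - X beta)] = 0."""
    n = len(y)
    W = np.linalg.inv(Z.T @ Z / n)                   # first-step weight matrix
    def step(W):
        A = X.T @ Z @ W @ Z.T @ X
        c = X.T @ Z @ W @ Z.T @ y
        return np.linalg.solve(A, c)
    beta1 = step(W)
    u = y - X @ beta1
    S = (Z * u[:, None]).T @ (Z * u[:, None]) / n    # moment covariance estimate
    return step(np.linalg.inv(S))                    # efficient second step

# Tiny synthetic demo (hypothetical data): one endogenous regressor, two instruments.
rng = np.random.default_rng(0)
n = 1000
z = rng.standard_normal((n, 2))
v = rng.standard_normal(n)
x = z @ np.array([1.0, 0.5]) + v
y = 2.0 * x + v + rng.standard_normal(n)             # x is endogenous through v
X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])
print(gmm_linear_iv(y, X, Z))                         # roughly [0, 2]
```

With more instruments than regressors, the second step reweights the moments by the inverse of their estimated covariance, which is the standard efficiency argument for two-step GMM.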

Estimation Methods of the Long Memory Parameter: Monte Carlo Analysis and Application, Journal article, Mohamed Boutahar, Vêlayoudom Marimoutou and Leïla Nouira, Journal of Applied Statistics, Volume 34, Issue 3, pp. 261-301, 2007

Since the seminal paper of Granger & Joyeux (1980), the concept of long memory has focused the attention of many statisticians and econometricians trying to model and measure the persistence of stationary processes. Many methods for estimating d, the long-range dependence parameter, have been suggested since the work of Hurst (1951). They can be grouped into three classes: heuristic methods, semi-parametric methods and maximum likelihood methods. In this paper, we try, by simulation, to verify the two main properties of the estimator of d: consistency and asymptotic normality. It is therefore very important for practitioners to compare the performance of the various classes of estimators. The results indicate that only the semi-parametric and the maximum likelihood methods give good estimators. They also suggest that the AR component of the ARFIMA(1, d, 0) process has an important impact on the properties of the different estimators, and that the Whittle method is the best one, since it has the smallest mean squared error. We finally carry out an empirical application using the monthly seasonally adjusted US inflation series, in order to illustrate the usefulness of the different estimation methods in the context of real data.
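
A minimal sketch of one of the semi-parametric estimators typically compared in this kind of study, the GPH log-periodogram regression (my own illustration, not the paper's code): d is read off as minus the slope of a regression of the log periodogram on log(4 sin^2(lambda/2)) at the lowest Fourier frequencies.

```python
import numpy as np

def gph_estimate(x, bandwidth=None):
    """GPH estimator of the long memory parameter d."""
    x = np.asarray(x, dtype=float)
    n = x.size
    m = bandwidth or int(n ** 0.5)               # number of Fourier frequencies used
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # periodogram at the first m Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    X = np.column_stack([np.ones(m), regressor])
    coef = np.linalg.lstsq(X, np.log(I), rcond=None)[0]
    return -coef[1]                               # slope is -d
```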