Publications

Most of the information presented on this page has been retrieved from RePEc with the kind authorization of Christian Zimmermann.
Routes to the Top. Journal article. Johannes König, Christian Schluter and Carsten Schröder, Review of Income and Wealth, Volume 71, Issue 2, pp. e70015, 2025

Who makes it to the top? We use the leading socio-economic survey in Germany, supplemented by extensive data on the rich, to answer this question. We identify the key predictors for belonging to the top 1 percent of the income distribution, the wealth distribution, and both distributions jointly. Although we consider many traits, only a few matter: entrepreneurship and self-employment, in conjunction with a sizable inheritance of company assets, is the most important covariate combination across all rich groups. Our data suggest that all top 1 percent groups, but especially the joint top 1 percent, are predominantly populated by intergenerational entrepreneurs.

Clustering Approaches for Mixed-Type Data: A Comparative Study. Journal article. Badih Ghattas and Alvaro Sanchez San-Benito, Journal of Probability and Statistics, Volume 2025, Issue 1, pp. 2242100, 2025

Clustering is widely used in unsupervised learning to find homogeneous groups of observations within a dataset. However, clustering mixed-type data remains a challenge, as few existing approaches are suited for this task. This study presents the state of the art of these approaches and compares them using various simulation models. The compared methods include the distance-based approaches k-prototypes, PDQ, and convex k-means, and the probabilistic methods KAy-means for MIxed LArge data (KAMILA), the mixture of Bayesian networks (MBNs), and the latent class model (LCM). The aim is to provide insights into the behavior of the different methods across a wide range of scenarios by varying experimental factors such as the number of clusters, cluster overlap, sample size, dimension, proportion of continuous variables in the dataset, and the clusters’ distributions. The degree of cluster overlap, the proportion of continuous variables in the dataset, and the sample size have a significant impact on the observed performance. When strong interactions exist between variables alongside an explicit dependence on cluster membership, none of the evaluated methods demonstrated satisfactory performance. In our experiments, KAMILA, LCM, and k-prototypes exhibited the best performance with respect to the adjusted Rand index (ARI). All the methods are available in R.
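
For readers who want a concrete reference point for the evaluation metric, the following minimal Python sketch scores a clustering of simulated mixed-type data against known labels with the adjusted Rand index. It uses scikit-learn with naive one-hot encoding and k-means, which are illustrative assumptions on our part and not the packages or methods benchmarked in the paper.

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Simulate a small mixed-type dataset with three known clusters (illustrative only).
rng = np.random.default_rng(0)
n = 300
true_labels = rng.integers(0, 3, size=n)
num = true_labels[:, None] + rng.normal(scale=0.5, size=(n, 2))   # 2 continuous variables
cat = pd.Series(np.where(true_labels == 0, "a", "b"))             # 1 categorical variable

# Naive mixed-type handling: scale continuous columns, one-hot encode the categorical one.
X = np.hstack([
    StandardScaler().fit_transform(num),
    pd.get_dummies(cat).to_numpy(dtype=float),
])

# Cluster and compare the recovered partition with the true one via the ARI
# (1.0 = perfect recovery, values near 0 = no better than chance).
pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("ARI:", adjusted_rand_score(true_labels, pred))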

A Variational Rationality Perspective on the Quasi-Equilibrium Problem: Climbing the Goal Ladder. Journal article. Antoine Soubeyran and João Carlos De Souza, Journal of Nonlinear and Variational Analysis, Volume 9, Issue 3, pp. 461-478, 2025

In this paper, we examine the quasi-equilibrium problem from a variational rationality perspective. To this end, we first study the convergence of the proximal point method proposed by Bento et al. [Ann. Oper. Res. 316 (2022), 1301-1318] in the more general context of quasi-equilibrium problems using a Bregman distance. We then provide an application of the method through a recent behavioral perspective, namely the variational rationality approach to staying and changing human dynamics, together with the important example of climbing the goal ladder in goal pursuit theory. An illustrative simulation demonstrates that Bregman distances improve the computational performance of the method compared with the Euclidean distance.
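
For orientation, a schematic Bregman proximal point step for an equilibrium problem with bifunction f on a feasible set C can be written as follows; this is a generic sketch of the standard formulation, not necessarily the exact scheme used in the paper. Given the current iterate x^k and a step size \lambda_k > 0, find x^{k+1} \in C such that

f(x^{k+1}, y) + \frac{1}{\lambda_k}\,\big\langle \nabla h(x^{k+1}) - \nabla h(x^k),\, y - x^{k+1} \big\rangle \;\ge\; 0 \quad \text{for all } y \in C,

where D_h(y, x) = h(y) - h(x) - \langle \nabla h(x), y - x \rangle is the Bregman distance induced by a strictly convex function h; choosing h(x) = \tfrac{1}{2}\|x\|^2 recovers the usual Euclidean proximal step.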

Foreign Corrupt Practices Act (FCPA) and market quality in emerging economies. Journal article. Krishnendu Ghosh Dastidar and Makoto Yano, Managerial and Decision Economics, Volume 46, Issue 1, pp. 641-665, 2025

In many emerging economies with antiquated laws, bribes paid to government officials reduce economic impediments and serve as a device to improve market competition, thereby contributing to the modernization of the economy. In this context, this paper uses a simple two-stage game-theoretic model to investigate the effects of the US Foreign Corrupt Practices Act (FCPA) on such economies. We demonstrate, among other results, that while an increase in fines under the FCPA reduces overall corruption, it leads to a deterioration in market quality in an emerging economy. In the presence of the FCPA, an increase in the US firm's technological advantage unambiguously decreases market quality in an emerging economy.

Work organization in social enterprises: A source of job satisfaction? Journal article. Xavier Joutard, Francesca Petrella and Nadine Richez-Battesti, Kyklos, Volume 78, Issue 1, pp. 111-148, 2025

Many studies suggest that employees of social enterprises experience greater job satisfaction than employees of for-profit organizations, although their pay and employment contracts are usually less favorable. Based on linked employer–employee data from a French survey on employment characteristics and industrial relations and using a decomposition method developed by Gelbach (2016), this paper aims to explain this somewhat paradoxical result. Focusing on work organization variables, we show that the specific work organization of social enterprises explains a large part of the observed job satisfaction differential, both in general and, more specifically, in terms of satisfaction with access to training and with working conditions. Detailing the components of work organization, we find that the higher job satisfaction reported by employees in social enterprises stems from their greater autonomy and better access to information. In contrast to earlier studies, however, our results show that these work organization variables do not have more value for social enterprise employees than for for-profit employees in the case of overall job satisfaction. This result casts doubt on the widespread hypothesis that social enterprise employees attach more weight to the nonmonetary advantages of their work than their counterparts in for-profit organizations.
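
As a brief refresher on the decomposition used, Gelbach's method applies the omitted-variable-bias formula to attribute the gap between a "base" and a "full" regression coefficient to each group of added covariates; in generic notation (ours, not the paper's),

\hat{\beta}^{\mathrm{base}} - \hat{\beta}^{\mathrm{full}} \;=\; \sum_{k} \hat{\Gamma}_{k}\, \hat{\beta}^{\mathrm{full}}_{k},

where \hat{\beta}^{\mathrm{base}} and \hat{\beta}^{\mathrm{full}} are the coefficients on the social-enterprise indicator without and with the added covariates, \hat{\Gamma}_{k} comes from an auxiliary regression of covariate group k on the base regressors, and \hat{\beta}^{\mathrm{full}}_{k} is that group's coefficient in the full model. Each term \hat{\Gamma}_{k}\hat{\beta}^{\mathrm{full}}_{k} then measures how much of the raw job-satisfaction differential is accounted for by, for example, a given work-organization variable.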

Specific Sensitivity to Rare and Extreme Events: Quasi-Complete Black Swan Avoidance vs Partial Jackpot Seeking in Rat Decision-Making. Journal article. Mickael Degoulet, Louis-Matis Willem, Christelle Baunez, Stéphane Luchini and Patrick A. Pintus, eLife, Volume 13, Forthcoming

Most studies assessing animal decision-making under risk rely on probabilities that are typically larger than 10%. To study decision-making under uncertainty, we explore a novel experimental and modelling approach that aims at measuring the extent to which rats are sensitive - and how they respond - to outcomes that are both rare (probabilities smaller than 1%) and extreme in their consequences (deviations larger than 10 times the standard error). In a four-armed bandit task, stochastic gains (sugar pellets) and losses (time-out punishments) are such that extremely large - but rare - outcomes materialize or not depending on the chosen options. All rats feature both limited diversification, mixing two options out of four, and sensitivity to rare and extreme outcomes despite their infrequent occurrence, combining options so as to avoid extreme losses (Black Swans) while remaining exposed to extreme gains (Jackpots). Notably, this sensitivity turns out to be one-sided for the main phenotype in our sample: it features a quasi-complete avoidance of Black Swans, escaping extreme losses almost entirely, which contrasts with an exposure to Jackpots that is only partial. The flip side of the observed choices is that they entail smaller gains and larger losses in the frequent domain compared with the alternatives. We introduce sensitivity to Black Swans and Jackpots in a new class of augmented reinforcement learning models and estimate their parameters using the observed choices and outcomes for each rat. Adding such specific sensitivity yields a selected model that fits the behavioral observations well, with simulated behaviors close to them, whereas a standard Q-learning model without this sensitivity is rejected for almost all rats. The model reproducing the main phenotype suggests that frequent outcomes are treated separately from rare and extreme ones through different weights in decision-making.
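
To make the modelling idea concrete, here is a minimal Python sketch of a Q-learning update in which outcomes beyond a fixed extremeness threshold receive their own learning weight in a four-armed bandit; the threshold, the extra learning rate, the softmax choice rule, and the payoff scheme are illustrative assumptions and not the authors' estimated model.

import numpy as np

def update_q(q, action, reward, alpha=0.1, alpha_extreme=0.5, threshold=10.0):
    # One Q-learning step where rare, extreme outcomes (|reward| > threshold)
    # are weighted by their own learning rate (illustrative two-rate scheme).
    rate = alpha_extreme if abs(reward) > threshold else alpha
    q[action] += rate * (reward - q[action])
    return q

def softmax_choice(q, beta, rng):
    # Choose an arm with softmax probabilities over the current Q-values.
    p = np.exp(beta * (q - q.max()))
    p /= p.sum()
    return rng.choice(len(q), p=p)

# Toy four-armed bandit: arm 0 carries a rare extreme loss ("Black Swan"),
# arm 3 a rare extreme gain ("Jackpot"), the others ordinary stochastic payoffs.
rng = np.random.default_rng(0)
q = np.zeros(4)
for t in range(5000):
    a = softmax_choice(q, beta=2.0, rng=rng)
    if a == 0:
        r = -50.0 if rng.random() < 0.005 else 2.0
    elif a == 3:
        r = 50.0 if rng.random() < 0.005 else 1.0
    else:
        r = rng.normal(1.5, 0.5)
    q = update_q(q, a, r)
print(q)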

Financial and Oil Market’s Co-Movements by a Regime-Switching Copula. Journal article. Manel Soury, Econometrics, Volume 12, Issue 2, pp. 14, Forthcoming

Over the years, oil prices and financial stock markets have always had a complex relationship. This paper analyzes the interactions and co-movements between the oil market (WTI crude oil) and two major stock markets in Europe and the US (the Euro Stoxx 50 and the S&P 500) over the period from 1990 to 2023. For that, I use both the time-varying and the Markov copula models. The latter extends the former: the constant term of the dynamic dependence parameter is driven by a hidden two-state first-order Markov chain, so the model is also called the dynamic regime-switching (RS) copula model. To estimate the model, I use the inference function for margins (IFM) method together with Kim’s filter for the Markov switching process. The marginals of the returns are modeled by GARCH and GAS models. Empirical results show that the RS copula model is adequate to measure and evaluate the time-varying and non-linear dependence structure. Two persistent regimes of high and low dependence are detected, with a jump in the co-movements of both pairs during high regimes associated with instability and crises. In addition, the extreme dependence between crude oil and the US/European stock markets is time-varying and asymmetric, as indicated by the SJC copula: the correlation in the lower tail is higher than that in the upper tail, so oil and stock returns are more closely joined and tend to co-move more strongly in bearish periods than in bullish periods. Finally, the dependence between WTI crude oil and the S&P 500 index seems to be more affected by exogenous shocks and instability than that between oil and the European stock market.
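
In generic notation (ours, not necessarily the paper's), a regime-switching dynamic copula of this kind keeps a Patton-style evolution equation for the dependence parameter but lets its constant term depend on a hidden two-state Markov chain s_t:

\theta_t \;=\; \Lambda\!\Big( \omega_{s_t} + \beta\,\theta_{t-1} + \alpha\,\tfrac{1}{m}\sum_{j=1}^{m} \big|u_{t-j} - v_{t-j}\big| \Big), \qquad s_t \in \{1, 2\},

where (u_t, v_t) are the probability-integral transforms of the two return series obtained from the GARCH/GAS marginals, \Lambda maps the recursion into the copula's admissible parameter range, and the regime-specific intercepts \omega_1 and \omega_2 generate the high- and low-dependence regimes whose probabilities are tracked with Kim's filter.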

Random Informative Advertising with Vertically Differentiated Products. Journal article. Rim Lahmandi-Ayed and Didier Laussel, Games, Volume 15, Issue 2, pp. 10, Forthcoming

We study a simple model in which two vertically differentiated firms compete in prices and mass advertising on an initially uninformed market. Consumers differ in their preference for quality. There is an upper bound on prices, since consumers cannot spend more on the good than a fixed amount (say, their income). Depending on this income and on the ratio between the advertising cost and the quality differential (the relative advertising cost), either there is no equilibrium in pure strategies or there exists one of the following three types: (1) an interior equilibrium, where both firms have positive natural markets and charge prices lower than the consumer’s income; (2) a constrained interior equilibrium, where both firms have positive natural markets and the high-quality firm charges the consumer’s income; or (3) a corner equilibrium, where the low-quality firm has no natural market and sells only to uninformed customers. We show that no corner equilibrium exists in which the high-quality firm would have a null natural market. At an equilibrium (whenever one exists), the high-quality firm always advertises more, charges a higher price and makes a higher profit than the low-quality one. As the relative advertising cost goes to infinity, prices become equal, and the advertising intensities and profits converge to zero. The advertising intensities are, at least globally, increasing with the quality differential. Finally, in all cases, as the advertising cost parameter increases unboundedly, both prices converge increasingly towards the consumer’s income.

Prioritisation of infectious diseases from a public health perspective: a multi-criteria decision analysis study, France, 2024. Journal article. Dominique Ploin, Mathilde Alexandre, Bruno Ventelou, Didier Che, Bruno Coignard, Nathalie Boulanger, Christophe Burucoa, François Caron, Pierre Gallian, Yves Hansmann, et al., Eurosurveillance, Volume 29, Issue 50, pp. 2400074, Forthcoming

Background: Within the International Health Regulations framework, the French High Council for Public Health was mandated in 2022 by health authorities to establish a list of priority infectious diseases for public health, surveillance and research in mainland and overseas France. Aim: Our objective was to establish this list. Methods: A multi-criteria decision analysis was used, as recommended by the European Centre for Disease Prevention and Control. A list of 95 entities (infectious diseases or groups of these, including the World Health Organization (WHO)-labelled ‘Disease X’) was established by 17 infectious disease experts. Ten criteria were defined to score entities: incidence rate, case fatality rate, potential for emergence and spread, impact on the individual, on society, on socially vulnerable groups, and on the healthcare system, and the need for new preventive tools, new curative therapies, and surveillance. Each criterion was assigned a relative weight by 77 multidisciplinary experts. For each entity, 98 physicians from various specialties rated each criterion against the entity using a four-class Likert-type scale; the ratings were converted into numeric values with a nonlinear scale and weighted accordingly to calculate the entity score. Results: Fifteen entities were ranked as high priorities, including Disease X and 14 known pathologies (e.g. haemorrhagic fevers, various respiratory viral infections, arboviral infections, multidrug-resistant bacterial infections, invasive meningococcal and pneumococcal diseases, prion diseases, rabies, and tuberculosis). Conclusion: The priority entities agreed with those of the WHO in 2023; almost all are currently covered by the French surveillance and alert system. Repeating this analysis periodically would keep the list updated.
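
As an illustration of the scoring mechanics only, the short Python sketch below converts one entity's ten Likert ratings into numeric values with a nonlinear scale and forms the weighted sum used for ranking; the particular value mapping, weights, and example entities are made-up placeholders, not the values elicited from the experts in the study.

import numpy as np

# Hypothetical nonlinear mapping from the four Likert classes to numeric values.
likert_to_value = {1: 0.0, 2: 1.0, 3: 4.0, 4: 9.0}

# Hypothetical relative weights for the ten criteria (they sum to 1).
criteria_weights = np.array([0.15, 0.15, 0.12, 0.10, 0.10,
                             0.08, 0.10, 0.07, 0.07, 0.06])

def entity_score(likert_ratings):
    # Convert an entity's ten Likert ratings (1-4) to numeric values,
    # then return the weighted sum used to rank entities.
    values = np.array([likert_to_value[r] for r in likert_ratings])
    return float(values @ criteria_weights)

ratings = {
    "Disease X":            [4, 4, 4, 4, 4, 3, 4, 4, 4, 4],
    "Hypothetical disease": [3, 2, 2, 3, 2, 2, 3, 2, 2, 3],
}
for entity, r in ratings.items():
    print(entity, round(entity_score(r), 2))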

Does improving diagnostic accuracy increase artificial intelligence adoption? A public acceptance survey using randomized scenarios of diagnostic methods. Journal article. Yulin Hswen, Ismael Rafai, Antoine Lacombe, Bérengère Davin-Casalena, Dimitri Dubois, Thierry Blayac and Bruno Ventelou, Artificial Intelligence in Health, Volume 2, Issue 1, pp. 114-120, Forthcoming

This study examines the acceptance of artificial intelligence (AI)-based diagnostic alternatives compared to traditional biological testing through a randomized scenario experiment in the domain of neurodegenerative diseases (NDs). A total of 3225 pairwise choices of ND risk-prediction tools were offered to participants, with 1482 choices comparing AI with a biological saliva test and 1743 comparing AI+ with the saliva test (AI+ using digital consumer data in addition to electronic medical data). Overall, only 36.68% of responses showed a preference for the AI/AI+ alternatives. Stratified by AI sensitivity levels, acceptance rates for AI/AI+ were 35.04% at 60% sensitivity and 31.63% at 70% sensitivity, and increased markedly to 48.68% at 95% sensitivity (p < 0.01). Similarly, acceptance rates by specificity were 29.68%, 28.18%, and 44.24% at 60%, 70%, and 95% specificity, respectively (p < 0.01). Notably, AI consistently garnered higher acceptance rates (45.82%) than AI+ (28.92%) at comparable sensitivity and specificity levels, except at 60% sensitivity, where no significant difference was observed. These results highlight nuanced preferences for AI diagnostics, with higher sensitivity and specificity significantly driving acceptance.