
  • Beseda KPMS/Department Colloquium (LS/ST 2023/2024)

    The program will be specified during the semester.

  • Limit theorems for random fields

    28. 2. 2024
    Prof. Dalibor Volný (Université de Rouen Normandie)

    Abstract: TBD
  • Stochastic methods in crystallography (PhD thesis presentation)

    13. 3. 2024
    Mgr. Iva Karafiátová (KPMS MFF UK)

    Abstract: The mechanical properties of a polycrystalline material are determined by its microstructure, which is characterised by grain morphology and crystallographic orientations of the individual grains. The orientation distribution conditioned on the underlying neighbouring structure is crucial in determining how much the material will deform. However, describing the orientation is difficult due to the presence of symmetries. We develop several tools to assess the dependence among the orientations. Besides descriptive characteristics and various independence tests, we introduce interaction models for the joint distribution of crystallographic orientations. Such models may be used as a substitute for expensive laboratory experiments.
  • Bayesian hierarchical modeling of spatial extremes

    27. 3. 2024
    Prof. Christian Genest (McGill University, Montreal)

    Abstract: Climate change and global warming have increased the need to assess and forecast environmental risk over large domains and to develop models for the extremes of natural phenomena such as droughts, floods, torrential precipitation, and heat waves. Because catastrophic events are rare and evidence is limited, Bayesian methods are well suited for the areal analysis of their frequency and size. In this talk, a multi-site modeling strategy for extremes will be described in which spatial dependence is captured through a latent Gaussian random field whose behavior is driven by synthetic covariates from climate reconstruction models. It will be seen through two vignettes that the site-to-site information-sharing mechanism built into this approach not only generally improves inference at any location but also allows for smooth interpolation over large, sparse domains.
    The first application will concern the quantification of the magnitude of extreme surges on the Atlantic coast of Canada as part of the development of an overland flood protection product by an insurance company. The second illustration will show how coherent estimates of extreme precipitation of several durations, based on a Bayesian hierarchical spatial model, enhance the current methodology for the construction, at monitored and unmonitored locations, of IDF curves commonly used in infrastructure design, flood protection, and urban drainage or water management.
  • Analyzing U.S. Senate Speeches with a Structural Text-Based Scaling Model

    10. 4. 2024
    RNDr. Jan Vávra, Ph.D. (Vienna University of Economics and Business)

    Abstract: The Structural Text-Based Scaling Model (STBSM) identifies latent topics in documents and infers ideological positions of authors in a flexible way. STBSM extends the hierarchical Poisson factorization model for inferring topics from document-term matrices in two ways: (1) unidimensional topic-specific ideological positions influence topic-specific word choice, and (2) covariates characterize variation in ideological positions through a regression framework. A flexible hierarchical prior structure in the generative model captures the variability in word popularity, topic prevalence, author verbosity, and more. The posterior is estimated by stochastic variational inference with an emphasis on direct coordinate-ascent updates. The application to U.S. Senate speeches from session 114 reveals the key polarizing topics and contributes to a better understanding of the dynamics of political discourse.

    24. 4. 2024

  • Title 1: Pathwise Duality of Interacting Particle Systems (PhD thesis presentations)

    22. 5. 2024
    Jan Niklas Latz, DSc. (UTIA AV, KPMS MFF UK)

    Abstract: In the study of Markov processes, duality is an important tool used to prove various types of long-time behavior. Nowadays, there exist two predominant approaches to Markov process duality: the algebraic one and the pathwise one. Using the well-known contact process as an example, this talk introduces the general idea of how to construct a pathwise duality for an interacting particle system. Afterwards, several different approaches to constructing pathwise dualities are presented. This is joint work with Jan M. Swart.

    Title 2: Multivariate Probabilistic Forecasting of Electricity Prices With Trading Applications (PhD thesis presentations)

    22. 5. 2024
    RNDr. Karel Kozmík (KPMS MFF UK)

    Abstract: A recently introduced approach is extended to probabilistic electricity price forecasting (EPF), utilizing distributional artificial neural networks based on a regularized distributional multilayer perceptron (DMLP). We develop this technique for the multivariate EPF case with dependence incorporated. The performance of fully connected and LSTM neural network architectures is tested. The empirical application analyzes two day-ahead electricity auctions for the United Kingdom market, which creates the opportunity to buy in the first auction at a lower price and sell in the second at a higher price (or vice versa). Utilizing the forecasting results, we develop trading strategies for various investors’ objectives. We find that, while the DMLP shows performance similar to the benchmarks, the algorithm is considerably less computationally costly.
  • Beseda KPMS/Department Colloquium (ZS/WS 2023/2024)

    The program will be specified during the semester.

  • Statistical prediction of load-sharing systems and sign depth tests

    Prof. Christine Müller (TU Dortmund, Department of Statistics)

    Abstract: Load-sharing systems are systems with several components which together carry an external load. As soon as one component fails, the load is redistributed onto the remaining components, which then have to carry more. The failure times can be modelled by a birth process if they can be observed. We use this approach for the failures of tension wires in concrete beams used in bridges, where the failure times can be located by acoustic measurements. However, the failures of tension wires can only be observed if the beams are exposed to unrealistically high load. To predict the time of a critical number of failures under realistic load, so-called accelerated life-time experiments can be used: the behaviour under low load is estimated from experiments with much higher load. The aim is not only a point prediction for the time of a critical number of failures under realistic low load, but also prediction intervals which include the correct time with high probability. One possibility to do this is the use of confidence sets based on classical maximum likelihood estimators. This method is rather sensitive to outliers and abnormal observations. Therefore, we also propose a method based on the so-called sign depth, which originated from regression depth. The so-called K-sign depth counts the relative number of K-tuples of residuals with alternating signs. Tests based on it are called K-depth tests. For K = 2, this test is equivalent to a classical sign test, which has very poor power in complex models. However, as soon as K is greater than 2, it is a very powerful test. We demonstrate this by simulations. For K = 3, we can even prove consistency of these tests in rather general situations. Moreover, we show that they lead to reasonably small prediction intervals for our application.
    As an outlook, we demonstrate how the simple load-sharing model can be extended to a model with damage accumulation. This is a self-exciting point process and can be treated by the K-sign depth as well.
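    The counting behind the K-sign depth is simple enough to sketch in a few lines. The following naive O(n^K) implementation is an illustration only (not the speaker's code; efficient versions exist, and zero residuals would need a tie-breaking convention):

```python
from itertools import combinations

def k_sign_depth(residuals, K=3):
    """Relative number of K-tuples of residuals with alternating signs.

    Naive O(n^K) enumeration, purely for illustration; zero residuals
    are treated as negative here.
    """
    signs = [1 if r > 0 else -1 for r in residuals]
    total = alternating = 0
    for idx in combinations(range(len(signs)), K):
        total += 1
        # the tuple alternates iff each consecutive pair has opposite signs
        if all(signs[idx[j + 1]] == -signs[idx[j]] for j in range(K - 1)):
            alternating += 1
    return alternating / total

# K-tuples pick arbitrary (not necessarily adjacent) positions, so even a
# perfectly alternating residual sequence gives depth 1/2 here:
print(k_sign_depth([0.5, -0.2, 0.3, -0.7, 0.1], K=3))  # -> 0.5
```

    Under a correctly specified model, the residual signs behave like fair coin flips and the 3-sign depth concentrates around 1/4; systematic runs of equal signs push it lower, which is what the K-depth tests detect.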
  • Self-regularized sparsity

    Prof. RNDr. Ivan Mizera, CSc. (KPMS MFF UK)

    Abstract: Is what we do - be it called (mathematical) statistics, or (statistical) machine learning, or some other name(s) - to be considered science, or rather technology? From the outset of the "combination of observations" we can record both: as usual, it starts as technology, then transforms into science trying to elucidate the underlying principles; then a sudden and shocking collapse of the seemingly established course returns the focus to technological attitudes as a pragmatic way out of the controversies; but then the resurrection of the scientific approach brings new technology. All this will be illustrated on the particular history of mixture models within the context of empirical Bayes methodology - a discipline that could again have been considered well settled, if not for the newest breakthroughs connecting it to the contemporary quest for sparse solutions. From science to technology, from technology to science - and eventually what?
  • Sampling in a matrix

    Prof. Louis-Paul Rivest (Université Laval, Québec, Canada)

    Abstract: This presentation is concerned with populations that can be seen as a matrix with N rows and M columns. For instance, in a survey of tourists that visit a region, the rows of the matrix are N sites that can be visited by tourists while the columns are M days in the tourist season. The goal is to select a sample that fulfills several constraints. The number of sites visited on a given day is often fixed; it depends on the manpower allocated to the survey. On the other hand, the number of visits to a site is set in advance and depends on the site’s importance. The sample can be seen as a matrix Z with N rows and M columns, with a 1 in cell (i,j) if site i is sampled on day j and 0 otherwise. Because of the constraints, the row and column totals of a sample matrix Z are fixed. The sample is selected according to a uniform distribution on all the matrices Z fulfilling these constraints. Several algorithms to select such a sample are presented. At each sampled point, the survey variable y is observed. The goal is to estimate the mean of y over the MN entries of the population matrix. This is done using the Horvitz-Thompson estimator. An unbiased estimator of its variance is proposed. Multilevel generalizations of the proposed sampling plan are also discussed; they are useful to address additional constraints, besides fixed row and column totals, that often need to be considered when planning such a survey.

    Rivest, Louis-Paul. "Statistical methods for sampling cross-classified populations under constraints." Survey Methodology, vol. 49, no. 2 (2023): to appear in the December issue.
    Rivest, Louis-Paul. "Limiting properties of an equiprobable sampling scheme for 0–1 matrices." Statistics & Probability Letters 172 (2021): 109047.
    Rivest, Louis-Paul, and Sergio Ewane Ebouele. "Sampling a two dimensional matrix." Computational Statistics & Data Analysis 149 (2020): 106971.
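    For tiny populations, the constrained design and the Horvitz-Thompson estimator from the abstract can be illustrated by brute force (a sketch under the stated uniform design, not one of the efficient algorithms from the papers above; the toy data are invented):

```python
from itertools import product
import random

def matrices_with_margins(row_totals, col_totals):
    """All 0-1 matrices Z with the given row and column totals
    (brute-force enumeration; feasible only for tiny N, M)."""
    N, M = len(row_totals), len(col_totals)
    out = []
    for bits in product((0, 1), repeat=N * M):
        Z = [list(bits[i * M:(i + 1) * M]) for i in range(N)]
        if ([sum(row) for row in Z] == list(row_totals) and
                [sum(col) for col in zip(*Z)] == list(col_totals)):
            out.append(Z)
    return out

def ht_mean(Z, y, candidates):
    """Horvitz-Thompson estimate of the mean of y over all N*M cells.
    Under the uniform design the inclusion probability of cell (i, j) is
    exactly the fraction of admissible matrices with a 1 in that cell."""
    N, M = len(Z), len(Z[0])
    total = 0.0
    for i in range(N):
        for j in range(M):
            if Z[i][j]:
                pi = sum(C[i][j] for C in candidates) / len(candidates)
                total += y[i][j] / pi
    return total / (N * M)

# Two sites, two days, one visit per site and per day: the admissible
# samples are the two permutation matrices, each cell has pi_ij = 1/2.
cands = matrices_with_margins([1, 1], [1, 1])
Z = random.Random(0).choice(cands)     # uniform draw from the design
y = [[4.0, 8.0], [6.0, 2.0]]
```

    Averaging `ht_mean` over both admissible samples returns the true population mean exactly, which is the unbiasedness of the Horvitz-Thompson estimator in miniature.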
  • A Measure-Transportation-Based GOF Test for Directional Data

    Prof. Marc Hallin (Université libre de Bruxelles)

    Abstract: Based on recent measure transportation results, we propose new concepts of distribution and quantile functions on the hypersphere. The empirical versions of our distribution functions enjoy the expected Glivenko-Cantelli property of traditional distribution functions and yield fully distribution-free concepts of ranks and signs. They also yield "universally consistent" GOF tests, which were not previously available due to the absence of a sound concept of a distribution function on the hypersphere, and which significantly outperform existing procedures. Based on joint work with Thomas Verdebout and Hang Liu.
  • Estimators based on ranks of residuals from the viewpoint of optimization

    Prof. RNDr. Ing. Michal Černý, Ph.D. (VŠE Praha)

    Abstract: In linear regression there exist various robust estimators of the regression parameters utilizing information on the ordering of residuals. Such estimators are defined as minimizers of special loss functions depending on the residual ranks. An important example is Jaeckel's dispersion in R-regression. The task of optimization is to design efficient methods for the minimization of such loss functions. The functions form quite a general class, encompassing, for example, trimmed least squares or minimization of the median absolute residual. The talk focuses on the geometry of such functions, distinguishing the convex and nonconvex cases, and in particular on the drawbacks of the currently known and used algorithms for them (such as iteratively reweighted least squares). We explain why even the convex case is a real algorithmic challenge. We present some new developments in the area, including the main open problem: for some of the loss functions, which can be written as linear programming problems with a factorial number of constraints, efficient interior point methods are still lacking.
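    To make Jaeckel's dispersion concrete, here is a minimal sketch with the common Wilcoxon scores; the toy data and the grid search are invented for illustration and are not the methods discussed in the talk:

```python
import math

def jaeckel_dispersion(beta, x, y):
    """Jaeckel's rank dispersion D(beta) = sum_i a(R_i) * e_i for the
    no-intercept model y = beta * x + error, with Wilcoxon scores
    a(r) = sqrt(12) * (r / (n + 1) - 1/2)."""
    e = [yi - beta * xi for xi, yi in zip(x, y)]
    n = len(e)
    # rank[i] is the rank of residual e[i] among all residuals
    rank = {i: r for r, i in enumerate(sorted(range(n), key=e.__getitem__), 1)}
    return sum(math.sqrt(12) * (rank[i] / (n + 1) - 0.5) * e[i] for i in range(n))

# D is convex and piecewise linear in beta, so even a crude grid search
# recovers the R-estimator of the slope for this toy data (true slope ~ 2):
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
beta_hat = min((b / 100 for b in range(100, 301)),
               key=lambda b: jaeckel_dispersion(b, x, y))
```

    In practice one minimizes D with dedicated algorithms rather than a grid; the talk's point is precisely that such minimizations become a genuine algorithmic challenge at scale.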
  • Rough Path Theory

    RNDr. Petr Čoupek, Ph.D. (KPMS MFF UK)

    Abstract: It is not hard to imagine that introducing some sort of stochastic noise term into differential equations can be used to account for non-systematic error or uncertainty in the described physical model.
    Typically, however, stochastic processes produce paths that are (Hölder) continuous yet nowhere differentiable - a prototypical example is the Wiener process. As such, these paths are too “rough” to be used as integrators, and one has to carefully construct a notion of an integral. Subsequently, the (stochastic) differential equations can be treated and solved rigorously. These integrals are often constructed as limits of Riemann-type sums, but the convergence is taken in L2, that is, it takes into account all the possible paths of the process (this is the case, for example, for both the Itô and Stratonovich integrals) and, as a consequence, one cannot hope to solve the equations pathwise, i.e. obtain a path that satisfies the differential equation taking the path of the noise as input.
    In a certain sense, rough path theory does precisely that. It breaks the procedure into two steps - a probabilistic step, in which one constructs the integral, and an analytical step, which takes both the path and its (iterated) integrals as input and solves the now deterministic equation for that input.
    In the talk, I hope to pave (through examples, analogies, and broad concepts) a smooth path to the theory of rough paths.
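    The L2-versus-pathwise distinction can already be felt in the discrete identities behind the Ito and Stratonovich integrals of W against itself. The following sketch (illustrative only, with a simulated random-walk path) checks the exact summation-by-parts identities that hold along every single path:

```python
import random

rng = random.Random(42)
n = 1000
dW = [rng.gauss(0.0, (1.0 / n) ** 0.5) for _ in range(n)]  # increments on [0, 1]
W = [0.0]
for d in dW:
    W.append(W[-1] + d)

# Left-point (Ito-style) and midpoint (Stratonovich-style) Riemann sums
ito = sum(W[i] * dW[i] for i in range(n))
strat = sum(0.5 * (W[i] + W[i + 1]) * dW[i] for i in range(n))

# Exact pathwise identities (pure algebra, no probability needed):
#   ito   = (W_1^2 - sum dW_i^2) / 2
#   strat = W_1^2 / 2              (the midpoint sum telescopes)
qv = sum(d * d for d in dW)
assert abs(ito - (W[-1] ** 2 - qv) / 2) < 1e-9
assert abs(strat - W[-1] ** 2 / 2) < 1e-9
```

    The two sums differ by qv/2, the discrete quadratic variation; as n grows, qv converges to 1 in L2 but is not controlled path by path, which is exactly the gap that carrying the iterated integral along with the path is meant to close.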
  • Beseda KPMS - Seminar (LS 2022/2023)

    The program will be specified during the semester.

  • Portmanteau tests for VARMA models: Chitturi, Hosking, and Li-McLeod revisited

    Prof. Marc Hallin (Université libre de Bruxelles)
    Abstract: The pseudo-Gaussian portmanteau tests of Chitturi, Hosking, and Li and McLeod for VARMA models are revisited from a Le Cam perspective, providing a precise and more rigorous description of the asymptotic behavior of the multivariate portmanteau test statistic, which depends on the dimension d of the observations, the number m of lags involved, and the length n of the observation period. Then, based on the concepts of center-outward ranks and signs recently developed in Hallin, del Barrio, Cuesta-Albertos, and Matrán (2021), Annals of Statistics 49, 1139–1165, a class of multivariate rank- and sign-based portmanteau test statistics is proposed which, under the null hypothesis and under a broad family of innovation densities, can be approximated by an asymptotically chi-square variable. The asymptotic properties of these tests are derived; simulations demonstrate their advantages over their classical pseudo-Gaussian counterparts.
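    As a point of reference, the univariate (d = 1) ancestor of these portmanteau statistics is easy to write down; here is a pure-Python sketch of the Ljung-Box statistic (illustrative only, not the multivariate rank-based version of the talk):

```python
def ljung_box(x, m):
    """Ljung-Box portmanteau statistic Q = n(n+2) * sum_{k=1}^m r_k^2/(n-k),
    where r_k are the sample autocorrelations of x; approximately
    chi-square with m degrees of freedom under white noise."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    q = 0.0
    for k in range(1, m + 1):
        ck = sum((x[i] - mean) * (x[i - k] - mean) for i in range(k, n)) / n
        q += (ck / c0) ** 2 / (n - k)
    return n * (n + 2) * q

# A strongly alternating series has large serial correlations at every lag,
# so Q lands far beyond the chi-square(3) 95% cutoff of about 7.81:
print(ljung_box([1, 2] * 10, m=3))  # -> approx 59.4
```

    The multivariate statistics of Chitturi, Hosking, and Li-McLeod replace the squared autocorrelations by quadratic forms in residual autocovariance matrices, which is where the dimension d enters the asymptotics discussed above.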
  • Stabilization of Dynamical Systems by Noise with Jumps

    Mgr. Ondřej Týbl (KPMS MFF UK)
    Abstract: Recent work on stabilisation and long-time behaviour of solutions of stochastic (partial) differential equations with a jump part will be presented. The stability is understood in a broad sense, including the existence of a stationary solution and convergence of moments. Long-time properties of the solutions are investigated with emphasis on convergence to a single point - a typical task when considering the so-called Robbins-Monro stochastic approximation procedure for approximating a root of a given function. All of these results can be obtained by means of so-called Lyapunov functions and lead to concrete conditions on the coefficients of the equation which can be interpreted geometrically. This is joint work with Professors B. Maslowski and M. Riedle and Doctor J. Seidler.
  • Optimization problems: cases from the Czech Republic and Norway

    Ing. Vít Procházka, Ph.D. (KPMS MFF UK)
    Abstract: Vít is the newest member of KPMS, working as an Assistant Professor within the optimization group. In this talk, he will introduce himself and his background as well as some scientific projects he is or was involved in: from the waste management modeling of his bachelor's and master's degrees at the Brno University of Technology to his work during his PhD studies in Bergen, Norway. In the far North, he switched to modeling the trading patterns of the largest bulk carriers (vessels capable of carrying up to 500,000 tons of iron ore or coal). He also worked on more general problems regarding scenario generation for stochastic optimization problems, to which he will dedicate the most attention (in his talk as well as in his future work at Charles University).
  • RRT for Estimating the Population Total of a Quantitative Variable

    Prof. Jaromír Antoch (KPMS MFF UK)
    Abstract: TBD
  • Quickest detection problem and monitoring of actuarial assumptions: from theory to practice

    Prof. Stephane Loisel (ISFA, Lyon University)
    Abstract: After briefly recalling key features of longevity risk, we explain how to detect as quickly as possible the date at which the actuarial assumptions related to longevity risk are no longer valid. The problem is posed as a quickest detection problem. We introduce the so-called CUSUM process and show its optimality for a generalized Lorden criterion. We then explain how to design Key Risk Indicators based on the CUSUM process. We analyze its advantages and drawbacks for longevity risk monitoring, as well as for some other insurance risks. The method is illustrated on simulated and real-world case studies. This is based on joint work with Nicole El Karoui (Paris) and Yahia Salhi (Lyon), as well as other works in progress.
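    The CUSUM recursion itself is a one-liner; the following generic sketch uses an assumed Gaussian mean-shift example (not the longevity model of the talk) to show the detection mechanism:

```python
def cusum_alarm(observations, llr, threshold):
    """One-sided CUSUM recursion S_t = max(0, S_{t-1} + llr(x_t));
    returns the first index t where S_t reaches the threshold, or None.
    llr is the log-likelihood ratio of the post- vs pre-change densities,
    supplied by the caller."""
    s = 0.0
    for t, x in enumerate(observations):
        s = max(0.0, s + llr(x))
        if s >= threshold:
            return t  # first alarm time
    return None

# Example: upward mean shift in Gaussian data (mu0 = 0 -> mu1 = 1,
# sigma = 1), for which the log-likelihood ratio is llr(x) = x - 0.5.
data = [0.1, -0.2, 0.0, 1.2, 0.9, 1.1, 1.3, 0.8]
print(cusum_alarm(data, lambda x: x - 0.5, threshold=2.0))  # -> 6
```

    The max-with-zero step is what makes the statistic "forget" pre-change noise while accumulating evidence after the change, which is the source of the Lorden-type optimality mentioned in the abstract.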
  • Seminar cancelled

    Abstract: N/A
  • Random tessellation modelling with applications in materials research

    Prof. Viktor Beneš (KPMS MFF UK)
    Abstract: TBD
  • Beseda KPMS - Seminar (ZS 2022/2023)

    The program will be specified during the semester.

  • Parameter estimation in an SPDE model for cell repolarisation

    Dr. Josef Janák (Karlsruhe Institute of Technology, Germany)
    Abstract: As a concrete setting where stochastic partial differential equations (SPDEs) are able to model real phenomena, we propose a stochastic Meinhardt model for cell repolarisation and study how parameter estimation techniques developed for simple linear SPDE models apply in this situation. We pursue estimation of the diffusion term based on continuous-time observations which are localised in space. We show asymptotic normality for our estimator as the space resolution becomes finer. We demonstrate the performance of the model and the estimator in numerical and real-data experiments.

    This is joint work with Professors M. Reiß, T. Bretschneider and Doctor R. Altmeyer.

  • On functional time series with long memory, with applications to mechanically ventilated breathing activity

    Prof. Jan Beran (University of Konstanz)
    Abstract: In cases of severe respiratory failure, mechanical ventilation is an essential life-saving measure. Effective mechanical ventilation depends on suitable estimates and predictions of the breathing effort. Advanced techniques include transdiaphragmatic pressure measurement (Pdi) or electromyography (EMG) of the respiratory muscles. However, these methods are invasive and therefore associated with potentially serious side effects. In particular, prolonged weaning may be difficult. In this talk a statistical model is discussed that was developed with the aim of assessing whether noninvasive surface EMG (sEMG) can replace invasive methods. The model can be understood as a non-Markovian extension of state space models or of functional time series models. Statistical inference is based on a functional limit theorem for discretized time processes. Applications include nonlinear prediction intervals for Pdi curves based on sEMG, detection of changes in breathing patterns, and monitoring.

    This is joint work with Jeremy Naescher, Franziska Farquharson, Max Kustermann, Hans-Joachim Kabitz, Stephan Walterspacher, Doreen Werner and Thomas Handzsuj.
  • Semiparametric transformation models: mean and boundary regression

    Prof. Natalie Neumeyer (Universität Hamburg)

    On TUESDAY 8.11.2022, 9:00, KPMS meeting room (1st floor).

  • Minimal correction of infeasible systems

    Prof. Milan Hladík (KAM, MFF UK)
    Abstract: We are given an infeasible system, e.g. a linear system of equations or inequalities, and the aim is to change possibly all of the input data in a minimal way such that the system becomes feasible. This minimality can be measured in various norms; we consider the Chebyshev norm and the spectral and Frobenius norms. For example, the Frobenius norm applied to a linear system of equations leads to the so-called total least squares. In our talk, we give an overview of the topic - we present the minimum perturbation formulae and discuss complexity issues. We will not dive much into the algorithms.
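    The Frobenius-norm case can be made concrete: by the Eckart-Young theorem, the minimal Frobenius correction of the augmented matrix [A | b] that makes A x = b solvable removes the smallest singular direction of [A | b], which is total least squares. A hand-rolled sketch for a 2x2 augmented matrix (illustration only; real solvers use a proper SVD):

```python
import math

def tls_correct(M):
    """Minimal Frobenius-norm correction of a 2x2 matrix M = [A | b] to a
    rank-one (i.e. consistent) system: subtract the smallest singular
    component, M <- M - (M v) v^T, with v the right singular vector of
    the smallest singular value (Eckart-Young / total least squares)."""
    # Smallest eigenpair of the symmetric 2x2 matrix M^T M.
    a = M[0][0] ** 2 + M[1][0] ** 2
    b = M[0][0] * M[0][1] + M[1][0] * M[1][1]
    c = M[0][1] ** 2 + M[1][1] ** 2
    lam = (a + c - math.sqrt((a + c) ** 2 - 4 * (a * c - b * b))) / 2
    v = (b, lam - a) if abs(b) > 1e-12 else ((1.0, 0.0) if a <= c else (0.0, 1.0))
    nv = math.hypot(*v)
    v = (v[0] / nv, v[1] / nv)
    Mv = (M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1])
    return [[M[i][j] - Mv[i] * v[j] for j in range(2)] for i in range(2)]

# The system x = 1, x = 2 (A = (1, 1)^T, b = (1, 2)^T) is infeasible;
# after the minimal correction the two corrected equations agree.
C = tls_correct([[1.0, 1.0], [1.0, 2.0]])
```

    Here C has rank one (zero determinant), so the corrected system is solved exactly by x = C[0][1] / C[0][0].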
  • Some notes about asymptotics

    Prof. Marie Hušková (KPMS, MFF UK)

  • Statistical foundations of topological data analysis

    Prof. Christian Hirsch (Aarhus University)
    Abstract: Topological data analysis (TDA) is an emerging field at the interface between algebraic topology and data science. The philosophy behind TDA is to leverage invariants from algebraic topology to gain insights into data sets. While TDA was initially developed and promoted by mathematicians, it is now applied in a variety of disciplines such as biology, chemistry, and materials science. The key tool in TDA is the persistence diagram, which captures the appearance and disappearance of topological features at multiple scales. In this talk, I will review recent advances in the statistical foundations of dealing with persistence diagrams of random input. I will also elaborate on how persistence diagrams can be used to design rigorous statistical hypothesis tests for 3D microstructure models based on data from 2D slices.
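    A minimal example of what a persistence diagram records: 0-dimensional persistence of the sublevel-set filtration of a function on a path graph, computed with a union-find structure and the elder rule (values assumed distinct; higher-dimensional features and the statistical tests of the talk need considerably more machinery):

```python
import math

def persistence0(f):
    """0-dimensional persistence pairs (birth, death) of the sublevel-set
    filtration of f on the path graph 0-1-...-(n-1), via union-find and
    the elder rule.  Assumes distinct values; a toy illustration only."""
    n = len(f)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    birth, pairs = {}, []
    for i in sorted(range(n), key=f.__getitem__):   # add vertices by value
        birth[i] = f[i]
        for j in (i - 1, i + 1):
            if 0 <= j < n and f[j] < f[i]:          # neighbor already present
                ri, rj = find(i), find(j)
                if ri != rj:
                    # elder rule: the component born later dies now
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    pairs.append((birth[young], f[i]))
                    parent[young] = old
    pairs.append((min(f), math.inf))                # essential class never dies
    return sorted(p for p in pairs if p[0] < p[1])

# Local minima 0.0, 1.0 and 0.5 are births; the components born at 1.0 and
# 0.5 die at the local maxima 2.0 and 3.0 where they merge into older ones:
print(persistence0([0.0, 2.0, 1.0, 3.0, 0.5]))
# -> [(0.0, inf), (0.5, 3.0), (1.0, 2.0)]
```

    Each pair far from the diagonal is a feature persisting over a wide range of scales; points near the diagonal are noise, and it is this separation that the statistical theory in the talk makes rigorous.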

    This talk is based on joint work with Johannes Krebs and Claudia Redenbach.


Copyright © JČ, MB 2011