Comptes Rendus
Statistics/Probability Theory
Dual representation of φ-divergences and applications
[Représentation duale des φ-divergences et applications]
Comptes Rendus. Mathématique, Volume 336 (2003) no. 10, pp. 857-862.

In this Note, we give a “dual” representation of φ-divergences. We use this representation to define and study new estimators of the distribution and of the divergences for discrete and continuous parametric models.
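
For context, the duality referred to here is the classical Fenchel–Legendre (variational) form of a φ-divergence. The sketch below states it in standard notation, assuming φ convex, Q absolutely continuous with respect to P, and enough regularity for the duality to hold with equality; it paraphrases the well-known result rather than the Note's exact statement:

\[
D_\varphi(Q,P) \;=\; \int \varphi\!\left(\frac{\mathrm{d}Q}{\mathrm{d}P}\right)\mathrm{d}P
\;=\; \sup_{f\in\mathcal{F}}\left\{ \int f\,\mathrm{d}Q \;-\; \int \varphi^{*}(f)\,\mathrm{d}P \right\},
\qquad
\varphi^{*}(t) := \sup_{x}\{\,tx - \varphi(x)\,\},
\]

where φ* is the convex conjugate of φ and the supremum runs over a suitable class of measurable functions (under appropriate conditions it is formally attained at f = φ'(dQ/dP)). Replacing the integral with respect to Q by an empirical mean over a sample turns the right-hand side into a plug-in estimator of the divergence, which is the type of estimator studied in the Note for parametric models.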

DOI : 10.1016/S1631-073X(03)00215-2

Amor Keziou 1

1 LSTA, boîte courrier 158, 8A, Université Paris-6, 175, rue du Chevaleret, 75013 Paris, France
@article{CRMATH_2003__336_10_857_0,
     author = {Amor Keziou},
     title = {Dual representation of \protect\emph{\ensuremath{\varphi}}-divergences and applications},
     journal = {Comptes Rendus. Math\'ematique},
     pages = {857--862},
     publisher = {Elsevier},
     volume = {336},
     number = {10},
     year = {2003},
     doi = {10.1016/S1631-073X(03)00215-2},
     language = {en},
}
Amor Keziou. Dual representation of φ-divergences and applications. Comptes Rendus. Mathématique, Volume 336 (2003) no. 10, pp. 857-862. doi : 10.1016/S1631-073X(03)00215-2. https://comptes-rendus.academie-sciences.fr/mathematique/articles/10.1016/S1631-073X(03)00215-2/

[1] M. Broniatowski, Estimation through Kullback–Leibler divergence, Math. Methods Statist. (2003) (submitted)

[2] N. Cressie; T. Read, Multinomial goodness-of-fit tests, J. Roy. Statist. Soc. Ser. B, Volume 46 (1984) no. 3, pp. 440-464

[3] I. Csiszár, On topological properties of f-divergences, Studia Sci. Math. Hungar., Volume 2 (1967), pp. 329-339

[4] A. Dembo; O. Zeitouni, Large Deviations Techniques and Applications, Jones & Bartlett, 1998

[5] F. Liese; I. Vajda, Convex Statistical Distances, Teubner, Leipzig, 1987

[6] B.G. Lindsay, Efficiency versus robustness: the case for minimum Hellinger distance and related methods, Ann. Statist., Volume 22 (1994), pp. 1081-1114

[7] D. Morales; L. Pardo; I. Vajda, Asymptotic divergence of estimates of discrete distributions, J. Statist. Plann. Inference, Volume 48 (1995) no. 3, pp. 347-369

[8] L. Rüschendorf, On the minimum discrimination information theorem, Statist. Decisions, Suppl., Volume 1 (1984), pp. 263-283

[9] A. van der Vaart, Asymptotic Statistics, Cambridge Series in Statistical and Probabilistic Mathematics, 1998

  • Jian Huang; Yuling Jiao; Xu Liao; Jin Liu; Zhou Yu Deep Dimension Reduction for Supervised Representation Learning, IEEE Transactions on Information Theory, Volume 70 (2024) no. 5, p. 3583 | DOI:10.1109/tit.2023.3340658
  • Mohamed Boukeloua; Amor Keziou Empirical Likelihood with Censored Data, Geometric Science of Information, Volume 14071 (2023), p. 125 | DOI:10.1007/978-3-031-38271-0_13
  • Kun Li; Shengwei Tian; Long Yu; Tiejun Zhou; Bo Wang; Fun Wang Mutual information maximization and feature space separation and bi-bimodal modality fusion for multimodal sentiment analysis, Journal of Intelligent & Fuzzy Systems, Volume 45 (2023) no. 4, p. 5783 | DOI:10.3233/jifs-222189
  • Xingyu Zhou; Yuling Jiao; Jin Liu; Jian Huang A Deep Generative Approach to Conditional Sampling, Journal of the American Statistical Association, Volume 118 (2023) no. 543, p. 1837 | DOI:10.1080/01621459.2021.2016424
  • Shun Otsubo; Sreekanth K. Manikandan; Takahiro Sagawa; Supriya Krishnamurthy Estimating time-dependent entropy production from non-equilibrium trajectories, Communications Physics, Volume 5 (2022) no. 1 | DOI:10.1038/s42005-021-00787-x
  • Tomoya Sakai; Gang Niu; Masashi Sugiyama Information-Theoretic Representation Learning for Positive-Unlabeled Classification, Neural Computation, Volume 33 (2021) no. 1, p. 244 | DOI:10.1162/neco_a_01337
  • Wei Ni; Zhong-Ping Jiang, 2019 Chinese Control And Decision Conference (CCDC) (2019), p. 3016 | DOI:10.1109/ccdc.2019.8832868
  • Nan Xi, 2019 International Conference on Image and Vision Computing New Zealand (IVCNZ) (2019), p. 1 | DOI:10.1109/ivcnz48456.2019.8961008
  • Igal Sason On Data-Processing and Majorization Inequalities for f-Divergences with Applications, Entropy, Volume 21 (2019) no. 10, p. 1022 | DOI:10.3390/e21101022
  • Sergio Verdú Empirical Estimation of Information Measures: A Literature Guide, Entropy, Volume 21 (2019) no. 8, p. 720 | DOI:10.3390/e21080720
  • Chen Gong; Tongliang Liu; Jian Yang; Dacheng Tao Large-Margin Label-Calibrated Support Vector Machines for Positive and Unlabeled Learning, IEEE Transactions on Neural Networks and Learning Systems, Volume 30 (2019) no. 11, p. 3471 | DOI:10.1109/tnnls.2019.2892403
  • Michel Broniatowski A Weighted Bootstrap Procedure for Divergence Minimization Problems, Analytical Methods in Statistics, Volume 193 (2017), p. 1 | DOI:10.1007/978-3-319-51313-3_1
  • Amor Keziou; Philippe Regnault Semiparametric Estimation of Mutual Information and Related Criteria: Optimal Test of Independence, IEEE Transactions on Information Theory, Volume 63 (2017) no. 1, p. 57 | DOI:10.1109/tit.2016.2620163
  • Marthinus C. du Plessis; Gang Niu; Masashi Sugiyama Class-prior estimation for learning from positive and unlabeled data, Machine Learning, Volume 106 (2017) no. 4, p. 463 | DOI:10.1007/s10994-016-5604-6
  • Amor Keziou Multivariate Divergences with Application in Multisample Density Ratio Models, Geometric Science of Information, Volume 9389 (2015), p. 444 | DOI:10.1007/978-3-319-25040-3_48
  • Amor Keziou; Philippe Regnault Generalized Mutual-Information Based Independence Tests, Geometric Science of Information, Volume 9389 (2015), p. 454 | DOI:10.1007/978-3-319-25040-3_49
  • Voot Tangkaratt; Ning Xie; Masashi Sugiyama Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization, Neural Computation, Volume 27 (2015) no. 1, p. 228 | DOI:10.1162/neco_a_00683
  • Marthinus Christoffel du Plessis; Masashi Sugiyama Class Prior Estimation from Positive and Unlabeled Data, IEICE Transactions on Information and Systems, Volume E97.D (2014) no. 5, p. 1358 | DOI:10.1587/transinf.e97.d.1358
  • Marthinus Christoffel du Plessis; Masashi Sugiyama Semi-supervised learning of class balance under class-prior change by distribution matching, Neural Networks, Volume 50 (2014), p. 110 | DOI:10.1016/j.neunet.2013.11.010
  • Michel Broniatowski Minimum divergence estimators, maximum likelihood and exponential families, Statistics & Probability Letters, Volume 93 (2014), p. 27 | DOI:10.1016/j.spl.2014.06.014
  • Masashi Sugiyama, 2013 International Winter Workshop on Brain-Computer Interface (BCI) (2013), p. 12 | DOI:10.1109/iww-bci.2013.6506611
  • Masashi Sugiyama Direct Approximation of Divergences Between Probability Distributions, Empirical Inference (2013), p. 273 | DOI:10.1007/978-3-642-41136-6_23
  • Michel Broniatowski Weighted Sampling, Maximum Likelihood and Minimum Divergence Estimators, Geometric Science of Information, Volume 8085 (2013), p. 467 | DOI:10.1007/978-3-642-40020-9_51
  • Masashi Sugiyama; Song Liu; Marthinus Christoffel du Plessis; Masao Yamanaka; Makoto Yamada; Taiji Suzuki; Takafumi Kanamori Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning, Journal of Computing Science and Engineering, Volume 7 (2013) no. 2, p. 99 | DOI:10.5626/jcse.2013.7.2.99
  • Taiji Suzuki; Masashi Sugiyama Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation, Neural Computation, Volume 25 (2013) no. 3, p. 725 | DOI:10.1162/neco_a_00407
  • Song Liu; Makoto Yamada; Nigel Collier; Masashi Sugiyama Change-point detection in time-series data by relative density-ratio estimation, Neural Networks, Volume 43 (2013), p. 72 | DOI:10.1016/j.neunet.2013.01.012
  • M. El Rhabi; H. Fenniri; A. Keziou; E. Moreau A robust algorithm for convolutive blind source separation in presence of noise, Signal Processing, Volume 93 (2013) no. 4, p. 818 | DOI:10.1016/j.sigpro.2012.09.026
  • Masashi Sugiyama; Taiji Suzuki; Takafumi Kanamori Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation, Annals of the Institute of Statistical Mathematics, Volume 64 (2012) no. 5, p. 1009 | DOI:10.1007/s10463-011-0343-8
  • Takafumi Kanamori; Taiji Suzuki; Masashi Sugiyama f-Divergence Estimation and Two-Sample Homogeneity Test Under Semiparametric Density-Ratio Models, IEEE Transactions on Information Theory, Volume 58 (2012) no. 2, p. 708 | DOI:10.1109/tit.2011.2163380
  • Salim Bouzebda; Mohamed Cherfi Dual Divergence Estimators of the Tail Index, ISRN Probability and Statistics, Volume 2012 (2012), p. 1 | DOI:10.5402/2012/746203
  • Salim Bouzebda; Mohamed Cherfi General Bootstrap for Dual ϕ-Divergence Estimates, Journal of Probability and Statistics, Volume 2012 (2012), p. 1 | DOI:10.1155/2012/834107
  • Michel Broniatowski; Amor Keziou Divergences and duality for estimation and test under moment condition models, Journal of Statistical Planning and Inference, Volume 142 (2012) no. 9, p. 2554 | DOI:10.1016/j.jspi.2012.03.013
  • Bruno Pelletier Inference in ϕ-families of distributions, Statistics, Volume 45 (2011) no. 3, p. 223 | DOI:10.1080/02331880903546324
  • XuanLong Nguyen; Martin J. Wainwright; Michael I. Jordan Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization, IEEE Transactions on Information Theory, Volume 56 (2010) no. 11, p. 5847 | DOI:10.1109/tit.2010.2068870
  • Michel Broniatowski; Amor Keziou Parametric estimation and tests through divergences and the duality technique, Journal of Multivariate Analysis, Volume 100 (2009) no. 1, p. 16 | DOI:10.1016/j.jmva.2008.03.011
  • Amor Keziou; Samuela Leoni-Aubin On empirical likelihood for semiparametric two-sample density ratio models, Journal of Statistical Planning and Inference, Volume 138 (2008) no. 4, p. 915 | DOI:10.1016/j.jspi.2007.02.009

Cited by 36 documents. Source: Crossref
