Showing 1 - 1 of 1 result for the search '"keyword:functional convexity"', query time: 0.27s
1. Academic Journal

Authors: Rastegin, Alexey

File Description: application/pdf

Relation: MR 2954323

References:
[1] L. Baladová: Minimum of average conditional entropy for given minimum probability of error. Kybernetika 2 (1966), 416-422. Zbl 0199.21502, MR 0215641
[2] T. Cover, J. Thomas: Elements of Information Theory. John Wiley & Sons, New York 1991. Zbl 1140.94001, MR 1122806
[3] I. Csiszár: Axiomatic characterizations of information measures. Entropy 10 (2008), 261-273. Zbl 1179.94043, 10.3390/e10030261
[4] Z. Daróczy: Generalized information functions. Inform. and Control 16 (1970), 36-51. Zbl 0205.46901, MR 0272528, 10.1016/S0019-9958(70)80040-7
[5] M. H. DeGroot: Optimal Statistical Decisions. McGraw-Hill, New York 1970. Zbl 1136.62011, MR 0356303
[6] D. Erdogmus, J. C. Principe: Lower and upper bounds for misclassification probability based on Rényi's information. J. VLSI Signal Process. 37 (2004), 305-317. Zbl 1073.94507, 10.1023/B:VLSI.0000027493.48841.39
[7] R. M. Fano: Transmission of Information: A Statistical Theory of Communications. MIT Press and John Wiley & Sons, New York 1961. Zbl 0151.24402, MR 0134389
[8] M. Feder, N. Merhav: Relations between entropy and error probability. IEEE Trans. Inform. Theory 40 (1994), 259-266. Zbl 0802.94004, 10.1109/18.272494
[9] S. Furuichi: Information theoretical properties of Tsallis entropies. J. Math. Phys. 47 (2006), 023302. Zbl 1111.94008, MR 2208160, 10.1063/1.2165744
[10] M. Gell-Mann, C. Tsallis, eds.: Nonextensive Entropy - Interdisciplinary Applications. Oxford University Press, Oxford 2004. Zbl 1127.82004, MR 2073730
[11] G. H. Hardy, J. E. Littlewood, G. Polya: Inequalities. Cambridge University Press, London 1934. Zbl 0634.26008
[12] J. Havrda, F. Charvát: Quantification methods of classification processes: concept of structural $\alpha$-entropy. Kybernetika 3 (1967), 30-35. MR 0209067
[13] P. Jizba, T. Arimitsu: The world according to Rényi: thermodynamics of multifractal systems. Ann. Phys. 312 (2004), 17-59. Zbl 1044.82001, MR 2067083, 10.1016/j.aop.2004.01.002
[14] R. Kamimura: Minimizing $\alpha$-information for generalization and interpretation. Algorithmica 22 (1998), 173-197. Zbl 0910.68173, MR 1637503, 10.1007/PL00013828
[15] A. Novikov: Optimal sequential procedures with Bayes decision rules. Kybernetika 46 (2010), 754-770. Zbl 1201.62095, MR 2722099
[16] A. Perez: Information-theoretic risk estimates in statistical decision. Kybernetika 3 (1967), 1-21. Zbl 0153.48403, MR 0208775
[17] A. E. Rastegin: Rényi formulation of the entropic uncertainty principle for POVMs. J. Phys. A: Math. Theor. 43 (2010), 155302. Zbl 1189.81012, MR 2608279, 10.1088/1751-8113/43/15/155302
[18] A. E. Rastegin: Entropic uncertainty relations for extremal unravelings of super-operators. J. Phys. A: Math. Theor. 44 (2011), 095303. Zbl 1211.81021, MR 2771869, 10.1088/1751-8113/44/9/095303
[19] A. E. Rastegin: Continuity estimates on the Tsallis relative entropy. E-print arXiv:1102.5154v2 [math-ph] (2011). MR 2841748
[20] A. E. Rastegin: Fano type quantum inequalities in terms of $q$-entropies. Quantum Information Processing (2011), doi 10.1007/s11128-011-0347-6
[21] A. Rényi: On measures of entropy and information. In: Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1961, pp. 547-561. Zbl 0106.33001, MR 0132570
[22] A. Rényi: On the amount of missing information in a random variable concerning an event. J. Math. Sci. 1 (1966), 30-33. MR 0210263
[23] A. Rényi: Statistics and information theory. Stud. Sci. Math. Hung. 2 (1967), 249-256. Zbl 0155.27602, MR 0212964
[24] A. Rényi: On some basic problems of statistics from the point of view of information theory. In: Proc. 5th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1967, pp. 531-543. Zbl 0201.51905, MR 0212963
[25] B. Schumacher: Sending entanglement through noisy quantum channels. Phys. Rev. A 54 (1996), 2614-2628. 10.1103/PhysRevA.54.2614
[26] C. Tsallis: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52 (1988), 479-487. Zbl 1082.82501, MR 0968597, 10.1007/BF01016429
[27] I. Vajda: On the statistical decision problem with discrete parameter space. Kybernetika 3 (1967), 110-126. MR 0215428
[28] I. Vajda: Bounds of the minimal error probability on checking a finite or countable number of hypotheses. Problemy Peredachii Informacii 4 (1968), 9-19 (in Russian); translated as Problems of Information Transmission 4 (1968), 6-14. MR 0267685
[29] K. Życzkowski: Rényi extrapolation of Shannon entropy. Open Sys. Inform. Dyn. 10 (2003), 297-310; corrigendum in the e-print version arXiv:quant-ph/0305062v2. Zbl 1030.94022, MR 1998623