
Opticheskii Zhurnal
A scientific and technical journal
ISSN: 1023-5086

A full-text English translation of the journal is published by Optica Publishing Group under the title “Journal of Optical Technology”

UDC: 004.932

Comparison of probabilistic programming languages, using the solution of clustering problems and the distinguishing of attributes as an example

For Russian citation (Opticheskii Zhurnal):

Филатов В.И., Потапов А.С. Сравнение вероятностных языков программирования на примере решения задач кластеризации и выделения признаков // Оптический журнал. 2015. Т. 82. № 8. С. 66–75.


Filatov V.I., Potapov A.S. Comparison of probabilistic programming languages, using the solution of clustering problems and the distinguishing of attributes as an example [in Russian] // Opticheskii Zhurnal. 2015. V. 82. № 8. P. 66–75.

For citation (Journal of Optical Technology):

V. I. Filatov and A. S. Potapov, "Comparison of probabilistic programming languages, using the solution of clustering problems and the distinguishing of attributes as an example," Journal of Optical Technology 82(8), 542–550 (2015). https://doi.org/10.1364/JOT.82.000542

Abstract:

The clustering problem is solved using probabilistic programming languages from two families: languages that implement graphical models (Infer.NET) and languages that implement arbitrary computable generative models (Church). The features and efficiency of the two implementations are compared. It is established that the Infer.NET implementation achieves higher accuracy and speed, but it required the imperative component of the language, which goes beyond the scope of generative models. An autoencoder, a standard element of deep-learning networks, was implemented in Church without requiring any specialized network-training methods. It is shown that general-purpose probabilistic languages hold great potential, although realizing it requires the development of more efficient inference methods.
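
To make the generative-model view of clustering concrete, here is a minimal sketch in Python (not the paper's Church or Infer.NET code; all data values, names, and parameters are illustrative assumptions). It defines a two-component Gaussian mixture and infers the cluster means and assignments with a plain Metropolis-Hastings sampler, i.e., the same kind of general-purpose inference that a language such as Church applies automatically to any generative program:

    import math
    import random

    # Hypothetical 1-D observations (illustrative values only).
    data = [1.1, 0.9, 1.2, 4.8, 5.1, 5.0]

    def log_joint(mu, assign, sigma=1.0, prior_sd=10.0):
        """Log density (up to a constant) of means, assignments, and data."""
        lp = sum(-0.5 * (m / prior_sd) ** 2 for m in mu)   # Gaussian prior on the means
        for x, z in zip(data, assign):                     # Gaussian likelihood per point
            lp += -0.5 * ((x - mu[z]) / sigma) ** 2
        return lp

    def mh(steps=5000):
        """Plain Metropolis-Hastings over cluster means and assignments."""
        mu = [random.gauss(0.0, 10.0), random.gauss(0.0, 10.0)]
        assign = [random.randrange(2) for _ in data]
        lp = log_joint(mu, assign)
        for _ in range(steps):
            if random.random() < 0.5:                      # jitter one cluster mean ...
                k = random.randrange(2)
                new_mu = mu[:]
                new_mu[k] += random.gauss(0.0, 0.5)
                new_assign = assign
            else:                                          # ... or flip one assignment
                i = random.randrange(len(data))
                new_assign = assign[:]
                new_assign[i] = 1 - new_assign[i]
                new_mu = mu
            new_lp = log_joint(new_mu, new_assign)
            # Proposals are symmetric, so accept with probability min(1, ratio).
            if math.log(random.random()) < new_lp - lp:
                mu, assign, lp = new_mu, new_assign, new_lp
        return mu, assign

    print(mh())

In a general-purpose probabilistic language only the model (priors and likelihood) would be written; the sampling loop belongs to the runtime. That division is exactly the trade-off the abstract describes: the generality of arbitrary generative models against the efficiency of specialized inference, such as the message-passing algorithms behind Infer.NET.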

Keywords:

probabilistic programming, inductive inference, generative model

Acknowledgements:

This work was carried out with the support of the Ministry of Education and Science of the Russian Federation and with partial state support for the leading universities of the Russian Federation (subsidy 074-U01).

OCIS codes: 150.1135

References:

1. N. Goodman, “The principles and practice of probabilistic programming,” ACM SIGPLAN Not. 48, No. 1, 339 (2013).
2. N. Goodman, V. Mansinghka, D. Roy, K. Bonawitz, and J. Tenenbaum, “Church: a language for generative models,” Uncertain. Artif. Intell. 8, 220 (2008).
3. H. Abelson and G. Sussman, Structure and Interpretation of Computer Programs (MIT Press, Cambridge, Mass., 1996).
4. S. Wang and M. Wand, “Using Infer.Net for statistical analyses,” Am. Stat. 65, No. 2, 115 (2011).
5. T. Minka, “A family of algorithms for approximate Bayesian inference,” Ph.D. thesis (MIT, 2001).
6. T. Minka, “Expectation propagation for approximate Bayesian inference,” Uncertain. Artif. Intell. 1, 362 (2001).
7. J. Winn and C. Bishop, “Variational message passing,” J. Mach. Learn. Res. 6, 661 (2005).
8. V. Mansinghka, “Natively probabilistic computation,” Ph.D. thesis (MIT, 2009).
9. J. Li, J. Cheng, J. Shi, and F. Huang, “Brief introduction of back propagation neural-network algorithm and its improvement,” Adv. Comput. Sci. Inf. Eng. 169, 553 (2012).
10. V. Mansinghka, T. Kulkarni, Y. Perov, and J. Tenenbaum, “Approximate Bayesian image interpretation using generative probabilistic graphics programs,” Adv. Neural Info. Process. Syst. 1, 1520 (2013).