References

The following references were used for the implementations in libForest:

  • Online Random Forests / Decision Trees:

    A. Saffari, C. Leistner, J. Santner, M. Godec. On-Line Random Forests. International Conference on Computer Vision Workshops, 2009.

  • Variable Importance:

    G. Louppe, L. Wehenkel, A. Sutera, P. Geurts. Understanding Variable Importance in Forests of Randomized Trees. Advances in Neural Information Processing Systems, 2013.

    G. Louppe. Understanding Random Forests. PhD thesis, Université de Liège, Belgium, 2014.

  • Density Forests:

    A. Criminisi, J. Shotton. Density Forests. In Decision Forests for Computer Vision and Medical Image Analysis, Springer, 2013.

  • Kullback-Leibler Divergence:

    F. Pérez-Cruz. Kullback-Leibler Divergence Estimation of Continuous Distributions. International Symposium on Information Theory, 2008.

  • Kernel Density Estimation:

    B. E. Hansen. Lecture Notes on Nonparametrics. University of Wisconsin, 2009.

    P. B. Stark. Statistics 240 Lecture Notes, part 10: Density Estimation. University of California Berkeley, 2008.

    M. C. Jones, J. S. Marron, S. J. Sheather. A Brief Survey of Bandwidth Selection for Density Estimation. Journal of the American Statistical Association, 91(433), 1996.

    B. A. Turlach. Bandwidth Selection in Kernel Density Estimation: A Review. C.O.R.E. and Institut de Statistique, Université Catholique de Louvain, Belgium.

  • K-Means:

    D. Arthur, S. Vassilvitskii. k-means++: The Advantages of Careful Seeding. Proceedings of the ACM-SIAM Symposium on Discrete Algorithms, 2007.

    C. Elkan. Using the Triangle Inequality to Accelerate k-Means. International Conference on Machine Learning, 2003.

    J. Han, M. Kamber, J. Pei. Data Mining: Concepts and Techniques. Morgan Kaufmann Publishers Inc., San Francisco, CA, 2005.

Original page: https://github.com/strands-project/semantic_segmentation/blob/master/src/backend/third-party/libforest/docs/references.md