Description of columns
This section details the meaning of the columns in the table below.

Solvers
These columns are about the algorithms the software uses to solve the linear algebra problems that arise in Gaussian process regression.
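As a concrete illustration of what a solver does here, the sketch below is an invented NumPy example (not taken from any of the listed packages): it computes the regression weights (K + σ²I)⁻¹y once with an exact Cholesky factorization and once with the iterative conjugate-gradient method that many scalable implementations rely on.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 50))
y = np.sin(x) + 0.1 * rng.standard_normal(50)

# Prior covariance plus Gaussian noise on the diagonal.
K = rbf(x, x) + 0.1 ** 2 * np.eye(50)

# Exact solver: Cholesky factorization, O(n^3).
L = np.linalg.cholesky(K)
alpha_exact = np.linalg.solve(L.T, np.linalg.solve(L, y))

# Iterative solver: conjugate gradients, which only needs products K @ v,
# the basis of many approximate/scalable GP implementations.
def cg(A, b, tol=1e-10, maxiter=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

alpha_cg = cg(K, y)  # agrees closely with the exact solution
```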
Input

These columns are about the points on which the Gaussian process is evaluated, i.e. the x in f(x).
* ND: whether multidimensional input is supported. If it is, multidimensional output is always possible by adding a dimension to the input, even without direct support.
* Non-real: whether arbitrary non-real input is supported.
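The multidimensional-output trick mentioned under ND can be sketched as follows. This is a minimal NumPy illustration (the kernel, the synthetic data, and the cross-output correlation value are all invented for the example): a second input coordinate indexes which output each datapoint belongs to, so a single GP over the augmented input models two outputs.

```python
import numpy as np

def kernel(A, B, ell=1.0, task_corr=0.8):
    """Kernel on augmented inputs (x, task): an RBF over x, multiplied by
    a correlation that depends on whether the task indices match."""
    x_a, t_a = A[:, 0], A[:, 1]
    x_b, t_b = B[:, 0], B[:, 1]
    k_x = np.exp(-0.5 * (x_a[:, None] - x_b[None, :]) ** 2 / ell ** 2)
    k_t = np.where(t_a[:, None] == t_b[None, :], 1.0, task_corr)
    return k_x * k_t

# Two outputs observed at the same 1-D locations, stacked into one dataset
# whose input has an extra "which output" dimension.
x = np.linspace(0, 5, 8)
X = np.vstack([np.column_stack([x, np.zeros_like(x)]),
               np.column_stack([x, np.ones_like(x)])])
y = np.concatenate([np.sin(x), 0.9 * np.sin(x) + 0.1])

K = kernel(X, X) + 1e-6 * np.eye(len(X))  # small jitter for stability
alpha = np.linalg.solve(K, y)

# Predicting output 1 at a new location is just evaluating the same GP
# at the augmented test input (x*, task=1).
X_new = np.array([[2.5, 1.0]])
mean = kernel(X_new, X) @ alpha
```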
Output

These columns are about the values yielded by the process, and how they are connected to the data used in the fit.
* Likelihood: whether arbitrary non-Gaussian likelihoods for the data are supported.
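For the baseline case of a Gaussian likelihood, which every listed package supports, the connection between the fitted data and the values yielded by the process is the standard posterior formula. The following self-contained NumPy sketch (synthetic data and an RBF kernel chosen purely for illustration) computes the posterior mean and standard deviation at test points:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0, 6, 30)
y = np.cos(x) + 0.05 * rng.standard_normal(30)
noise = 0.05

xs = np.linspace(0, 6, 100)            # test points
K = rbf(x, x) + noise ** 2 * np.eye(30)
Ks = rbf(xs, x)

# Posterior of the process given the data, under a Gaussian likelihood:
#   mean = Ks K^-1 y,   cov = Kss - Ks K^-1 Ks^T
alpha = np.linalg.solve(K, y)
mean = Ks @ alpha
cov = rbf(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)
std = np.sqrt(np.clip(np.diag(cov), 0, None))
```

Non-Gaussian likelihoods replace this closed form with an approximation (e.g. Laplace, expectation propagation, or variational inference), which is what the Likelihood column tracks.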
Hyperparameters

These columns are about finding values of variables which enter the definition of the specific problem but cannot be inferred by the Gaussian process fit itself, for example parameters in the formula of the kernel.
* Prior: whether arbitrary priors on the hyperparameters can be specified.
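A typical way such packages choose hyperparameters is to maximize the log marginal likelihood of the data. The sketch below is illustrative NumPy code, with a grid search standing in for the gradient-based optimizers real libraries use; it selects an RBF length-scale this way. Where a hyperparameter prior is supported, its log density would simply be added to the objective, turning the estimate into a MAP estimate.

```python
import numpy as np

def rbf(a, b, ell):
    """Squared-exponential kernel matrix with length-scale ell."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 40)
y = np.sin(x) + 0.1 * rng.standard_normal(40)

def log_marginal(ell, noise=0.1):
    """Log marginal likelihood of y under an RBF-kernel GP."""
    K = rbf(x, x, ell) + noise ** 2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

# Grid search over the length-scale; real packages optimize with gradients.
ells = np.linspace(0.1, 5.0, 50)
scores = np.array([log_marginal(e) for e in ells])
best = ells[np.argmax(scores)]
```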
Linear transformations

These columns are about the possibility of fitting datapoints simultaneously to a process and to linear transformations of it.
* Deriv.: whether it is possible to take an arbitrary number of derivatives, up to the maximum allowed by the smoothness of the kernel, for any differentiable kernel. Partial support may mean, for example, a cap on the order of differentiation or an implementation limited to some kernels. Integrals can be obtained indirectly from derivatives.
* Finite: whether arbitrary finite linear transformations are allowed on the specified datapoints.
* Sum: whether it is possible to sum various kernels and access separately the processes corresponding to each addend. It is a particular case of a finite linear transformation, but it is listed separately because it is a common feature.
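The Sum feature can be illustrated directly: if the prior is f = f1 + f2 with independent components, the posterior mean of each addend reuses the weights computed from the summed kernel. A minimal NumPy sketch, with synthetic trend-plus-oscillation data and kernels chosen for illustration:

```python
import numpy as np

def rbf(a, b, ell, var=1.0):
    """Squared-exponential kernel with length-scale ell and variance var."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 60)
trend = 0.5 * x                       # slow component
wiggle = np.sin(3 * x)                # fast component
y = trend + wiggle + 0.05 * rng.standard_normal(60)

K1 = rbf(x, x, ell=5.0, var=10.0)     # long length-scale: models the trend
K2 = rbf(x, x, ell=0.5)               # short length-scale: models the wiggle
K = K1 + K2 + 0.05 ** 2 * np.eye(60)

# Because the prior is f = f1 + f2 with independent f1 and f2, the
# posterior mean of each addend is its own kernel times the shared weights.
alpha = np.linalg.solve(K, y)
f1_mean = K1 @ alpha                  # recovered trend component
f2_mean = K2 @ alpha                  # recovered oscillating component
```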