ML_EPS_REG

ML_EPS_REG = [real]
Default: ML_EPS_REG = 1E-14 

Description: Initial value for the threshold of the eigenvalues of the covariance matrix in the evidence approximation.


This threshold is used to determine which eigenvalues of the covariance matrix enter the optimization of the regularization parameters in the evidence approximation. All eigenvalues satisfying <math>\lambda_{i} / \lambda_{\mathrm{max}}</math> > ML_EPS_REG contribute to that optimization.
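
For orientation, the following is a minimal sketch of the standard evidence-approximation (MacKay-type) update in which these eigenvalues and the ratio <math>\sigma_{v}^{2}/\sigma_{w}^{2}</math> appear; the exact expressions used by VASP may differ in notation and detail:

:<math>
\gamma = \sum_{i} \frac{\lambda_{i}}{\lambda_{i} + \sigma_{v}^{2}/\sigma_{w}^{2}}, \qquad
\sigma_{w}^{2} = \frac{\bar{w}^{T}\bar{w}}{\gamma}, \qquad
\sigma_{v}^{2} = \frac{\lVert y - \Phi \bar{w}\rVert^{2}}{N - \gamma}
</math>

Here <math>\bar{w}</math> denotes the fitted weights, <math>\Phi</math> the design matrix, <math>y</math> the training data, and <math>N</math> the number of training equations; this notation is assumed for illustration only.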

The smaller the value of ML_EPS_REG, the smaller the error of the fit, but at the same time the effects of overfitting increase. We determined empirically that the default value of 1E-14 is safe in most cases.

The parameter ML_EPS_REG is not necessarily kept constant throughout the calculation: if at any point in the iterations of the evidence approximation the square of the quadratic norm of errors (eighth column of REGR/REGRF in ML_LOGFILE) gets too big (more than 1.2 times larger than before), then ML_EPS_REG is doubled.
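
As an illustration of this heuristic (not VASP's actual implementation), a minimal Python sketch of such an adaptive threshold update, assuming <code>err_sq</code> holds the squared error norm of the current evidence-approximation iteration and <code>err_sq_prev</code> that of the previous one:

<pre>
def update_eps_reg(eps_reg, err_sq, err_sq_prev, growth_limit=1.2):
    """Double the eigenvalue threshold if the squared error norm grew too much.

    Illustrative sketch of the heuristic described above; the names and the
    calling context are assumptions, not VASP internals.
    """
    if err_sq_prev is not None and err_sq > growth_limit * err_sq_prev:
        eps_reg *= 2.0  # error grew by more than a factor of 1.2 -> double the threshold
    return eps_reg

# Example: the squared error norm grows by 50 % between two iterations,
# so the threshold is doubled from 1E-14 to 2E-14.
print(update_eps_reg(1e-14, err_sq=1.5, err_sq_prev=1.0))
</pre>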

The seventh entry of REGR/REGRF in the ML_LOGFILE shows the ratio of the regularization (<math>\sigma_{v}^{2}/\sigma_{w}^{2}</math>) and the largest eigenvalue. Usually this number has many varying digits. If it becomes a "well rounded" number (e.g. 1.00000000E-14), this indicates that the cap for the current ML_EPS_REG has been reached, which means that regularization becomes crucial.
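
As a small post-processing example, the following Python sketch scans ML_LOGFILE for this situation. It assumes that the relevant lines start with the REGR or REGRF label and that the ratio is the seventh whitespace-separated field; adapt the column index to the actual file layout:

<pre>
import math

EPS_REG = 1e-14  # current ML_EPS_REG (the default value is assumed here)

with open("ML_LOGFILE") as f:
    for line in f:
        fields = line.split()
        # Assumed layout: lines labelled REGR/REGRF, with the ratio
        # sigma_v^2/sigma_w^2 over the largest eigenvalue in the seventh field.
        if fields and fields[0] in ("REGR", "REGRF") and len(fields) >= 7:
            try:
                ratio = float(fields[6])
            except ValueError:
                continue  # skip header or non-numeric lines
            if math.isclose(ratio, EPS_REG, rel_tol=1e-6):
                print("regularization capped at ML_EPS_REG:", line.strip())
</pre>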




Related Tags and Sections

ML_LMLFF, ML_IALGO_LINREG, ML_IREG, ML_SIGV0, ML_SIGW0

Examples that use this tag