Extreme Learning Machine training function
elm_train(
  x,
  y,
  nhid,
  actfun,
  init_weights = "normal_gaussian",
  bias = FALSE,
  moorep_pseudoinv_tol = 0.01,
  leaky_relu_alpha = 0,
  seed = 1,
  verbose = FALSE
)
x : a matrix. The columns of the input matrix should be of type numeric
y : a matrix. In case of regression the matrix should have n rows and 1 column. In case of classification it should consist of n rows and m columns, where m > 1 equals the number of unique class labels (a one-hot encoded response)
nhid : a numeric value specifying the number of hidden neurons. Must be >= 1
actfun : a character string specifying the type of activation function. It should be one of the following : 'sig' ( sigmoid ), 'sin' ( sine ), 'radbas' ( radial basis ), 'hardlim' ( hard-limit ), 'hardlims' ( symmetric hard-limit ), 'satlins' ( satlins ), 'tansig' ( tan-sigmoid ), 'tribas' ( triangular basis ), 'relu' ( rectified linear unit ) or 'purelin' ( linear )
init_weights : a character string specifying the distribution from which the input weights and the bias should be initialized. It should be one of the following : 'normal_gaussian' ( normal / Gaussian distribution with zero mean and unit variance ), 'uniform_positive' ( in the range [0,1] ) or 'uniform_negative' ( in the range [-1,1] )
bias : either TRUE or FALSE. If TRUE then bias weights will be added to the hidden layer
moorep_pseudoinv_tol : a numeric value. See the web links in the references for more details on the Moore-Penrose pseudo-inverse and specifically on the pseudo-inverse tolerance value
leaky_relu_alpha : a numeric value between 0.0 and 1.0. If 0.0 then a plain relu ( f(x) = 0 for x < 0, f(x) = x for x >= 0 ) activation function will be used, otherwise a leaky relu ( f(x) = alpha * x for x < 0, f(x) = x for x >= 0 ). It applies only if actfun equals 'relu' ( see the short sketch after this argument list )
seed : a numeric value specifying the random seed. Defaults to 1
verbose : a boolean. If TRUE then information will be printed to the console
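To make the leaky_relu_alpha behaviour concrete, the formulas above can be written as a few lines of plain R. This is a minimal sketch for illustration only; the function name leaky_relu is made up here and is not part of the package:

# illustration of the 'relu' / leaky-relu activation described above ( not package code )
# f(x) = x for x >= 0 ; f(x) = alpha * x for x < 0 ( plain relu when alpha = 0 )
leaky_relu = function(x, alpha = 0.0) {
  ifelse(x >= 0, x, alpha * x)
}

leaky_relu(c(-2, -0.5, 0, 1.5))               # plain relu :  0     0    0 1.5
leaky_relu(c(-2, -0.5, 0, 1.5), alpha = 0.1)  # leaky relu : -0.2  -0.05 0 1.5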
The input matrix should be of type numeric. This means the user should convert any character, factor or boolean columns to numeric values before using the elm_train function
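A minimal sketch of such a conversion follows; the data frame and its column names are hypothetical and serve only as an illustration:

# hypothetical data frame with numeric, character and boolean columns ( illustration only )
dat = data.frame(num_col  = c(1.2, 3.4, 5.6),
                 char_col = c('a', 'b', 'a'),
                 bool_col = c(TRUE, FALSE, TRUE))

dat$char_col = as.numeric(as.factor(dat$char_col))   # character -> factor -> numeric
dat$bool_col = as.numeric(dat$bool_col)              # TRUE / FALSE -> 1 / 0

x_numeric = as.matrix(dat)                           # numeric matrix, usable as input to elm_train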
http://arma.sourceforge.net/docs.html
https://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_inverse
https://www.kaggle.com/robertbm/extreme-learning-machine-example
http://rt.dgyblog.com/ml/ml-elm.html
library(elmNNRcpp)
#-----------
# Regression
#-----------
data(Boston, package = 'KernelKnn')
Boston = as.matrix(Boston)
dimnames(Boston) = NULL
x = Boston[, -ncol(Boston)]                     # predictors : all columns except the last one
y = matrix(Boston[, ncol(Boston)], ncol = 1)    # response as a one-column matrix
out_regr = elm_train(x, y, nhid = 20, actfun = 'purelin', init_weights = 'uniform_negative')
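Predictions for the regression fit can then be obtained with the package's elm_predict function. The lines below are a sketch; the in-sample RMSE is computed only for illustration:

pr_regr = elm_predict(out_regr, newdata = x)    # predictions on the training data ( in-sample )
rmse = sqrt(mean((y - pr_regr)^2))              # root mean squared error of the fit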
#---------------
# Classification
#---------------
data(ionosphere, package = 'KernelKnn')
x_class = ionosphere[, -c(2, ncol(ionosphere))]   # drop the 2nd column ( a single unique value ) and the response
x_class = as.matrix(x_class)
dimnames(x_class) = NULL
y_class = as.numeric(ionosphere[, ncol(ionosphere)])   # convert the factor response to numeric ( 1, 2 )
y_class_onehot = onehot_encode(y_class - 1) # class labels should begin from 0
out_class = elm_train(x_class, y_class_onehot, nhid = 20, actfun = 'relu')
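A sketch of the prediction step for the classification fit; normalize = TRUE and max.col are used here to turn the per-class output into predicted labels, and the in-sample accuracy is shown only for illustration:

pr_class  = elm_predict(out_class, newdata = x_class, normalize = TRUE)   # one column per class
pr_labels = max.col(pr_class, ties.method = "random")    # column with the largest value = predicted class
acc = mean(pr_labels == y_class)                         # in-sample accuracy ( illustration only )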