| sits_mlp {sits} | R Documentation |
Use a multi-layer perceptron algorithm to classify data. This function is a front-end to the "keras" R package; please refer to that package's documentation for more details.
sits_mlp(
  samples = NULL,
  layers = c(512, 512, 512),
  activation = "relu",
  dropout_rates = c(0.1, 0.2, 0.3),
  optimizer = keras::optimizer_adam(learning_rate = 0.001),
  epochs = 100,
  batch_size = 64,
  validation_split = 0.2,
  verbose = 0
)
samples: Time series with the training samples.

layers: Vector with the number of hidden nodes in each layer.

activation: Vector with the names of activation functions. Valid values are 'relu', 'elu', 'selu', 'sigmoid'.

dropout_rates: Vector with the dropout rates (0,1) for each layer.

optimizer: Function with a pointer to the optimizer function (default is optimizer_adam()). Options are optimizer_adadelta(), optimizer_adagrad(), optimizer_adam(), optimizer_adamax(), optimizer_nadam(), optimizer_rmsprop(), optimizer_sgd().

epochs: Number of iterations to train the model.

batch_size: Number of samples per gradient update.

validation_split: Number between 0 and 1. Fraction of the training data set apart for validation; the model will evaluate the loss and any model metrics on this data at the end of each epoch.

verbose: Verbosity mode (0 = silent, 1 = progress bar, 2 = one line per epoch).
Either a model to be passed to sits_predict or a function prepared to be called further.
The parameters for the MLP were chosen based on the work by Wang et al. (2017), which takes multilayer perceptrons as the baseline for time series classification: (a) three layers with 512 neurons each, set by the parameter 'layers'; (b) the 'relu' activation function; (c) dropout rates of 10%, 20%, and 30% for the three layers; (d) the "optimizer_adam" optimizer (default value); (e) 100 training steps ('epochs'); (f) a 'batch_size' of 64, which indicates how many time series are used as input at a given step; (g) a 'validation_split' of 20%, which is randomly set aside for validation.
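As a sketch of how these defaults can be overridden (not run; assumes the sits and keras packages are installed, and the layer sizes, dropout rates, and optimizer below are illustrative choices, not recommendations):

```r
# Illustrative only: a smaller two-layer MLP with a different optimizer.
# Each entry in 'dropout_rates' pairs with the corresponding entry in 'layers'.
mlp_spec <- sits_mlp(
  layers = c(256, 256),              # two hidden layers of 256 nodes
  activation = "relu",
  dropout_rates = c(0.2, 0.3),       # one dropout rate per layer
  optimizer = keras::optimizer_rmsprop(learning_rate = 0.001),
  epochs = 150,
  batch_size = 32,
  validation_split = 0.2
)
# The specification is then passed to sits_train() with labelled samples,
# e.g. dl_model <- sits_train(samples_mt_ndvi, mlp_spec)
```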
Gilberto Camara, gilberto.camara@inpe.br
Rolf Simoes, rolf.simoes@inpe.br
Hassan Fawaz, Germain Forestier, Jonathan Weber, Lhassane Idoumghar, and Pierre-Alain Muller, "Deep learning for time series classification: a review", Data Mining and Knowledge Discovery, 33(4): 917–963, 2019.
Zhiguang Wang, Weizhong Yan, and Tim Oates, "Time series classification from scratch with deep neural networks: A strong baseline", 2017 international joint conference on neural networks (IJCNN).
Implementation based on the python keras implementation provided in https://github.com/hfawaz/dl-4-tsc.
## Not run:
# Retrieve the set of samples for the Mato Grosso region
data(samples_modis_4bands)
samples_mt_ndvi <- sits_select(samples_modis_4bands, bands = "NDVI")
# Build a machine learning model based on deep learning
dl_model <- sits_train(samples_mt_ndvi, sits_mlp())
# Get a point with a 16-year time series
point_ndvi <- sits_select(point_mt_6bands, bands = "NDVI")
# Classify the point
point_class <- sits_classify(point_ndvi, dl_model)
# Plot the classified point
plot(point_class)
## End(Not run)