ANN Classification with ‘nnet’ Package in R

Last Updated : 21 Aug, 2023

Artificial Neural Networks (ANNs) are a type of machine learning algorithm that are modeled after the structure and function of the human brain. ANNs are used for both regression and classification problems. In classification problems, ANNs can be used to classify input data into one of several categories.

In R, the nnet package provides functions for creating and training feed-forward ANNs with a single hidden layer. Training is supervised: the weights of the network are adjusted to minimize the error between the predicted output and the actual output, which is the idea behind backpropagation (internally, nnet fits the weights with a quasi-Newton BFGS optimizer rather than plain gradient descent).
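
As a rough illustration of the underlying idea (this is generic gradient descent, not nnet's actual optimizer), a single update moves a weight a small step in the direction that reduces the error:

R

# Illustrative only: one gradient-descent-style update for a single weight
eta  <- 0.01   # learning rate
w    <- 0.50   # current weight
grad <- 0.20   # hypothetical gradient of the error with respect to w
w    <- w - eta * grad
w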

To use the nnet package for classification, you first need to prepare your data. The data should be in a data frame, with the rows representing the observations and the columns representing the input variables, plus one column containing the class labels. The class labels should be stored as a factor.
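
For instance, a toy data frame could be prepared like this (my_data and its column names are hypothetical; the important part is that the class column is a factor):

R

# Hypothetical toy data: two numeric inputs and a class label
my_data <- data.frame(
  x1    = c(1.2, 0.7, 3.1, 2.5, 0.9, 2.8),
  x2    = c(0.3, 1.8, 0.9, 2.2, 1.1, 0.4),
  label = c("a", "b", "a", "b", "b", "a")
)

# Store the class labels as a factor
my_data$label <- as.factor(my_data$label)
str(my_data)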

Once your data is prepared, you can use the nnet function to create and train an ANN. Note that nnet fits networks with a single hidden layer; the size argument sets the number of units in that layer. Other arguments include the maximum number of iterations for the training algorithm (maxit), an optional weight-decay penalty (decay), the convergence tolerances (abstol and reltol), and the choice of a linear, logistic, or softmax output layer (linout and softmax).

After training the ANN, you can use it to make predictions on new data using the predict function. The predict function takes the trained ANN and the new data as arguments and returns either the raw network outputs or, with type = "class", the predicted class labels.
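
As a minimal sketch of these two calls, assuming the hypothetical my_data frame prepared above (size, maxit and decay are illustrative values, not recommendations):

R

# Load the nnet package
library(nnet)

# Train a network with 3 units in the hidden layer
fit <- nnet(label ~ ., data = my_data, size = 3, maxit = 200, decay = 1e-4)

# Predicted class labels (here we simply reuse my_data as the "new" data)
pred <- predict(fit, newdata = my_data, type = "class")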

Here is an example of how to use the nnet package for classification:

R

# Load the nnet package
library(nnet)
 
# Load the built-in iris dataset
data(iris)
names(iris)
 
# Split the data into training and testing sets
set.seed(123)
train_idx <- sample(nrow(iris), nrow(iris)*0.7)
train_data <- iris[train_idx, ]
test_data <- iris[-train_idx, ]
 
# Train an ANN with one hidden layer
model <- nnet(Species ~ ., data = train_data, size = 5, linout = FALSE)
 
# Make predictions on the testing set
predictions <- predict(model, newdata = test_data, type = "class")
 
# Calculate the accuracy of the model
accuracy <- sum(predictions == test_data$Species) / nrow(test_data)
cat("Accuracy:", round(accuracy, 2))


Output

We begin by loading the nnet package, which provides the functionality for building artificial neural networks in R.

Next, we load the iris dataset, which ships with base R. The iris dataset contains sepal and petal measurements for 150 flowers from three species: setosa, versicolor and virginica.

We then split the iris dataset into training and testing sets. We use the set.seed function to ensure that the results are reproducible, randomly sampling 70% of the rows for the training set and keeping the remaining 30% for the testing set.

We then train an artificial neural network with one hidden layer using the nnet function. We specify the formula Species ~ . to indicate that we want to predict the Species variable using all other variables in the dataset. We set the number of units in the hidden layer to 5 with size = 5 and keep linout = FALSE (the default), so the output layer uses a non-linear activation; because Species is a factor with three levels, nnet automatically builds a classification network with a softmax output stage.

After training the model, we use the predict function to make predictions on the testing set. We pass in the trained model and the testing data, and specify type = "class" to indicate that we want to predict the class labels rather than the probabilities.

Finally, we calculate the accuracy of the model by comparing the predicted class labels to the actual class labels in the testing set. We divide the number of correctly predicted instances by the total number of instances in the testing set. We print the accuracy of the model using cat.
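
As an optional follow-up (not part of the original example), a confusion matrix gives a more detailed picture than overall accuracy:

R

# Cross-tabulate predicted vs. actual species on the test set
table(Predicted = predictions, Actual = test_data$Species)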

Example 2 

R

# Load the nnet package
library(nnet)
 
# Load the mtcars dataset
data(mtcars)
 
# Split the dataset into training and testing sets
set.seed(123)
train_index <- sample(1:nrow(mtcars), 0.7*nrow(mtcars))
train_data <- mtcars[train_index, -1]
train_classes <- ifelse(mtcars[train_index, 1] > median(mtcars$mpg), 1, 0)
test_data <- mtcars[-train_index, -1]
test_classes <- ifelse(mtcars[-train_index, 1] > median(mtcars$mpg), 1, 0)
 
# Train the neural network
mtcars_ann <- nnet(train_data, train_classes, size = 5, maxit = 1000)
 
# Predict the classes for the test data
test_predictions <- round(predict(mtcars_ann, test_data))


Output

First, we load the nnet package, which is necessary for training the neural network. Then, we load the mtcars dataset, which we will use for training and testing our model.

Next, we split the dataset into training and testing sets. The set.seed(123) line ensures that we get the same random split every time we run the code. We use the sample() function to randomly select 70% of the rows in the mtcars dataset to use as training data. We create two subsets of the data, train_data and test_data, containing the predictor variables (all columns except the first one) for the training and testing data, respectively. We also create two subsets of the classes, train_classes and test_classes, which contain binary values indicating whether each car in the training or testing set has an mpg value above or below the median mpg value for the entire dataset.

After splitting the data, we use the nnet() function to train the neural network. We pass in the training data and classes as the first two arguments, followed by the size and maxit parameters, which determine the number of hidden neurons and maximum number of iterations for the training algorithm.

Finally, we use the predict() function to generate predictions for the testing data using the trained neural network. Because the network's logistic output unit produces continuous values between 0 and 1, the round() function is used to convert them to binary values of 0 or 1. These binary predictions can then be compared to the binary test classes to evaluate the performance of the neural network.
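
As an optional follow-up (not part of the original example), the comparison can be summarised as an accuracy figure in the same way as in the first example:

R

# Proportion of test cars whose high/low-mpg class was predicted correctly
test_accuracy <- mean(test_predictions == test_classes)
cat("Test accuracy:", round(test_accuracy, 2))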

Conclusion:

In conclusion, the nnet package in R provides a straightforward and effective way to build artificial neural networks for classification problems, whether binary (as in the mtcars example) or multi-class (as in the iris example). By specifying a formula and the number of hidden units, we can quickly train a model and make predictions on new data. The predict function makes it easy to generate class labels or probabilities for new observations, and the model's performance can be evaluated with standard metrics such as accuracy. With its simplicity and flexibility, the nnet package is a valuable tool for anyone interested in using artificial neural networks for classification problems in R.


