Human Activity Recognition Simulink Model for Smartphone Deployment

This example shows how to prepare a Simulink® model that classifies human activity based on smartphone sensor signals for code generation and smartphone deployment. The example provides two Simulink models that are ready for deployment to an Android device and an iOS device. After you install the required support package for a target device, train the classification model and deploy the Simulink model to the device.

Load Sample Data Set

Load the humanactivity data set.

load humanactivity

The humanactivity data set contains 24,075 observations of five different physical human activities: Sitting, Standing, Walking, Running, and Dancing. Each observation has 60 features extracted from acceleration data measured by smartphone accelerometer sensors. The data set contains the following variables:

  • actid — Response vector containing the activity IDs as integers: 1, 2, 3, 4, and 5, representing Sitting, Standing, Walking, Running, and Dancing, respectively

  • actnames — Activity names corresponding to the integer activity IDs

  • feat — Feature matrix of 60 features for 24,075 observations

  • featlabels — Labels of the 60 features

The Sensor HAR (human activity recognition) App [1] was used to create the humanactivity data set. When measuring the raw acceleration data with this app, a person placed a smartphone in a pocket so that the smartphone was upside down and the screen faced toward the person. The software then calibrated the measured raw data accordingly and extracted the 60 features from the calibrated data. For details about the calibration and feature extraction, see [2] and [3], respectively. The Simulink models described later also use the raw acceleration data and include blocks for calibration and feature extraction.
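Before partitioning the data, you can optionally inspect how the observations are distributed across the five activities. This is a minimal sketch, not part of the original example; it assumes the Statistics and Machine Learning Toolbox function tabulate is available.

```matlab
% Optional check (sketch): view the number of observations per activity ID.
% tabulate prints one row per class with its count and percentage.
tabulate(actid)
```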

Prepare Data

This example uses 90% of the observations to train a model that classifies the five types of human activities and 10% of the observations to validate the trained model. Use cvpartition to specify a 10% holdout for the test set.

rng('default') % For reproducibility
Partition = cvpartition(actid,'Holdout',0.10);
trainingInds = training(Partition); % Indices for the training set
XTrain = feat(trainingInds,:);
YTrain = actid(trainingInds);
testInds = test(Partition); % Indices for the test set
XTest = feat(testInds,:);
YTest = actid(testInds);

Convert the feature matrix XTrain and the response vector YTrain into a table to load the training data set in the Classification Learner app.

tTrain = array2table([XTrain YTrain]);

Specify the variable name for each column of the table.

tTrain.Properties.VariableNames = [featlabels' {'Activities'}];

Train Boosted Tree Ensemble Using Classification Learner App

Train a classification model by using the Classification Learner app. To open the Classification Learner app, enter classificationLearner at the command line. Alternatively, click the Apps tab, and click the arrow at the right of the Apps section to open the gallery. Then, under Machine Learning and Deep Learning, click Classification Learner.

On the Classification Learner tab, in the File section, click New Session and select From Workspace.

In the New Session dialog box, click the arrow for Data Set Variable, and then select the table tTrain. Classification Learner detects the predictors and the response from the table.

The default validation option is 5-fold cross-validation, which protects against overfitting. Click Start Session. Classification Learner loads the data set and plots a scatter plot of the first two features.

On the Classification Learner tab, click the arrow at the right of the Model Type section to open the gallery. Then, under Ensemble Classifiers, click Boosted Trees.

The Current Model pane of the Data Browser displays the default settings of the boosted tree ensemble model.

On the Classification Learner tab, in the Training section, click Train. When the training is complete, the History pane of the Data Browser displays the 5-fold, cross-validated classification accuracy.

On the Classification Learner tab, in the Export section, click Export Model, and then select Export Compact Model. Click OK in the dialog box. The structure trainedModel appears in the MATLAB Workspace. The field ClassificationEnsemble of trainedModel contains the compact model. Extract the trained model from the structure.

classificationEnsemble = trainedModel.ClassificationEnsemble;

Train Boosted Tree Ensemble at Command Line

Alternatively, you can train the same classification model at the command line.

template = templateTree('MaxNumSplits',20);
classificationEnsemble = fitcensemble(XTrain,YTrain, ...
    'Method','AdaBoostM2', ...
    'NumLearningCycles',30, ...
    'Learners',template, ...
    'LearnRate',0.1, ...
    'ClassNames',[1; 2; 3; 4; 5]);

Perform 5-fold cross-validation for classificationEnsemble and compute the validation accuracy.

partitionedModel = crossval(classificationEnsemble,'KFold',5);
validationAccuracy = 1-kfoldLoss(partitionedModel)
validationAccuracy = 0.9830

Evaluate Performance on Test Data

Evaluate performance on the test data set.

testAccuracy = 1-loss(classificationEnsemble,XTest,YTest)
testAccuracy = 0.9763

The trained model correctly classifies 97.63% of the human activities on the test data set. This result confirms that the trained model does not overfit to the training data set.
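To see which activities the model confuses with one another, you can optionally plot a confusion chart on the test set. This is a sketch, not part of the original example; it assumes the confusionchart function (Statistics and Machine Learning Toolbox, R2018b or later) is available.

```matlab
% Optional check (sketch): per-class performance on the test set.
% Rows are true activity IDs, columns are predicted activity IDs.
predictedLabels = predict(classificationEnsemble,XTest);
confusionchart(YTest,predictedLabels,'RowSummary','row-normalized');
```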

Note that the accuracy values can vary slightly depending on your operating system.

Save Trained Model

For code generation that includes a classification model object, use saveLearnerForCoder and loadLearnerForCoder.

Save the trained model by using saveLearnerForCoder.

saveLearnerForCoder(classificationEnsemble,'EnsembleModel.mat');

The function block predictActivity in the Simulink models loads the trained model by using loadLearnerForCoder and uses the trained model to classify new data.
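The body of such a MATLAB Function block can be sketched as follows. This is an assumption about the block's contents, not the shipped implementation; it uses the documented pattern of caching the loaded model in a persistent variable so the MAT-file is read only once during simulation and in generated code.

```matlab
function label = predictActivity(features)
% Sketch (assumption) of the predictActivity MATLAB Function block:
% load the trained ensemble once, then classify each feature vector.
persistent model
if isempty(model)
    model = loadLearnerForCoder('EnsembleModel');
end
label = predict(model,features); % integer 1-5 (Sitting ... Dancing)
end
```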

Deploy Simulink Model to Device

Now that you have prepared a classification model, you can open the Simulink model for your type of smartphone and deploy it to your device. Note that the Simulink model requires the EnsembleModel.mat file and the calibration matrix file slexHARAndroidCalibrationMatrix.mat or slexHARiOSCalibrationMatrix.mat. If you click the button located in the upper-right section of this page and open this example in MATLAB®, then MATLAB® opens the example folder that includes these calibration matrix files.

Type slexHARAndroidExample to open the Simulink model for Android deployment.

Type slexHARiOSExample to open the Simulink model for iOS deployment. You can open the model on the Mac OS platform.

The two Simulink models classify human activity based on acceleration data measured by a smartphone sensor. The models include the following blocks:

  • The Accelerometer block receives raw acceleration data from accelerometer sensors on the device.

  • The calibrate block is a MATLAB Function block that calibrates the raw acceleration data. This block uses the calibration matrix in the slexHARAndroidCalibrationMatrix.mat file or the slexHARiOSCalibrationMatrix.mat file.

  • The display blocks Acc X, Acc Y, and Acc Z are connected to the calibrate block and display the calibrated data points for each axis on the device.

  • Each of the Buffer blocks, X Buffer, Y Buffer, and Z Buffer, buffers 32 samples of an accelerometer axis with 12 samples of overlap between buffered frames. After collecting 20 new samples, each Buffer block joins them with 12 samples from the previous frame and passes the resulting 32 samples to the extractFeatures block. Each Buffer block receives an input sample every 0.1 second and outputs a buffered frame of 32 samples every 2 seconds.

  • The extractFeatures block is a MATLAB Function block that extracts 60 features from a buffered frame of 32 accelerometer samples. This function block uses DSP System Toolbox™ and Signal Processing Toolbox™.

  • The predictActivity block is a MATLAB Function block that loads the trained model from the EnsembleModel.mat file by using loadLearnerForCoder and classifies the user activity using the extracted features. The output is an integer between 1 and 5, corresponding to Sitting, Standing, Walking, Running, and Dancing, respectively.

  • The Predicted Activity block displays the classified user activity values on the device.

  • The Video Output subsystem uses a multiport switch block to choose the corresponding user activity image data to display on the device. The Convert to RGB block decomposes the selected image into separate RGB vectors and passes the image to the Activity Display block.
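The frame timing described for the Buffer blocks follows from simple arithmetic, sketched here for clarity (the variable names are illustrative, not part of the model):

```matlab
% Sketch: timing of the overlapping frames produced by the Buffer blocks.
frameSize   = 32;                      % samples per buffered frame
overlap     = 12;                      % samples shared with previous frame
newPerFrame = frameSize - overlap;     % 20 new samples per frame
sampleTime  = 0.1;                     % seconds between accelerometer samples
framePeriod = newPerFrame*sampleTime   % seconds between output frames (2 s)
```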

To deploy the Simulink model to your device, follow the steps in Run Model on Android Devices (Simulink Support Package for Android Devices) or Run Model on Apple iOS Devices (Simulink Support Package for Apple iOS Devices). Run the model on your device, place the device in the same way as described earlier for collecting the training data, and try the five activities. The model displays the classified activity accordingly.

To ensure the accuracy of the model, you need to place your device in the same way as described for collecting the training data. If you want to place your device in a different location or orientation, then collect the data in your own way and use your data to train the classification model.

The accuracy of the model can differ from the accuracy on the test data set (testAccuracy), depending on the device. To improve the model, you can consider using additional sensors and updating the calibration matrix. Also, you can add another output block for audio feedback to the output subsystem by using Audio Toolbox™. Use a ThingSpeak™ write block to publish classified activities and acceleration data from your device to the Internet of Things. For details, see https://thingspeak.com/.

References

[1] El Helou, A. Sensor HAR recognition App. MathWorks File Exchange. //www.tatmou.com/matlabcentral/fileexchange/54138-sensor-har-recognition-app

[2] STMicroelectronics, AN4508 Application note. “Parameters and calibration of a low-g 3-axis accelerometer.” 2014.

[3] El Helou, A. Sensor Data Analytics. MathWorks File Exchange. //www.tatmou.com/matlabcentral/fileexchange/54139-sensor-data-analytics--french-webinar-code-
