Review
AI-Driven Sensing Technology: Review
Long Chen † , Chenbin Xia † , Zhehui Zhao, Haoran Fu * and Yunmin Chen
Department of Civil Engineering, Zhejiang University, Hangzhou 310058, China; [email protected] (L.C.);
[email protected] (C.X.); [email protected] (Z.Z.)
* Correspondence: [email protected]
† These authors contributed equally to this work.
Abstract: Machine learning and deep learning technologies are rapidly advancing the capabilities of
sensing technologies, bringing about significant improvements in accuracy, sensitivity, and adaptabil-
ity. These advancements are making a notable impact across a broad spectrum of fields, including
industrial automation, robotics, biomedical engineering, and civil infrastructure monitoring. The core
of this transformative shift lies in the integration of artificial intelligence (AI) with sensor technology,
focusing on the development of efficient algorithms that drive both device performance enhance-
ments and novel applications in various biomedical and engineering fields. This review delves into
the fusion of ML/DL algorithms with sensor technologies, shedding light on their profound impact
on sensor design, calibration and compensation, object recognition, and behavior prediction. Through
a series of exemplary applications, the review showcases the potential of AI algorithms to signifi-
cantly upgrade sensor functionalities and widen their application range. Moreover, it addresses the
challenges encountered in exploiting these technologies for sensing applications and offers insights
into future trends and potential advancements.
Citation: Chen, L.; Xia, C.; Zhao, Z.; Fu, H.; Chen, Y. AI-Driven Sensing Technology: Review. Sensors 2024, 24, 2958. https://doi.org/10.3390/s24102958
Academic Editor: Maurizio Talamo
Received: 31 March 2024; Revised: 30 April 2024; Accepted: 4 May 2024; Published: 7 May 2024
Copyright: © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction
In the current era marked by swift technological evolution, sensing technology occupies a pivotal position in diverse sectors, including advanced industrial processes [1], robotics [2], biomedical engineering [3–6], and civil engineering [7,8]. These sensors employ sophisticated structural design [9–12] and innovative material optimization [13,14] in their sensitive units to transform stimuli from objects into electrical or optical signals. This conversion process is further refined through stages like signal amplification, filtering, and impedance matching, enhancing the signal's quality, stability, and interoperability. However, despite continuous technological innovations, improvements in sensor accuracy, sensitivity, and adaptability still face bottlenecks due to the precision limitations of micro-nano fabrication processes [15], the pace of new material development and application [16], intrinsic noise limitations of circuit components [17], and the complexity and real-time requirements of signal processing algorithms [18].
These bottlenecks lead to a variety of unique and complex challenges across different application domains. For instance, in industrial automation, the precision and sensitivity of sensors on the production line affect the speed and accuracy of product line operations and defect detection, making sensors crucial for ensuring production efficiency and product quality [19]. In the realm of robotics, sensors are required to offer high accuracy while also possessing multifunctional adaptability, enabling robots, e.g., unmanned aerial vehicles [20] and deep-sea robots [21], to adapt to fluctuating work environments and tasks [22]. Similarly, in biomedical engineering and structural health monitoring, sensors are tasked with identifying subtle physiological or structural changes, ensuring both high precision and reliability even under complex or extreme conditions [23], such as monitoring physiological data of human skin during motion [24] and monitoring railway responses in permafrost regions [25].
In this context, the advent of machine learning and deep learning technologies stands
as a crucial breakthrough in overcoming traditional technological constraints [26]. These
cutting-edge algorithms uncover intricate patterns and correlations by autonomously
analyzing vast data sets, thus optimizing sensor performance. They enhance sensor ac-
curacy [27] and sensitivity [28] under specific conditions and bolster adaptability [29]
to environmental shifts. More critically, beyond monitoring, these technologies enable
efficient identification and predictive capabilities, heralding a new era in machine main-
tenance [30,31], disease diagnosis [32–34], structural damage prevention [35], and the
environmental awareness and adaptability of robots [36–38]. Extensive research now
concentrates on merging artificial intelligence with sensor technology [39], ranging from
performance enhancement algorithms [40] and algorithm-driven device design [41] to
broad applications in biomedical [42] and engineering fields [43].
Machine learning and deep learning’s contributions to sensing technology are seg-
mented into four principal areas: sensor design, calibration and compensation, object
recognition and classification, and behavior prediction. In this paper, we delve into the
vital functions of artificial intelligence algorithms within these realms, highlighting the
latest progress in innovative applications. This paper first discusses in Section 2 the role of
artificial intelligence algorithms in guiding sensor design. Sections 3–5 then explore the impact of algorithms on sensor calibration and compensation,
object recognition and classification, and behavior prediction. The paper concludes by
discussing the challenges of advancing sensing technology with these approaches and
offers a forward-looking perspective on future trends.
Figure 1. Application of ML/DL algorithms in sensor design. (a) Schematic illustrations of the inverse design process for a seashell mesosurface utilized in sensor integration. Adapted with permission. Copyright 2023, American Association for the Advancement of Science [41]. (b–d) Enhancing sensor signal processing through DL algorithm integration to simulate human neural transmission. (b) Left: depiction of a wireless parallel pressure cognition platform (WiPPCoP) on a robotic hand, capturing tactile signals simultaneously at unique frequencies. Right: implementation of WiPPCoP in a robotic system. Reproduced with permission. Copyright 2020, Wiley-VCH GmbH, Weinheim, Germany [37]. (c) Structure of a CNN trained with testing data. (d) Illustration of the human somatosensory system's process for transmitting pressure sensations. Reproduced with permission. Copyright 2020, Wiley-VCH GmbH, Weinheim, Germany [37].
2.2. Performance Enhancement
Integrating machine learning algorithms into the signal processing phase of sensors can significantly enhance the accuracy of the devices. Samuel Rosset et al. used machine learning to detect pressure and its location on sensors, applying varied frequency signals to collect impedance and capacitance data. These data were analyzed to identify key statistical features, which were then processed using algorithms like K-nearest neighbors (KNNs), linear discriminant analysis (LDA), and decision trees (DTs). Their method achieved over 99% accuracy on a three-zone sensor for both location and pressure intensity [46]. Additionally, WiPPCoP, a novel wireless parallel signal processing technique, was developed for tactile signal management in robotics and prosthetic applications. This method began by collecting a vast amount of pressure signal data through wireless pressure sensors, which could
be mounted on robot hands or other devices requiring pressure sensing [Figure 1b]. Based
on pre-processed data, a CNN model was constructed to automatically learn the feature
representation of pressure signals, facilitating classification or regression predictions of the
signals [Figure 1c]. Regression predictions were used to forecast the continuous output of
pressure signals. When trained with 100 data points, the CNN model demonstrated a mean
squared error (MSE) and an error index of 0.12 and 0.09, respectively, indicating its appli-
cability to real-world pressure signal processing tasks [Figure 1d]. In practice, the model
could eliminate complex external wiring and monitor pressure at different locations in real-
time. For instance, a trained CNN model could monitor pressure levels on a robot’s hand,
aiding the robot in better task execution [37]. Further, Mehdi Ghommem et al. explored a
microelectromechanical system (MEMS) sensor for detecting pressure and temperature,
utilizing electrodes under a microbeam with direct and alternating voltage applications.
Their design considered ambient temperature effects on the microbeam and air pressure
impact on squeeze-film damping. A neural network trained on input data—comprising
the first three natural frequencies of an arch beam at various temperatures, quality factors,
and static deflection—enabled the detection of intertwined temperature and pressure out-
puts. Optimal temperature and pressure predictions, with RMSE values of 0.158 and 0.997,
respectively, were achieved using leaky ReLU as the activation function [47].
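The classification stage of the Rosset et al. approach can be sketched with off-the-shelf tools. The snippet below is a minimal illustration, not the authors' code: synthetic data stand in for the statistical impedance features, and the three classifier families named above (KNN, LDA, DT) are compared on zone classification.

```python
# Illustrative only: synthetic stand-ins for statistical impedance features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 300
zone = rng.integers(0, 3, n)                       # which of three sensor zones is pressed
X = rng.normal(size=(n, 4)) + zone[:, None] * 2.0  # toy features that shift with the zone

X_tr, X_te, y_tr, y_te = train_test_split(X, zone, random_state=0)
results = {}
for clf in (KNeighborsClassifier(n_neighbors=5),
            LinearDiscriminantAnalysis(),
            DecisionTreeClassifier(random_state=0)):
    results[type(clf).__name__] = clf.fit(X_tr, y_tr).score(X_te, y_te)
print(results)
```

On such well-separated toy data all three families score highly; the original work's contribution lies in the feature extraction from impedance sweeps, which is only mimicked here.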
Furthermore, ML/DL algorithms can enhance the limit of detection (LOD) of sensors. Taking hydrogen concentration sensors as an example, experiments were conducted with six different metal channels (Au, Cu, Mo, Ni, Pt, Pd) for H2 sensing. By employing Deep
Neural Networks (DNNs) and Gated Recurrent Units (GRUs) to train on the real-time noise
signals of chemical sensors, a hidden relationship between hydrogen concentration and
signal noise was established. This significantly improved the accuracy of gas sensors in
detecting low concentrations of hydrogen [48].
Beyond electronic signal sensors, ML/DL algorithms are widely applied in fiber Bragg
grating sensors for improving key parameters such as range, signal-to-noise ratio, and ac-
curacy. When external pressure affects these sensors, the phase birefringence in the optical
path changes, causing wavelength shifts in the interference spectrum. These shifts, encap-
sulating pressure variations, are characterized by tracking wavelength changes against
pressure. A long short-term memory (LSTM) neural network model has been applied to
convert recorded raw spectra into one- or two-dimensional data, enabling accurate pressure
prediction. Experiments demonstrate the LSTM model’s superior accuracy over traditional
machine learning methods, with a root-mean-square error (RMSE) of only 4.4 kPa within a
0–5 MPa range, thus allowing for precise fiber optic sensor measurements [49]. Similarly, a
high spatial resolution flexible optical pressure sensor has been designed, where surface
pressure affects the absorption and transmittance of reflected light between shielding and
sensing layers, altering RGB values in corresponding images. Convolutional neural net-
work (CNN) algorithms extract features from images to determine the force’s magnitude
and location applied to the sensor, achieving an RMSE of about 0.1 mm for positioning and
0.67 N for normal force [50].
Fiber optic sensors, sensitive to both strain and temperature, face challenges with
cross-sensitivity, making it difficult to distinguish between strain and temperature from
single Bragg wavelength shifts. To address this issue, Sanjib Sarkar employed a multi-
target supervised ensemble regression algorithm from machine learning to simultaneously
predict strain and temperature. By learning the relationship between the reflected spec-
trum and its corresponding temperature and strain, the Boosting ensemble estimator
effectively separated temperature from strain. The study compared two averaging en-
semble methods—random forest regression (RFR) and Bagging regression (BR)—with two
boosting ensemble methods—gradient-boosting regression (GBR) and adaptive boosting
regression (ABR), finding GBR to perform the best, with post-separation errors for temper-
ature and strain within 10% of actual values [51,52]. The Extreme Learning Machine (ELM)
was also applied to quickly and accurately determine strain and temperature from fiber
optic sensors, which exhibit central wavelength shifts due to changes in strain, temperature,
grating period, and refractive index. Using ELM to analyze the spectrum alongside temper-
Sensors 2024, 24, 2958 5 of 28
ature and strain data from two sensors facilitated the discernment of their interrelationships.
When compared with centroid, Gaussian polynomial fit, and back propagation algorithms,
ELM demonstrated superior precision (RMSE = 0.0906) and response time (t = 0.325) [53].
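Because an ELM has a closed-form training step (a random, fixed hidden layer followed by least-squares output weights), it can be sketched in a few lines of NumPy. The data below are toy stand-ins for spectral features and two coupled targets, not real FBG spectra.

```python
import numpy as np

def elm_fit(X, Y, n_hidden=50, seed=0):
    """Extreme Learning Machine: random hidden layer, least-squares output."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # fixed random nonlinear features
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(400, 8))             # toy spectral features
Y = np.c_[X[:, 0] + 0.5 * X[:, 1], X[:, 2] ** 2]  # two coupled toy targets
W, b, beta = elm_fit(X[:300], Y[:300])
rmse = float(np.sqrt(np.mean((elm_predict(X[300:], W, b, beta) - Y[300:]) ** 2)))
print(round(rmse, 3))
```

The absence of iterative backpropagation is what gives ELM the fast training and response times reported above.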
Distributed Acoustic Sensing (DAS) technology senses sound or vibration by measur-
ing phase changes of light transmitted through a fiber optic. For this sensing technique,
X. Dong et al. introduced a novel denoising method based on CNN, termed L-FM-CNN, for
processing random and coherent noise in distributed fiber optic acoustic-sensing Vertical
Seismic Profile (VSP) data. This method combines leaky rectifier linear unit activation func-
tions, forward modeling, and energy ratio matrix (ERM) to enhance the signal-to-noise ratio
(SNR). Experimental results showed an SNR improvement of over 10 dB using L-FM-CNN
compared to methods like DnCNNs [54].
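For reference, the reported gain is a ratio of powers expressed in decibels. A minimal sketch of how such an SNR improvement is computed, using a simulated trace and a simulated denoiser output rather than the L-FM-CNN itself:

```python
import numpy as np

def snr_db(signal, observed):
    """SNR in decibels: 10*log10(signal power / residual-noise power)."""
    noise = observed - signal
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
clean = np.sin(2 * np.pi * 30 * t)                  # stand-in seismic trace
noisy = clean + rng.normal(0, 0.5, t.size)          # raw noisy record
denoised = clean + rng.normal(0, 0.15, t.size)      # simulated denoiser output
gain = snr_db(clean, denoised) - snr_db(clean, noisy)
print(f"SNR improvement ~ {gain:.1f} dB")
```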
Instrumental variation poses significant challenges in the sensor field due to differences
in sensor and device manufacturing that result in varied responses to identical signal
sources and time-varying drift, characterized by changes in sensor attributes, operational
conditions, or the signal source over time. Models trained on data from an earlier period
are not suitable for new devices or data from later periods due to these variations. To
overcome these challenges, Ke Yan introduced Maximum Independent Domain Adaptation
(MIDA) and a semi-supervised version of MIDA. These methods address instrumental
differences and time-varying drift by treating them as discrete and continuous distribution
changes in the feature space and then learning a subspace that maximizes independence
from the domain features, thereby reducing the discrepancy in distributions across domains.
The effectiveness of the proposed algorithms is demonstrated through experiments on
synthetic datasets and four real-world datasets related to sensor measurement, significantly
enhancing the practicability of sensor systems [55].
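MIDA itself learns a subspace with kernel methods; as a much-simplified stand-in, the sketch below aligns domains by per-device standardization and shows how removing an instrument offset restores cross-device classifier accuracy. The data, the offset vector, and the two-device setup are all synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def align_domains(X, domain):
    """Standardize features within each domain -- a crude stand-in for MIDA."""
    Xa = np.empty_like(X, dtype=float)
    for d in np.unique(domain):
        m = domain == d
        Xa[m] = (X[m] - X[m].mean(axis=0)) / (X[m].std(axis=0) + 1e-9)
    return Xa

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 400)
X = rng.normal(size=(400, 3)) + y[:, None]          # class signal
domain = (np.arange(400) >= 200).astype(int)        # device 0 vs. device 1
X[domain == 1] += np.array([5.0, -3.0, 2.0])        # synthetic instrument offset

raw_acc = LogisticRegression().fit(X[:200], y[:200]).score(X[200:], y[200:])
Xa = align_domains(X, domain)
adj_acc = LogisticRegression().fit(Xa[:200], y[:200]).score(Xa[200:], y[200:])
print(raw_acc, adj_acc)
```

A model trained on device 0 collapses on device 1 until the domain shift is removed, which is the failure mode described above.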
In summary, incorporating AI methods into the design process of sensors can stream-
line design time, reduce computational costs, and minimize iterations, facilitating the rapid
development of configurations that meet specific environmental or functional requirements.
Moreover, integrating ML/DL algorithms into the signal-processing phase significantly
improves critical parameters. Yet, AI’s role in sensor design faces challenges, including the
extensive training necessary for AI algorithms to facilitate design. Moreover, previously
trained models risk becoming outdated due to the algorithms’ inability to interpret the
complex interplay of multi-field responses of devices, rendering them incapable of antici-
pating performance changes over time, such as aging. This underscores the limitations in
the universality of AI-driven sensor design.
accuracy of ±2.5% (FS) across a temperature range of −50 °C to 150 °C [56]. Similarly, the MLP algorithm has been utilized for calibration assistance, accurately estimating pressure with an error margin of ±1% (FS) within the same temperature range [57].
In the calibration of sensors, artificial intelligence algorithms not only reduce signal
drift caused by environmental factors but also decrease the workload associated with
calibrating nonlinear sensor response curves. For example, for nonlinear temperature
sensors, José Rivera developed an automatic calibration method based on ANN. The study
analyzed various network topologies like MLP and radial basis function (RBF), along with
training algorithms such as backpropagation, the conjugate gradient algorithm, and the
Levenberg–Marquardt algorithm. They found these methods offer superior overall accuracy
compared to piecewise linearization and polynomial linearization methods, enabling
intelligent sensors to be calibrated more quickly and accurately, addressing issues like
offset, gain changes, and non-linearity. With five calibration points, the error rate was 0.17%,
and the calibration time was reduced to 3523 ms for five to eight calibration points [58].
For pressure sensors, an ELM-based method was applied, utilizing ELM’s capability to
approximate any nonlinear function, calibrating system errors caused by temperature and
voltage fluctuations. The ELM showed optimal performance in both calibration accuracy
and speed, with an RMSE of 0.546 and a calibration time of 1.3 s [59]. Expanding to broader
sensor calibration types, Alessandro Depari et al. introduced a two-stage method based on
the Adaptive Network-based Fuzzy Inference System (ANFIS), requiring fewer calibration
points and lower computational power during the recalibration phase. The first stage
involves preliminary calibration of the sensor system under standard conditions using a
large number of calibration points to train the ANFIS. The second stage, requiring relatively
fewer points and parameter adjustments through gradient descent, facilitates recalibration,
reducing computational effort and enabling online recalibration. This method, applied in
a pyroelectric biaxial positioning system, achieves a resolution of 20 µm across the entire
7 mm × 7 mm detectable area [60].
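The calibration idea common to these works, learning the inverse of a nonlinear response curve from calibration points, can be sketched as follows. The thermistor-like response and its constants are hypothetical, and a small MLP stands in for the networks discussed above.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
T = rng.uniform(0, 100, 600)                        # true temperature, degrees C
# hypothetical thermistor-like nonlinear response with measurement noise
V = 2.0 * np.exp(-0.02 * T) + 0.05 + rng.normal(0, 0.002, T.size)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=5000, random_state=0))
model.fit(V[:500, None], T[:500])                   # learn the inverse curve
err = float(np.abs(model.predict(V[500:, None]) - T[500:]).mean())
print(f"mean calibration error ~ {err:.2f} degrees C")
```

Once trained, the network replaces piecewise or polynomial linearization: a raw voltage goes in, a corrected temperature comes out.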
fluctuations due to temperature changes. This approach involved modeling the sensor’s oper-
ational range, creating an inverse model by connecting the sensor to an ANN, and training the
ANN to adapt to temperature shifts, ultimately reducing uncompensated temperature error by
about 98% [62]. Further, ANNs employing conventional and inverse delayed function neuron
models significantly reduced temperature drift errors in high-resolution sensors [63]. For
extreme conditions, an MLP-based model yielded automatic compensation with an accuracy
within ±0.5% across a broad temperature range. Similar methods enhanced fiber optic sen-
sors’ accuracy to over 95% by compensating for temperature-related expansion and bending
losses [64]. Guanwu Zhou used 88 sets of temperature and pressure combinations as learning
samples (training set to validation set ratio of 2:1) to compare the calibration performance
of various methods, including VCR, RPA, BP, SVM, RBF, and ELM. The results indicated
that, compared with other algorithms, ELM exhibited superior generalization capabilities
and faster learning speeds even with a smaller calibration sample size. ELM achieved higher
accuracy (0.23%) and was capable of calibrating pressures ranging from 0 to 20 MPa within a
temperature span of −40 °C to 85 °C [61].
Aside from temperature-related inaccuracies, sensor errors can stem from various
factors like noise from power supply or semiconductor signal interference, fixed offsets
due to manufacturing flaws, temperature shifts or other environmental influences, and
drifts where the sensor output-to-input ratio changes over time. These combined factors
can gradually diminish the accuracy of MEMS-based Inertial Navigation Systems. Hua
Chen devised a CNN-based approach to mitigate these disturbances in inertial sensor
signals. This method processes Inertial Measurement Unit (IMU) raw data, including errors,
through CNNs that segment data into smaller time units for error identification, achieving
an 80% accuracy in distinguishing accelerometer and gyroscope signals compared to
traditional static and rate tests [65]. Furthermore, an automatic compensation method using
FLANN addresses changes in the pressure sensor environment, manufacturing parameter
shifts, and aging effects, maintaining maximum error within ±2% [44]. In gas sensors,
aging (e.g., surface reorganization over time) and poisoning (e.g., irreversible binding
from contamination) also pose challenges due to physical and chemical reactions between
chemical analytes and the sensor film in the gas phase. Alexander Vergara proposed an
ensemble method using a weighted combination of classifiers trained at different times
with Support Vector Machines (SVMs) to mitigate such effects. This approach updates
classifier weights based on their current batch performance, allowing for drift identification
and compensation and enhancing gas recognition accuracy post-drift to 91.84% [40].
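A compact sketch of such a weighted ensemble, with synthetic drifting batches standing in for real gas-sensor data: one SVM is trained per historical batch, weights are set from each classifier's accuracy on the newest labelled batch, and predictions are combined by weighted vote.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_batch(n, drift):
    """Synthetic gas data whose feature means slide over time (drift)."""
    y = rng.integers(0, 2, n)
    X = rng.normal(size=(n, 4)) + y[:, None] * 3.0 + drift
    return X, y

# one SVM per historical batch, each recorded at a different drift state
batches = [make_batch(200, d) for d in (0.0, 0.4, 0.8)]
models = [SVC().fit(X, y) for X, y in batches]

# weight each classifier by its accuracy on the newest labelled batch
X_new, y_new = make_batch(200, 1.0)
w = np.array([m.score(X_new, y_new) for m in models])
w /= w.sum()

# combine weighted votes on current drifted data
X_test, y_test = make_batch(200, 1.0)
votes = sum(wi * m.predict(X_test) for wi, m in zip(w, models))
acc = float(np.mean((votes > 0.5) == y_test))
print(round(acc, 3))
```

Re-weighting shifts influence toward classifiers trained under conditions closest to the current drift state, which is the mechanism behind the reported post-drift recovery.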
Additionally, specific scenarios, like uneven pressure distribution and insufficient
curvature adaptation in robotic arm applications, can cause sensor drift. Dong-Eon Kim
established lookup tables to linearize outputs from resistive barometric sensors based on
cubic weight, employing linear regression, lookup methods, and supervised learning with
known object weights as training data to ensure stable grip force measurement [66].
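The lookup/regression step described above can be sketched as follows; the logarithmic sensor response and its constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
weight = np.linspace(0, 500, 40)                    # known calibration weights, g
# hypothetical nonlinear raw output of a resistive barometric sensor
raw = 120 * np.log1p(0.01 * weight) + rng.normal(0, 0.5, weight.size)

# lookup table from calibration points; new readings are linearized by
# interpolating back to the known weights
order = np.argsort(raw)
lut_raw, lut_w = raw[order], weight[order]

def linearize(reading):
    return float(np.interp(reading, lut_raw, lut_w))

est = linearize(120 * np.log1p(0.01 * 250.0))       # noise-free probe reading
print(round(est, 1))
```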
In acoustical signal processing scenarios, voice enhancement serves as a specific form
of sensor signal compensation, addressing the issue of consonant phoneme loss due to
high-frequency attenuation in traditional throat microphones. Shenghan Gao and his
team developed a flexible vibration sensor based on non-contact electromagnetic coupling
[Figure 2g] for capturing vocal fold vibration signals [Figure 2h]. They utilized short-time
Fourier transform (STFT) to decompose speech into amplitude and phase, employing four
neural network models—fully connected neural network (FCNN), long short-term memory
(LSTM), bidirectional long short-term memory (BLSTM), and convolutional-recurrent neu-
ral network (CRNN)—to extract and enhance speech data features [Figure 2i]. Experimental
results indicated that BLSTM performed best in improving speech quality but was the least
favorable for hardware deployment, boosting short-time objective intelligibility (STOI)
from 0.18 to nearly 0.80 [67].
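The STFT front end of this pipeline splits speech into the magnitude and phase streams that the networks operate on. A minimal NumPy sketch (frame, window, FFT) with a synthetic signal in place of real throat-microphone data:

```python
import numpy as np

def stft(x, frame=256, hop=128):
    """Short-time Fourier transform: windowed frames -> complex spectra."""
    win = np.hanning(frame)
    n = 1 + (len(x) - frame) // hop
    frames = np.stack([x[i * hop : i * hop + frame] * win for i in range(n)])
    return np.fft.rfft(frames, axis=1)

fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 2000 * t)  # stand-in voice

S = stft(x)
mag, phase = np.abs(S), np.angle(S)       # the two streams a network would see
# an enhancement model would modify `mag`; spectra are then recombined:
S_rebuilt = mag * np.exp(1j * phase)
print(S.shape)
```

In the pipeline above, the network enhances the magnitude spectrogram while the phase is carried through unchanged before inverse transformation back to a waveform.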
Overall, AI algorithms can reduce errors caused by environmental changes, voltage fluctuations, and other factors during sensor calibration, or automatically compensate for such errors and for device aging during sensor operation.
Figure 2. Application of ML/DL algorithms in calibration and compensation. (a–f) Process of utilizing an ML algorithm for compensating sensor thermal drift: (a) signal conditioning circuit for data normalization. (b) Calculation of pressure error prior to temperature compensation. (c) Configuration of the SLFN trained with testing data. (d) MCU with digital thermometer and controller chip for storing SLFN weights and biases. (e) Pressure error following temperature compensation. (f) Interface circuit for digital signal output. Adapted and reproduced with permission. Copyright 2014, Multidisciplinary Digital Publishing Institute [61]. (g–i) Process of using a DL algorithm for speech enhancement: (g) flexible vibration sensor placed on a volunteer's neck. (h) Cross-sectional view of the vibration sensor. (i) Neural network-based algorithm flow for ambient noise reduction, including signal collection, speech decomposition, feature extraction and enhancement. Reproduced with permission. Copyright 2022, American Institute of Physics [67].
AI-assisted recognition makes it possible to reduce decision-making time, increase accuracy, lower the cost of
manual identification, and minimize environmental interference for more precise feature
extraction. The complexity of the application scenarios dictates the sensor information
requirements; for instance, voice recognition can be achieved solely through vibration
signals, whereas motion recognition often necessitates a combination of signals from visual
and pressure sensors.
contact shapes and is highly efficient in computation (total classification time for local
contact shapes = 576 µs) [77].
Joint or muscle movements often induce subtle yet distinct strains on the skin, making
strain sensors highly sensitive to variations in motion. For example, knee joint movements
can be detected using a wearable system based on strain sensors. Neural networks and
RF algorithms used to analyze knee joint angles during walking and static bending tasks
show mean absolute errors (MAEs) of 1.94 degrees and 3.02 degrees, respectively, with a
coefficient of determination (R²) of 0.97. This method proves more accurate than traditional
linear approaches, improving precision by about 6 degrees [96]. Finger joint movements
can be recognized in real-time by integrating carbon nanotube (CNT)-based resistive strain
sensors into a textile glove [Figure 3a]. The resistance changes in the CNTs/TPE coating
due to strain are converted into electrical signals, then analyzed and learned using CNNs
[Figure 3b] to identify different hand gestures or joint motion patterns [Figure 3c], achieving
a 99.167% recognition accuracy for precise VR/AR control, including shooting, baseball
pitching, and flower arranging [27]. Furthermore, gesture recognition extended to sign
language interpretation using algorithms like SVM boasts up to 98.63% accuracy and less
than one second recognition time [97].
Recognizing joint or muscle movements typically involves categorizing by electrical
signal strength or phase differences. Due to the frequent changes in joint or muscle movement,
monitoring data through bioelectrical and triboelectric signals is a common method, where
machine learning significantly improves recognition accuracy [42]. Bioelectric sensors, which
convert biological reactions into measurable electrical signals, facilitate the recognition of
human gestures in handball games. Various gestures, which trigger muscle group signals, are
captured by the Myo armband gesture control. Data from eight bioelectrical potential channels
for each gesture by two players are collected, creating a dataset that, after preprocessing and
feature extraction, is trained using an SVM model to distinguish five different gestures,
achieving recognition accuracies of 92% and 84% for the two players [98].
Posture recognition in human behavior is a common application [99–102], significantly
relevant for monitoring systems and security analysis. AI-assisted sensors can provide
real-time posture alerts [103], reducing the need for manual care. For instance, the LifeChair
smart cushion, incorporating pressure sensors, a smartphone app interface, and machine
learning, offers real-time posture recognition and guidance. The posture dataset comprises
user BMI, timestamps, raw sensor values, and posture labels, with the RF algorithm
learning the mappings between raw sensor values and postures for recognition. It achieves
high recognition accuracy, up to 98.93% [104], for over thirteen different sitting postures.
Additionally, human posture inclination can be identified by combining flexible pressure
sensors and neural networks. Initially, large-area flexible pressure sensors collect data from
the human back [Figure 3d]; these pressure data are then input into a pre-trained neural
network [Figure 3e] that determines the body’s inclination based on the input pressure data
[Figure 3f], with recognition accuracies ranging from 0.94 to 0.98 for five postures [105].
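A minimal sketch of pressure-based posture classification with a random forest, in the spirit of the LifeChair pipeline; the eight-sensor layout and the posture model below are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def sample(posture, n):
    """Hypothetical 8-sensor cushion: load shifts left/right with posture."""
    base = np.ones(8)
    base[:4] += 1.5 * (posture == 1)                # lean left loads left sensors
    base[4:] += 1.5 * (posture == 2)                # lean right loads right sensors
    return rng.normal(base, 0.4, size=(n, 8))

X = np.vstack([sample(p, 100) for p in range(3)])   # 0=upright, 1=left, 2=right
y = np.repeat([0, 1, 2], 100)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
acc = clf.fit(X[::2], y[::2]).score(X[1::2], y[1::2])
print(round(acc, 3))
```

The real systems add user metadata (e.g., BMI and timestamps) to the raw sensor values, but the mapping-learning step is the same.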
To enhance the accuracy of sitting posture recognition, Jongryun Roh compared the
efficacy of multiple algorithms within a low-complexity posture monitoring system that
employs four pressure sensors mounted on a seat. These sensors collect data related to
user weight and the orientation of the sitting posture, both front-to-back and side-to-side.
Various machine learning algorithms, including SVM with RBF kernel, LDA, QDA, NB,
and a random forest classifier, were applied to classify six sitting postures using 84 posture
samples. The decision tree showed the lowest accuracy at 76.79%, while the SVM with RBF
kernel achieved the highest at 97.2% [93]. In addition to accuracy, model training time is a
critical metric for sensor recognition algorithms. Aurora Polo Rodríguez proposed a method
using pressure mat sensors to classify human postures in bed. They transformed raw
pressure data into grayscale visualizations for analysis, collected 232 samples, and utilized
data augmentation techniques to expand the dataset by generating synthetic sleeping
postures. By comparing two CNN models with different numbers of convolutional layers
and stages of dropout layer usage, both models reached accuracies of 98%, with the model
having fewer convolutional layers requiring only two-thirds the training time of the more
complex model [103].
Sensors 2024, 24, 2958 12 of 28
Beyond activity, human rest also requires monitoring and feedback, as analyzing sleep
behavior can improve sleep issues. Carter Green et al. developed a TCN model trained
with data from an under-bed pressure sensor array to recognize and classify sleep postures
and states. Information related to sleep, such as event types, start times, and durations,
was extracted from polysomnography (PSG) and pressure sensor mat (PSM) data. Features
extracted from PSM data, including the number of active sensors, the sum of weighted
sensor values, lateral center pressure, lateral variance, and longitudinal center pressure,
served as inputs for the TCN, with body position (supine, prone, left side, right side) and a
Boolean value of sleep state as outputs. With data augmentation, a classification accuracy of
0.998 was reported [106]. This tool, as an economical and effective sleep assessment method,
holds great potential, simultaneously reducing patient burden and professional workload.
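The PSM features listed above can be computed directly from a single pressure frame. The sketch below assumes a simple 2-D pressure map and illustrative definitions; the exact formulations in [106] may differ:

```python
import numpy as np

def psm_features(frame, threshold=0.1):
    """Features named for the pressure sensor mat (PSM) data: active-sensor
    count, weighted sensor sum, lateral/longitudinal centre of pressure,
    and lateral variance.  `frame` is a 2-D pressure map
    (rows = longitudinal axis, cols = lateral axis)."""
    active = frame > threshold
    total = frame.sum() + 1e-9               # avoid division by zero
    rows, cols = np.indices(frame.shape)
    lat_cop = (frame * cols).sum() / total    # lateral centre of pressure
    lon_cop = (frame * rows).sum() / total    # longitudinal centre of pressure
    lat_var = (frame * (cols - lat_cop) ** 2).sum() / total
    return np.array([active.sum(), frame.sum(), lat_cop, lat_var, lon_cop])

frame = np.zeros((40, 20))
frame[10:20, 5:10] = 1.0                     # a body-shaped pressure patch
print(psm_features(frame))
```

Sequences of such feature vectors are what the temporal network consumes.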
Figure 3. Application of ML/DL algorithms in recognition and classification using unidimensional
data. (a–c) Illustration of gesture recognition through a DL algorithm. (a) Depiction of smart gloves
with embedded strain sensors for data acquisition. Reproduced with permission. Copyright 2020,
WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim, Germany [27]. (b) Diagram of a CNN model
refined using testing data. Adapted with permission. Copyright 2020, WILEY-VCH Verlag GmbH
& Co. KGaA, Weinheim, Germany [27]. (c) Classification of three distinct gestures based on strain
data. Adapted with permission. Copyright 2020, WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim,
Germany [27]. (d–f) Procedure of recognizing sitting posture inclination with a DL algorithm. (d) Display
of a strain sensing array on a seat backrest (left) and the corresponding data acquisition and visualization
system (right). Reproduced with permission. Copyright 2022, Institute of Physics [105]. (e) Outline
of a CNN framework adjusted with testing data. Reproduced with permission. Copyright 2022,
Institute of Physics [105]. (f) Visualization of pressure contours and their associated sitting posture
recognitions. Adapted with permission. Copyright 2022, Institute of Physics [105]. (g–j) Method of
speech recognition via an ML algorithm. (g) Vocal signal from an unidentified speaker. Adapted with
permission. Copyright 2018, ELSEVIER [115]. (h) Visuals of an f-PAS for vocal signal capture (left)
alongside a schematic of a f-PAS (right). Reproduced with permission. Copyright 2018, ELSEVIER [115].
(i) Conceptual diagram of a GMM refined with testing data. (j) Speaker search and identification
within a database. Adapted with permission. Copyright 2018, ELSEVIER [115].
with a 97.13% accuracy rate [Figure 4c], demonstrating the capability to differentiate joint
positions and states with the assistance of machine learning algorithms [121].
Beyond studying large-scale human movements, some research also focuses on subtle
activities such as swallowing and breathing. For instance, Beril Polat used epidermal
graphene sensors to measure strain and sEMG signals, employing machine
learning to estimate the volume of swallowed water and distinguish between actual swal-
lowing actions and motion artifacts. Using SVM algorithms, the cumulative volume of
swallowed water from 5 to 30 mL was estimated, with an accuracy rate exceeding 92% [122].
Ke Yan et al. explored feature selection in gas detection to assist in diabetes screening by
analyzing gas components in breath samples. Initially, gases were collected using carbon
dioxide, temperature–humidity, and metal oxide semiconductor sensors in an electronic
nose. They optimized feature selection with Support Vector Machine Recursive Feature
Elimination (SVM-RFE) and Correlation Bias Reduction (CBR), effectively
distinguishing between healthy subjects and diabetes patients by VOC concentrations.
This methodology achieved diabetes detection accuracies of 90.37%, enhanced to 91.67%
with CBR, and reached a peak accuracy of 95% when combining nonlinear SVM-RFE with
advanced strategies [34].
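The core loop of SVM-RFE — train, rank features by squared weight, eliminate the weakest — can be sketched as follows. A least-squares fit stands in for the linear SVM here so the example stays dependency-free; the data are synthetic:

```python
import numpy as np

def rfe_rank(X, y, n_keep):
    """Recursive feature elimination in the spirit of SVM-RFE:
    repeatedly fit a linear model and drop the feature with the
    smallest squared weight until n_keep features remain."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        w, *_ = np.linalg.lstsq(X[:, remaining], y, rcond=None)
        remaining.pop(int(np.argmin(w ** 2)))   # eliminate weakest feature
    return remaining

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 6))
y = 3.0 * X[:, 2] - 2.0 * X[:, 4] + 0.1 * rng.standard_normal(60)
print(sorted(rfe_rank(X, y, n_keep=2)))  # the informative features 2 and 4 survive
```

CBR would additionally correct for correlated sensor features before ranking; that refinement is omitted here.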
Figure 4. Application of ML/DL algorithms in recognition and classification using multi-dimensional
data. (a–c) Process of recognizing joint movement states with a DL algorithm. (a) Depiction of
seamless multimodal sensors designed for pressure and strain data gathering. Adapted with permission.
Copyright 2022, Nature Publishing Group [121]. (b) Schematic of an LSTM network refined with
testing data. Adapted with permission. Copyright 2022, Nature Publishing Group [121]. (c) Recognition
of six joint movements based on pressure and strain measurements. Adapted with permission.
Copyright 2022, Nature Publishing Group [121]. (d–f) Demonstration of object recognition via a
DL algorithm. (d) Optical image of the 5 × 5-pixel BOSSA sensor array for acquiring pressure and
material data. Reproduced with permission. Copyright 2022, American Chemical Society [126].
(e) Structure of an MLP model optimized with testing data. Adapted with permission. Copyright
2022, American Chemical Society [126]. (f) Identification of objects using pressure and material
information. Adapted with permission. Copyright 2022, American Chemical Society [126].
Beyond texture recognition, multidimensional data analysis plays a crucial role in
robotics research, particularly in differentiating the deformation states of soft actuators,
which is crucial for robot control. Two hydrogel sensors detect temperature and various
mechanical deformations of soft actuators, such as lateral strain, torsion, and bending,
utilizing a data-driven machine learning approach. A machine learning model combining
1D-CNN layers with a feed-forward neural network (FNN) decodes sensor signals to
identify five states of a soft finger (free bending, bending on contact with room-temperature
objects, bending on contact with high-temperature objects, twisting, and stretching),
achieving an accuracy of approximately 86.3% [127].

Electronic skin, a significant component in robotics, benefits from multi-signal data
fusion. Kee-Sun Soh developed macroscopic electronic skin using a single-layer piezoresistive
MWCNT-PDMS composite film equipped with strain and location sensors. A DNN
processes resistance changes caused by applied pressure, assessing pressure levels and
locations in real-time with over 99% accuracy [128]. Additionally, tactile sensors are
employed in electronic skin for object recognition, involving a sensor on a robotic arm
touching various objects multiple times at different locations to gather shape information
through pressure, surface normals, and curvature measurements. Local features, invariant
to translation and rotation, are extracted via unsupervised learning with the k-means
algorithm. Object identification then proceeds with a dictionary lookup method, where a
dictionary of k words created by k-means facilitates object recognition through a histogram
codebook, comparing an object's histogram distribution to those in a database. This process,
requiring only ten touches, achieves a 91.2% accuracy [129].
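The dictionary-lookup scheme — k-means words, histogram codebook, database comparison — can be sketched on synthetic tactile features. The feature values and two-object database below are invented for illustration:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means; the k cluster centres act as the 'dictionary' of words."""
    centres = X[:: max(1, len(X) // k)][:k].copy()   # deterministic spread-out init
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centres) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return centres

def codebook_histogram(features, centres):
    """Encode one object's touch features as a normalized histogram of
    nearest dictionary words (a bag-of-words descriptor)."""
    words = np.argmin(((features[:, None] - centres) ** 2).sum(-1), axis=1)
    hist = np.bincount(words, minlength=len(centres)).astype(float)
    return hist / hist.sum()

# Invented local tactile features gathered from repeated touches of two objects.
rng = np.random.default_rng(2)
obj_a = rng.normal(0.0, 0.3, (100, 5))
obj_b = rng.normal(2.0, 0.3, (100, 5))
centres = kmeans(np.vstack([obj_a, obj_b]), k=8)
database = {"A": codebook_histogram(obj_a, centres),
            "B": codebook_histogram(obj_b, centres)}

query = codebook_histogram(rng.normal(0.0, 0.3, (10, 5)), centres)  # ten touches
match = min(database, key=lambda name: np.abs(database[name] - query).sum())
print(match)
```

Here recognition reduces to comparing histogram distributions, which is why a handful of touches suffices.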
Multi-dimensional sensor data (e.g., force, vibration) analysis also supports tool condition
monitoring, utilizing dimension reduction and support vector regression to measure
parameters like tool wear width. One study compared different
dimension reduction techniques, including kernel principal component analysis and locally
linear embedding, for their efficacy in virtual sensing. The KPCA-SVR model excelled,
showing superior performance with a Pearson Correlation Coefficient of 0.9843, RMSE
of 5.4428, MAE of 3.9583, and MAPE of 0.037, indicating its effectiveness in tool wear
detection [130].
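A minimal sketch of the kernel PCA step of such a pipeline is shown below, with ridge regression standing in for SVR so the example needs only numpy. The kernel width, component count, and data are illustrative, not those of [130]:

```python
import numpy as np

def kernel_pca_features(X, n_components, gamma=0.5):
    """Gaussian-kernel PCA: build the kernel matrix, double-centre it,
    and project onto its leading eigenvectors (eigenvalue-scaled)."""
    sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = len(X)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one   # double centring
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]  # leading components
    return Kc @ vecs[:, idx]

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 4))                 # stand-in sensor features
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(50)   # stand-in wear target

Z = kernel_pca_features(X, n_components=10)
# Ridge regression as a self-contained stand-in for SVR.
w = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(Z.shape[1]), Z.T @ (y - y.mean()))
rmse = np.sqrt(np.mean((Z @ w + y.mean() - y) ** 2))
print(round(float(rmse), 4))
```

The nonlinear kernel features are what let a linear regressor capture the nonlinear wear relationship.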
Moreover, large-structure wireless health monitoring can also be achieved through
multi-dimensional data analysis. By measuring vibration responses of a cantilever beam
with a piezoresistive surface acoustic wave (SAW) accelerometer, exploiting SAW’s mod-
ulation by stress/strain during propagation and measuring impedance changes with a
pressure sensor, researchers used continuous wavelet transform and Gabor functions for
time-frequency analysis of the beam’s free vibration. This allowed for decay coefficient
calculation and decay type classification (linear, exponential, or mixed) based on shape
changes over time and frequency. They applied three machine learning models, RF, SVM,
and LightGBM, to automatically learn decay coefficient features and patterns for damage
detection and severity assessment, achieving classification accuracies of 65.4%, 84.6%, and
88.5% on raw data, and 84.6%, 76.9%, and 76.9% on standardized data, respectively [131].
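The decay-type classification step can be approximated by comparing least-squares fits of linear and exponential models to a vibration envelope — a simplified stand-in for the wavelet-based shape features used in [131]:

```python
import numpy as np

def classify_decay(t, env):
    """Classify a free-vibration amplitude envelope as linear or
    exponential decay by comparing least-squares fit errors, and
    return the estimated decay coefficient."""
    A = np.vstack([t, np.ones_like(t)]).T
    c_lin = np.linalg.lstsq(A, env, rcond=None)[0]          # env ~ b - coeff*t
    c_exp = np.linalg.lstsq(A, np.log(env), rcond=None)[0]  # env ~ exp(-coeff*t)
    sse_lin = ((env - A @ c_lin) ** 2).sum()
    sse_exp = ((env - np.exp(A @ c_exp)) ** 2).sum()
    if sse_exp < sse_lin:
        return "exponential", -c_exp[0]
    return "linear", -c_lin[0]

t = np.linspace(0.0, 5.0, 200)
print(classify_decay(t, np.exp(-0.8 * t)))   # ('exponential', ~0.8)
print(classify_decay(t, 1.0 - 0.15 * t))     # ('linear', ~0.15)
```

In the cited study, such decay descriptors (rather than raw signals) are what the RF, SVM, and LightGBM classifiers learn from.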
ML/DL-based multi-dimensional data analysis has also been applied to monitor the
state-of-charge (SOC) of batteries. Bruno Rente and colleagues developed a SOC estimation
method for lithium-ion batteries using FBG sensors and machine learning. FBG sensors
monitor the strain and temperature changes during battery usage, indicators closely related
to the battery’s internal chemical reactions. Dynamic Time Warping (DTW) aligns and
standardizes the strain data, which are then classified with a nearest-neighbor method to
estimate the battery’s SOC within an error margin of 2% [132].
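The DTW-plus-nearest-neighbor idea can be sketched as follows; the strain "templates" and SOC labels are invented for illustration:

```python
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical FBG strain templates for known SOC levels (%).
templates = {20: np.array([0.0, 0.1, 0.3, 0.4]),
             80: np.array([0.0, 0.4, 0.8, 1.0])}

def estimate_soc(strain_curve):
    """Nearest-neighbour SOC estimate under the DTW distance."""
    return min(templates, key=lambda soc: dtw(strain_curve, templates[soc]))

print(estimate_soc(np.array([0.0, 0.05, 0.32, 0.41])))  # -> 20
```

DTW makes the comparison robust to curves being stretched or shifted in time, which plain Euclidean matching is not.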
In summary, the application of artificial intelligence in recognition and classification
enhances accuracy, reduces errors caused by environmental factors, and maintains high
response speeds even with complex tasks. However, ML/DL models require substantial
amounts of training data, and the limited samples available from sensor data may lead to
model overfitting. Additionally, the scarcity of samples complicates the determination of
the most suitable model structure, such as the optimal number of layers and parameters.
5. Behavior Prediction
Predicting future behavior from data collected by sensors is a crucial application of
artificial intelligence in sensing technology. Combining behavior prediction with warning
systems can significantly reduce the likelihood of accidents.
In the healthcare and caregiving sectors, timely prediction of patients’ risky behaviors
can substantially decrease the chance of injuries and reduce caregiving costs. For patients
with severe injuries requiring bed rest, predicting when they might leave the bed becomes
crucial. A novel approach utilizing a deep learning model with an 80 × 40 sensor array in
bed sheets was developed to monitor sleep posture changes and predict bed-exit behaviors.
This method involves collecting sleep pressure data using thin pressure-sensitive sensors
and analyzing it with CNNs or Auto Encoders (AEs) to identify sleep postures. The
relationship between various sleeping positions and wake-up times was examined to
determine which postures predict waking up. With this information, caregivers can take
preventive actions, such as providing support or preventing falls before a patient leaves
the bed. The prediction accuracy for CNNs reached 92%, while AEs achieved 88% [133].
Beyond patients with injuries, even those partially recovered need continuous moni-
toring of their condition to avoid actions that might hinder their rehabilitation. AI-assisted
wearable sensor devices can predict and warn against such hazardous behaviors during
daily activities. Hongcheng Xu and colleagues developed a stretchable iontronic pressure
sensor (SIPS) that senses pressure through changes in the electrochemical double layer
(EDL) and electrode contact area [Figure 5a], combined with a fully convolutional network
(FCN) algorithm to learn from the collected data [Figure 5b]. This deep learning technique
accurately interprets and analyzes complex biophysical signal data from pressure sensors,
predicting knee positions from different pressure contours to assess rehabilitation progress
and prevent further injury [Figure 5c], with a prediction accuracy of up to 91.8% [18].
Figure 5. Application of ML/DL algorithms in behavior prediction. (a–c) Process of knee joint angle
prediction via a DL algorithm. (a) SIPS with processing circuit on a knee for pressure data collection.
Reproduced with permission. Copyright 2022, Nature Publishing Group [133]. (b) The structure of
FCN was refined with testing data. Adapted with permission. Copyright 2021, Nature Publishing
Group [133]. (c) Prediction of knee bending states through normalized stress distribution analysis.
Reproduced with permission. Copyright 2022, Nature Publishing Group [133]. (d–f) Process of
ankle angle prediction via an ML algorithm. (d) Pressure sensor system in insole (left) and schematic
overview (right). Adapted with permission. Copyright 2021, Multidisciplinary Digital Publishing
Institute [134]. (e) Conceptual diagram of KNN algorithm refined with testing data. (f) Ankle angle
predictions from pressure data. Adapted with permission. Copyright 2021, Multidisciplinary Digital
Publishing Institute [134].
Due to the convenience of installing plantar pressure sensors and the ease of data
extraction, ML/DL-based predictions are frequently used for foot impact force risk analysis
and fall risk prediction. For instance, wearable pressure insoles combined with multiple lin-
ear regression (MR) can predict the foot strike angle (FSA), considering factors like weight,
height, and age. This process involves collecting running pressure and dynamic data, such
as foot landing type and gait pattern, standardizing it, and training the most impactful fea-
tures on FSA, achieving a prediction accuracy above 90% [134]. Zachary Choffin developed
a method using shoe pressure sensors and machine learning to predict ankle angles. Their
system [Figure 5d], comprising six force-sensing resistors (FSRs), a microcontroller, and a
Bluetooth Low Energy (LE) chip, employs the KNN algorithm to compute the Euclidean
distance between training datasets and input data points, identifying the k-nearest data
points [Figure 5e]. This method, selecting the ten nearest neighbors, predicts discrete ankle
angles with over 93% accuracy during squats and over 87% during bends [Figure 5f] [135].
Additionally, shoe pressure sensors can predict fall risks by collecting dynamic walking
data from insoles embedded with wireless pressure sensors, analyzing gait and balance
data features, and using logistic regression with oversampling techniques, achieving a
high Area Under the Curve (AUC) of 0.88. Furthermore, training with the RF model and
oversampling yielded an accuracy of 0.81 and a specificity of 0.88 [136].
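The KNN step reported for the insole system — Euclidean distance to the training set, ten nearest neighbors, majority vote over discrete angle labels — can be sketched on synthetic FSR data (all readings and labels below are hypothetical):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=10):
    """k-nearest-neighbour classification by Euclidean distance: find
    the k closest training samples and take a majority vote."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# Hypothetical training data: six FSR readings -> discrete ankle angle.
rng = np.random.default_rng(4)
angles = np.array([0, 15, 30, 45])
X_train = np.vstack([a / 45.0 + 0.05 * rng.standard_normal((30, 6)) for a in angles])
y_train = np.repeat(angles, 30)

x = 30 / 45.0 + 0.05 * rng.standard_normal(6)   # a new six-sensor reading
print(knn_predict(X_train, y_train, x))          # -> 30
```

Because KNN stores the training set and defers all computation to query time, it suits a microcontroller-plus-host setup like the one described.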
In the industrial production field, for safety concerns, such as hazardous gas leaks,
the priority is to locate the gas source and address it promptly. Using a convolutional long
short-term memory network (CNN-LSTM) to learn from multiple gas sensor fluctuations
caused by different gas source locations can quickly identify the gas source under hazardous
conditions. This approach accounts for environmental factors like wind direction and speed
and changes in the gas source location over time. CNNs clean and extract features from
collected data, while LSTMs learn temporal characteristics, and the processed data are
input into a DNN to predict the gas source location with an accuracy of 93.9% [137].
Beyond predicting human behaviors, AI-assisted sensor systems are also used to
forecast the future states of general objects. For instance, a model combining CNN and
bidirectional long short-term memory networks (bidirectional LSTM) is applied to predict
actual tool wear. This model initially collects raw sensor data from the tool, including
acceleration and sound frequency, to serve as input. A one-dimensional CNN extracts
features from the raw input sequence, followed by a two-layer bidirectional LSTM that
encodes temporal patterns. On top of the LSTM output, two fully connected layers are
stacked to further extract advanced features. The output from these layers is fed into a
linear regression layer to predict the final tool wear depth, facilitating risk alerts or tool
replacement notifications. The model achieves an RMSE of less than 8.1% across different
datasets [138].
In robotic hand applications, size constraints often limit the manipulator. AI algo-
rithms, particularly CNNs, are employed to predict and delineate the shapes of objects
larger than the hand by identifying their contours and edges. This involves tactile sensors
performing contact experiments to slide over and map the object’s surface, gathering tactile
data. Deep CNNs then analyze these data, focusing on shear forces from tactile movement,
to accurately predict the position and angle of the object’s contours and edges, achieving
position accuracy within 3 mm and angle accuracy within 9 degrees [139].
In summary, integrating artificial intelligence with sensors for prediction enhances
the accuracy and real-time capabilities of forecasts, even in complex, strongly nonlinear
scenarios. However, these predictions are based on monitoring data rather than mechanistic
analysis models. Therefore, the accuracy and sensitivity of model predictions for unseen
data or scenarios are not guaranteed, posing significant demands on the generalization
abilities and robustness of ML/DL models.
Table 1. Cont.
Pressure sensor + material sensor | MLP | 98.9 | 1. Enhance classification accuracy; 2. Handle multi-source data for complex data | Sensor placement significantly
Author Contributions: Conceptualization, H.F.; methodology, H.F.; software, H.F.; validation, L.C.,
C.X. and Z.Z.; formal analysis, Z.Z.; investigation, L.C. and C.X.; resources, H.F.; data curation,
L.C. and C.X.; writing—original draft preparation, L.C. and C.X.; writing—review and editing, H.F.;
visualization, L.C., C.X. and Z.Z.; supervision, H.F.; project administration, H.F. and Y.C.; funding
acquisition, H.F. and Y.C. All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by the National Natural Science Foundation of China, grant
numbers [12272342, 11972325, 22004108, 12002311].
Data Availability Statement: Data sharing is not applicable.
Acknowledgments: The authors would like to express their appreciation to Binghui Ma, Bocheng
Zhang, Tao Zhou, and Yuke Li for their contribution to this work.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Spinelle, L.; Gerboles, M.; Kok, G.; Persijn, S.; Sauerwald, T. Review of portable and low-cost sensors for the ambient air
monitoring of benzene and other volatile organic compounds. Sensors 2017, 17, 1520. [CrossRef]
2. Chen, J.; Zhu, Y.; Chang, X.; Pan, D.; Song, G.; Guo, Z.; Naik, N. Recent Progress in Essential Functions of Soft Electronic Skin.
Adv. Funct. Mater. 2021, 31, 2104686. [CrossRef]
3. Gu, Y.; Zhang, T.; Chen, H.; Wang, F.; Pu, Y.; Gao, C.; Li, S. Mini Review on Flexible and Wearable Electronics for Monitoring
Human Health Information. Nanoscale Res. Lett. 2019, 14, 263–315. [CrossRef]
4. Luo, D.; Sun, H.; Li, Q.; Niu, X.; He, Y.; Liu, H. Flexible Sweat Sensors: From Films to Textiles. ACS Sens. 2023, 8, 465–481.
[CrossRef] [PubMed]
5. Gu, J.; Shen, Y.; Tian, S.; Xue, Z.; Meng, X. Recent Advances in Nanowire-Based Wearable Physical Sensors. Biosensors 2023,
13, 1025. [CrossRef]
6. Xue, Z.; Zhao, J. Bioelectric Interface Technologies in Cells and Organoids. Adv. Mater. Interfaces 2023, 10, 2300550. [CrossRef]
7. Bao, Y.; Chen, Z.; Wei, S.; Xu, Y.; Tang, Z.; Li, H. The State of the Art of Data Science and Engineering in Structural Health
Monitoring. Engineering 2019, 5, 234–242. [CrossRef]
8. Tang, Z.; Chen, Z.; Bao, Y.; Li, H. Convolutional Neural Network-based Data Anomaly Detection Method Using Multiple
Information for Structural Health Monitoring. Struct. Control Health Monit. 2019, 26, e2296.1–e2296.22. [CrossRef]
9. Bao, Z.; Mannsfeld, S.C.B.; Tee, B.C.-K.; Stoltenberg, R.M.; Chen, C.V.H.-H.; Barman, S.; Muir, B.V.O.; Sokolov, A.N.; Reese, C.
Highly sensitive flexible pressure sensors with microstructured rubber dielectric layers. Nat. Mater. 2010, 9, 859–864.
10. Bai, N.; Wang, L.; Wang, Q.; Deng, J.; Wang, Y.; Lu, P.; Huang, J.; Li, G.; Zhang, Y.; Yang, J.; et al. Graded intrafillable
architecture-based iontronic pressure sensor with ultra-broad-range high sensitivity. Nat. Commun. 2020, 11, 209. [CrossRef]
11. Xue, Z.; Jin, T.; Xu, S.; Bai, K.; He, Q.; Zhang, F.; Cheng, X.; Ji, Z.; Pang, W.; Shen, Z.; et al. Assembly of complex 3D structures and
electronics on curved surfaces. Sci. Adv. 2022, 8, 6922. [CrossRef] [PubMed]
12. Gao, H.; Li, J.; Wang, Z.; Xue, Z.; Meng, X. Design of Porous Partition Elastomer Substrates for the Island–Bridge Structures in
Stretchable Inorganic Electronics. ASME J. Appl. Mech. 2024, 91, 051005. [CrossRef]
13. Eka, N.; Cynthia, P.M.; Kaylee, M.C.; Ilhoon, J.; Charles, S.H. Electrochemical paper-based devices: Sensing approaches and
progress toward practical applications. Lab Chip 2020, 2, 9–34.
14. Nurlely, M.; Ahmad, L.; Ling, L. Potentiometric enzyme biosensor for rapid determination of formaldehyde based on succinimide-
functionalized polyacrylate ion-selective membrane. Meas. J. Int. Meas. Confed. 2021, 175, 109112. [CrossRef]
15. Zhang, J.; Liu, X.; Neri, G.; Pinna, N. Nanostructured Materials for Room-Temperature Gas Sensors. Adv. Mater. 2016, 28, 795–831.
[CrossRef] [PubMed]
16. Ho, D.H.; Choi, Y.Y.; Jo, S.B.; Myoung, J.M.; Cho, J.H. Sensing with MXenes: Progress and Prospects. Adv. Mater. 2021, 33, e2005846.
[CrossRef] [PubMed]
17. Ejeian, F.; Azadi, S.; Razmjou, A.; Orooji, Y.; Kottapalli, A.; Ebrahimi, W.M.; Asadnia, M. Design and Applications of MEMS Flow
Sensors: A Review. Sens. Actuators Phys. 2019, 295, 483–502. [CrossRef]
18. Xu, H.; Gao, L.; Zhao, H.; Huang, H.; Wang, Y.; Chen, G.; Qin, Y.; Zhao, N.; Xu, D.; Duan, L.; et al. Stretchable and Anti-
Impact Iontronic Pressure Sensor with an Ultrabroad Linear Range for Biophysical Monitoring and Deep Learning-Aided Knee
Rehabilitation. Microsyst. Nanoeng. 2021, 7, 92. [CrossRef]
19. Sarang, T.; Zheng, Z.; Henrik, R.; Jose, M.; Tuomo, M.; Sergey, N.; Toni, H.; Hiski, N.M.; Zahidul, H.B.; Ville, V.L.; et al. Sensors
and AI Techniques for Situational Awareness in Autonomous Ships: A Review. IEEE Trans. Intell. Transp. Syst. 2022, 23, 64–83.
20. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned
Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634. [CrossRef]
21. Guorui, L.; Xiangping, C.; Fanghao, Z.; Yiming, L.; Youhua, X.; Xunuo, C.; Zhen, Z.; Mingqi, Z.; Baosheng, W.; Shunyu, Y.; et al.
Self-powered soft robot in the Mariana Trench. Nature 2021, 591, 66–71.
22. Yuan, W.; Dong, S.; Adelson, E.H. GelSight: High-resolution robot tactile sensors for estimating geometry and force. Sensors 2017,
17, 2762. [CrossRef]
23. Yasser, K.; Aminy, E.; Claire, M.; Adrien, P.; Ana, C. Monitoring of Vital Signs with Flexible and Wearable Medical Devices. Adv.
Mater. 2016, 28, 4373–4395.
24. Song, H.; Luo, G.; Ji, Z.; Bo, R.; Xue, Z.; Yan, D.; Zhang, F.; Bai, K.; Liu, J.; Cheng, X.; et al. Highly-integrated, miniaturized,
stretchable electronic systems based on stacked multilayer network materials. Sci. Adv. 2022, 8, eabm3785. [CrossRef] [PubMed]
25. Ma, W.; Cheng, G.; Wu, Q. Construction on permafrost foundations: Lessons learned from the Qinghai–Tibet railroad. Cold Reg.
Sci. Technol. 2009, 59, 3–11.
26. Zhou, F.; Chai, Y. Near-sensor and in-sensor computing. Nat. Electron. 2020, 3, 664–671. [CrossRef]
27. Wen, F.; Sun, Z.; He, T.; Shi, Q.; Zhu, M.; Zhang, Z.; Li, L.; Zhang, T.; Lee, C. Machine Learning Glove Using Self-Powered
Conductive Superhydrophobic Triboelectric Textile for Gesture Recognition in VR/AR Applications. Adv. Sci. 2020, 7, 2000261.
[CrossRef]
28. Young, H.J.; Seong, K.H.; Hee, S.W.; Jae, H.H.; Trung, X.; Hyunsin, P.; Junyeong, K.; Sunghun, K.; Chang, D.Y.; Keon, J.L. Flexible
Piezoelectric Acoustic Sensors and Machine Learning for Speech Processing. Adv. Mater. 2020, 35, 1904020.
29. Muhammad, S.; Stephan, B.; Hans, S.; Paul, J.M.H.; Ozlem, D.I. Towards Detection of Bad Habits by Fusing Smartphone and
Smartwatch Sensors. In Proceedings of the 2015 IEEE International Conference on Pervasive Computing and Communication
Workshops (PerCom Workshops), St. Louis, MO, USA, 23–27 March 2015; pp. 591–596.
30. Chen, Z.; Li, W. Multisensor Feature Fusion for Bearing Fault Diagnosis Using Sparse Autoencoder and Deep Belief Network.
IEEE Trans. Instrum. Meas. 2017, 7, 1693–1702. [CrossRef]
31. Wei, L.; Wang, X.; Li, L.; Yu, L.; Liu, Z. A Low-Cost Tire Pressure Loss Detection Framework Using Machine Learning. IEEE Trans.
Ind. Electron. 2021, 12, 12730–12738. [CrossRef]
32. Fang, Y.; Zou, Y.; Xu, J.; Chen, G.; Zhou, Y.; Deng, W.; Zhao, X.; Roustaei, M.; Hsiai, T.K.; Chen, J. Ambulatory Cardiovascular
Monitoring Via a Machine-Learning-Assisted Textile Triboelectric Sensor. Adv. Mater. 2021, 41, 2104178. [CrossRef] [PubMed]
33. Kwon, S.H.; Dong, L. Flexible Sensors and Machine Learning for Heart Monitoring. Nano Energy 2022, 102, 107632. [CrossRef]
34. Yan, K.; Zhang, D. Feature Selection and Analysis on Correlated Gas Sensor Data with Recursive Feature Elimination. Sens.
Actuators B Chem. 2015, 212, 353–363. [CrossRef]
35. Faiz, M.; Ramandeep, S.; Mohd, A.; Anwar, A.S.; Chandani, B.; Nausheen, F. Machine Learning Techniques in Wireless Sensor
Networks: Algorithms, Strategies, and Applications. Int. J. Intell. Syst. Appl. Eng. 2023, 11, 685–694.
36. Chun, S.; Kim, J.S.; Yoo, Y.; Choi, Y.; Jung, S.J.; Jang, D.; Lee, G.; Song, K.I.; Nam, K.S.; Youn, I.; et al. An Artificial Neural Tactile
Sensing System. Nat. Electron. 2021, 4, 429–438. [CrossRef]
37. Lee, G.H.; Park, J.K.; Byun, J.; Yang, J.C.; Kwon, S.Y.; Kim, C.; Jang, C.; Sim, J.Y.; Yook, J.G.; Park, S. Parallel Signal Processing of a
Wireless Pressure-Sensing Platform Combined with Machine-Learning-Based Cognition, Inspired by the Human Somatosensory
System. Adv. Mater. 2020, 32, e1906269. [CrossRef] [PubMed]
38. Gandarias, J.M.; Garcia-Cerezo, A.J.; Gomez-de-Gabriel, J.M. CNN-Based Methods for Object Recognition with High-Resolution
Tactile Sensors. IEEE Sens. J. 2019, 19, 6872–6882. [CrossRef]
39. Wang, Y.; Adam, M.L.; Zhao, Y.; Zheng, W.; Gao, L.; Yin, Z.; Zhao, H. Machine Learning-Enhanced Flexible Mechanical Sensing.
Nano-Micro Lett. 2023, 15, 190–222. [CrossRef] [PubMed]
40. Vergara, A.; Vembu, S.; Ayhan, T.; Ryan, M.A.; Homer, M.L.; Huerta, R. Chemical Gas Sensor Drift Compensation Using Classifier
Ensembles. Sens. Actuators B Chem. 2012, 166–167, 320–329.
41. Cheng, X.; Fan, Z.; Yao, S.; Jin, T.; Lv, Z.; Lan, Y.; Bo, R.; Chen, Y.; Zhang, F.; Shen, Z.; et al. Programming 3D Curved Mesosurfaces
Using Microlattice Designs. Science 2023, 379, 1225–1232. [CrossRef]
42. Moin, A.; Zhou, A.; Rahimi, A.; Menon, A.; Benatti, S.; Alexandrov, G.; Tamakloe, S.; Ting, J.; Yamamoto, N.; Khan, Y.; et al. A
Wearable Biosensing System with In-Sensor Adaptive Machine Learning for Hand Gesture Recognition. Nat. Electron. 2020, 4,
54–63. [CrossRef]
43. Yang, J.; Chen, J.; Su, Y.; Jing, Q.; Li, Z.; Yi, F.; Wen, X.; Wang, Z.; Wang, Z.L. Eardrum-Inspired Active Sensors for Self-Powered
Cardiovascular System Characterization and Throat-Attached Anti-Interference Voice Recognition. Adv. Mater. 2015, 27,
1316–1326. [CrossRef] [PubMed]
44. Patra, J.C.; Panda, G.; Baliarsingh, R. Artificial Neural Network-Based Nonlinearity Estimation of Pressure Sensors. IEEE Trans.
Instrum. Meas. 1994, 43, 874–881. [CrossRef]
45. Ma, C.; Li, G.; Qin, L.; Huang, W.; Zhang, H.; Liu, W.; Dong, T.; Li, S.T. Analytical Model of Micropyramidal Capacitive Pressure
Sensors and Machine-Learning-Assisted Design. Adv. Mater. Technol. 2021, 6, 2100634. [CrossRef]
46. Rosset, S.; Belk, S.; Mahmoudinezhad, M.H.; Anderson, I. Leveraging Machine Learning for Arrays of Soft Sensors. Electroact.
Polym. Actuators Devices (EAPAD) XXV 2023, 12482, 58–67.
47. Ghommem, M.; Puzyrev, V.; Najar, F. Deep Learning for Simultaneous Measurements of Pressure and Temperature Using Arch
Resonators. Appl. Math. Model. 2021, 93, 728–744. [CrossRef]
48. Cho, S.Y.; Lee, Y.; Lee, S.; Kang, H.; Kim, J.; Choi, J.; Ryu, J.; Joo, H.; Jung, H.T.; Kim, J. Finding Hidden Signals in Chemical
Sensors Using Deep Learning. Anal. Chem. 2020, 92, 6529–6537. [CrossRef]
49. Mei, Y.; Zhang, S.; Cao, Z.; Xia, T.; Yi, X.; Liu, Z. Deep Learning Assisted Pressure Sensing Based on Sagnac Interferometry
Realized by Side-Hole Fiber. J. Light. Technol. 2023, 41, 784–793. [CrossRef]
50. Cao, Z.; Lu, Z.; Zhang, Q.; Luo, D.; Chen, J.; Tian, Q.; Liu, Z.; Dong, Y. Flexible Optical Pressure Sensor with High Spatial
Resolution Based on Deep Learning. In Proceedings of the Eighth Symposium on Novel Photoelectronic Detection Technology
and Applications, Kunming, China, 7–9 December 2021.
51. Sarkar, S.; Inupakutika, D.; Banerjee, M.; Tarhani, M.; Eghbal, M.K.; Shadaram, M. Discrimination of Strain and Temperature
Effects on FBG-Based Sensor Using Machine Learning. In Proceedings of the 2020 IEEE Photonics Conference (IPC), Vancouver,
BC, Canada, 28 September–1 October 2020; pp. 1–2.
52. Sarkar, S.; Inupakutika, D.; Banerjee, M.; Tarhani, M.; Shadaram, M. Machine Learning Methods for Discriminating Strain and
Temperature Effects on FBG-Based Sensors. IEEE Photonics Technol. Lett. 2021, 33, 876–879. [CrossRef]
53. Xu, X.; Wang, Y.; Zhu, D.; Shi, J. Accurate Strain Extraction via Kernel Extreme Learning Machine for Fiber Bragg Grating Sensor.
IEEE Sens. J. 2022, 22, 7792–7797. [CrossRef]
54. Dong, X.; Li, Y.; Zhong, T.; Wu, N.; Wang, H. Random and Coherent Noise Suppression in DAS-VSP Data by Using a Supervised
Deep Learning Method. IEEE Geosci. Remote. Sens. Lett. 2022, 19, 1–5. [CrossRef]
55. Ke, Y.; Lu, K.; Zhang, D. Learning Domain-Invariant Subspace Using Domain Features and Independence Maximization. IEEE
Trans. Cybern. 2018, 48, 288–299.
Sensors 2024, 24, 2958 25 of 28
56. Ji, T.; Pang, Q.; Liu, X. An Intelligent Pressure Sensor Using Rough Set Neural Networks. In Proceedings of the 2006 IEEE
International Conference on Information Acquisition, Weihai, China, 20–23 August 2006; pp. 717–721.
57. Patra, J.C.; van den Bos, A. Auto-Calibration and -Compensation of a Capacitive Pressure Sensor Using Multilayer Perceptrons.
ISA Trans. 2000, 39, 175–190. [CrossRef] [PubMed]
58. Rivera, J.; Carrillo, M.; Chacón, M.; Herrera, G.; Bojorquez, G. Self-Calibration and Optimal Response in Intelligent Sensors
Design Based on Artificial Neural Networks. Sensors 2007, 7, 1509–1529. [CrossRef]
59. Chang, Y.; Cui, X.; Hou, G.; Jin, Y. Calibration of the Pressure Sensor Device with the Extreme Learning Machine. In Proceedings
of the 2020 21st International Conference on Electronic Packaging Technology (ICEPT), Guangzhou, China, 12–15 August 2020;
pp. 1–5.
60. Depari, A.; Flammini, A.; Marioli, D.; Taroni, A. Application of an ANFIS Algorithm to Sensor Data Processing. IEEE Trans.
Instrum. Meas. 2007, 56, 75–79. [CrossRef]
61. Zhou, G.; Zhao, Y.; Guo, F.; Xu, W. A Smart High Accuracy Silicon Piezoresistive Pressure Sensor Temperature Compensation
System. Sensors 2014, 14, 12174–12190. [CrossRef] [PubMed]
62. Pramanik, C.; Islam, T.; Saha, H. Temperature Compensation of Piezoresistive Micro-Machined Porous Silicon Pressure Sensor by
ANN. Microelectron. Reliab. 2006, 46, 343–351. [CrossRef]
63. Futane, N.P.; Chowdhury, S.R.; Chowdhury, C.R.; Saha, H. ANN Based CMOS ASIC Design for Improved Temperature-Drift
Compensation of Piezoresistive Micro-Machined High Resolution Pressure Sensor. Microelectron. Reliab. 2010, 50, 282–291.
[CrossRef]
64. Gao, Y.; Qiu, Y.; Chen, H.; Huang, Y.; Li, G. Four-Channel Fiber Loop Ring-down Pressure Sensor with Temperature Compensation
Based on Neural Networks. Microw. Opt. Technol. Lett. 2010, 52, 1796–1799. [CrossRef]
65. Chen, H.; Aggarwal, P.; Taha, T.M.; Chodavarapu, V.P. Improving Inertial Sensor by Reducing Errors Using Deep Learning
Methodology. In Proceedings of the NAECON 2018—IEEE National Aerospace and Electronics Conference, Dayton, OH, USA,
23–26 July 2018; pp. 197–202.
66. Kim, D.E.; Kim, K.S.; Park, J.H.; Ailing, L.; Lee, J.M. Stable Grasping of Objects using Air Pressure Sensors on a Robot Hand. In
Proceedings of the 2018 18th International Conference on Control, Automation and Systems (ICCAS), PyeongChang, Republic of
Korea, 17–20 October 2018; pp. 500–502.
67. Gao, S.; Zheng, C.; Zhao, Y.; Wu, Z.; Li, J.; Huang, X. Comparison of Enhancement Techniques Based on Neural
Networks for Attenuated Voice Signal Captured by Flexible Vibration Sensors on Throats. Nanotechnol. Precis. Eng. 2022, 5, 013001.
68. Larson, C.; Spjut, J.; Knepper, R.; Shepherd, R. A Deformable Interface for Human Touch Recognition Using Stretchable Carbon
Nanotube Dielectric Elastomer Sensors and Deep Neural Networks. Soft Robot. 2019, 6, 611–620. [CrossRef]
69. Kim, D.W.; Kwon, J.; Jeon, B.; Park, Y.L. Adaptive Calibration of Soft Sensors Using Optimal Transportation Transfer Learning for
Mass Production and Long-Term Usage. Adv. Intell. Syst. 2020, 2, 1900178. [CrossRef]
70. Xu, Y.; Zhang, S.; Li, S.; Wu, Z.; Li, Y.; Li, Z.; Chen, X.; Shi, C.; Chen, P.; Zhang, P.; et al. A Soft Magnetoelectric Finger for Robots’
Multidirectional Tactile Perception in Non-Visual Recognition Environments. NPJ Flex. Electron. 2024, 8, 2. [CrossRef]
71. Lee, J.H.; Kim, S.H.; Heo, J.S.; Kwak, J.Y.; Park, C.W.; Kim, I.; Lee, M.; Park, H.H.; Kim, Y.H.; Lee, S.J.; et al. Heterogeneous
Structure Omnidirectional Strain Sensor Arrays With Cognitively Learned Neural Networks. Adv. Mater. 2023, 35, 2208184.
[CrossRef]
72. Kondratenko, Y.; Atamanyuk, I.; Sidenko, I.; Kondratenko, G.; Sichevskyi, S. Machine Learning Techniques for Increasing
Efficiency of the Robot’s Sensor and Control Information Processing. Sensors 2022, 22, 1062. [CrossRef]
73. Levins, M.; Lang, H. A Tactile Sensor for an Anthropomorphic Robotic Fingertip Based on Pressure Sensing and Machine
Learning. IEEE Sens. J. 2020, 22, 13284–13290. [CrossRef]
74. Xu, Z.; Zheng, Y.; Rawashdeh, S.A. A Simple Robotic Fingertip Sensor Using Imaging and Shallow Neural Networks. IEEE Sens.
J. 2019, 19, 8878–8886. [CrossRef]
75. Hellebrekers, T.; Kroemer, O.; Majidi, C. Soft Magnetic Skin for Continuous Deformation Sensing. Adv. Intell. Syst. 2019,
1, 1900025. [CrossRef]
76. Gandarias, J.M.; Gomez-de-Gabriel, J.M.; Garcia-Cerezo, A. Human and Object Recognition with a High-Resolution Tactile Sensor.
In Proceedings of the 2017 IEEE SENSORS, Glasgow, UK, 29 October–1 November 2017; pp. 1–3.
77. Liu, H.; Song, X.; Nanayakkara, T.; Seneviratne, L.D.; Althoefer, K. A Computationally Fast Algorithm for Local Contact Shape
and Pose Classification Using a Tactile Array Sensor. In Proceedings of the 2012 IEEE International Conference on Robotics and
Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 1410–1415.
78. Castaño, F.; Beruvides, G.; Haber, R.E.; Artuñedo, A. Obstacle Recognition Based on Machine Learning for On-Chip LiDAR
Sensors in a Cyber-Physical System. Sensors 2017, 17, 2109. [CrossRef]
79. Jamali, N.; Sammut, C. Majority Voting: Material Classification by Tactile Sensing Using Surface Texture. IEEE Trans. Robot. 2011,
27, 508–521. [CrossRef]
80. Yao, H.; Li, P.; Cheng, W.; Yang, W.; Yang, Z.; Ali, H.P.A.; Guo, H.; Tee, B.C.K. Environment-Resilient Graphene Vibrotactile
Sensitive Sensors for Machine Intelligence. ACS Mater. Lett. 2020, 2, 986–992. [CrossRef]
81. King, D.; Lyons, W.B.; Flanagan, C.; Lewis, E. An Optical-Fiber Sensor for Use in Water Systems Utilizing Digital Signal Processing
Techniques and Artificial Neural Network Pattern Recognition. IEEE Sens. J. 2004, 4, 21–27. [CrossRef]
82. Hwang, Y.J.; Yu, H.; Lee, G.; Shackery, I.; Seong, J.; Jung, Y.; Sung, S.H.; Choi, J.; Jun, S.C. Multiplexed DNA-Functionalized
Graphene Sensor with Artificial Intelligence-Based Discrimination Performance for Analyzing Chemical Vapor Compositions.
Microsyst. Nanoeng. 2023, 9, 28. [CrossRef]
83. Craven, M.A.; Gardner, J.W.; Bartlett, P.N. Electronic Noses—Development and Future Prospects. TrAC Trends Anal. Chem. 1996,
15, 486–493. [CrossRef]
84. Zhan, C.; He, J.; Pan, M.; Luo, D. Component Analysis of Gas Mixture Based on One-Dimensional Convolutional Neural Network.
Sensors 2021, 21, 347. [CrossRef]
85. Nguyen, X.A.; Gong, S.; Cheng, W.; Chauhan, S. A Stretchable Gold Nanowire Sensor and Its Characterization Using Machine
Learning for Motion Tracking. IEEE Sens. J. 2021, 21, 15269–15276. [CrossRef]
86. Hegde, N.; Bries, M.; Swibas, T.; Melanson, E.; Sazonov, E. Automatic Recognition of Activities of Daily Living Utilizing
Insole-Based and Wrist-Worn Wearable Sensors. IEEE J. Biomed. Health Inform. 2018, 22, 979–988. [CrossRef]
87. Jiang, Y.; Sadeqi, A.; Miller, E.L.; Sonkusale, S. Head Motion Classification Using Thread-Based Sensor and Machine Learning
Algorithm. Sci. Rep. 2021, 11, 2646. [CrossRef]
88. Anderson, W.; Choffin, Z.; Jeong, N.; Callihan, M.; Jeong, S.; Sazonov, E. Empirical Study on Human Movement Classification
Using Insole Footwear Sensor System and Machine Learning. Sensors 2022, 22, 2743. [CrossRef]
89. Kobsar, D.; Ferber, R. Wearable Sensor Data to Track Subject-Specific Movement Patterns Related to Clinical Outcomes Using a
Machine Learning Approach. Sensors 2018, 18, 2828. [CrossRef]
90. Islam, M.; Tabassum, M.; Nishat, M.M.; Faisal, F.; Hasan, M.S. Real-Time Clinical Gait Analysis and Foot Anomalies Detection
Using Pressure Sensors and Convolutional Neural Network. In Proceedings of the 2022 7th International Conference on Business
and Industrial Research (ICBIR), Bangkok, Thailand, 19–20 May 2022; pp. 717–722.
91. Luo, J.; Wang, Z.; Xu, L.; Wang, A.C.; Han, K.; Jiang, T.; Lai, Q.; Bai, Y.; Tang, W.; Fan, F.R.; et al. Flexible and Durable Wood-Based
Triboelectric Nanogenerators for Self-Powered Sensing in Athletic Big Data Analytics. Nat. Commun. 2019, 10, 5147. [CrossRef]
92. Hassan, M.M.; Uddin, M.Z.; Mohamed, A.; Almogren, A. A Robust Human Activity Recognition System Using Smartphone
Sensors and Deep Learning. Future Gener. Comput. Syst. 2018, 81, 307–313. [CrossRef]
93. Wen, Z.; Yang, Y.; Sun, N.; Li, G.; Liu, Y.; Chen, C.; Shi, J.; Xie, L.; Jiang, H.; Bao, D.; et al. A Wrinkled PEDOT:PSS Film Based
Stretchable and Transparent Triboelectric Nanogenerator for Wearable Energy Harvesters and Active Motion Sensors. Adv. Funct.
Mater. 2018, 28, 1803684. [CrossRef]
94. Mani, N.; Haridoss, P.; George, B. Smart Suspenders with Sensors and Machine Learning for Human Activity Monitoring. IEEE
Sens. J. 2023, 23, 10159–10167. [CrossRef]
95. Xie, Y.; Wu, X.; Huang, X.; Liang, Q.; Deng, S.; Wu, Z.; Yao, Y.; Lu, L. A Deep Learning-Enabled Skin-Inspired Pressure Sensor for
Complicated Recognition Tasks with Ultralong Life. Research 2023, 6, 0157. [CrossRef]
96. Gholami, M.; Ejupi, A.; Rezaei, A.; Ferrone, A.; Menon, C. Estimation of Knee Joint Angle Using a Fabric-Based Strain Sensor and
Machine Learning: A Preliminary Investigation. In Proceedings of the 2018 7th IEEE International Conference on Biomedical
Robotics and Biomechatronics (Biorob), Enschede, The Netherlands, 26–29 August 2018; pp. 589–594.
97. Zhou, Z.; Chen, K.; Li, X.; Zhang, S.; Wu, Y.; Zhou, Y.; Meng, K.; Sun, C.; He, Q.; Fan, W.; et al. Sign-to-speech translation using
machine-learning-assisted stretchable sensor arrays. Nat. Electron. 2020, 3, 571–578. [CrossRef]
98. Krishnan, K.S.; Saha, A.; Ramachandran, S.; Kumar, S. Recognition of Human Arm Gestures Using Myo Armband for the Game
of Hand Cricket. In Proceedings of the 2017 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), Ottawa,
ON, Canada, 5–7 October 2017; pp. 389–394.
99. Gonzalez-Cely, A.X.; Bastos-Filho, T.; Diaz, C.A.R. Wheelchair Posture Classification Based on POF Pressure Sensors and Machine
Learning Algorithms. In Proceedings of the 2022 IEEE Latin American Electron Devices Conference (LAEDC), Cancun, Mexico,
4–6 July 2022; pp. 1–4.
100. Roh, J.; Park, H.J.; Lee, K.J.; Hyeong, J.; Kim, S.; Lee, B. Sitting Posture Monitoring System Based on a Low-Cost Load Cell Using
Machine Learning. Sensors 2018, 18, 208. [CrossRef]
101. Lee, H.J.; Yang, J.C.; Choi, J.; Kim, J.; Lee, G.S.; Sasikala, S.P.; Lee, G.H.; Park, S.H.K.; Lee, H.M.; Sim, J.Y.; et al. Hetero-Dimensional
2D Ti3C2Tx MXene and 1D Graphene Nanoribbon Hybrids for Machine Learning-Assisted Pressure Sensors. ACS Nano 2021, 15,
10347–10356. [CrossRef]
102. Zemp, R.; Tanadini, M.; Plüss, S.; Schnüriger, K.; Singh, N.B.; Taylor, W.R.; Lorenzetti, S. Application of Machine Learning
Approaches for Classifying Sitting Posture Based on Force and Acceleration Sensors. BioMed Res. Int. 2016, 2016, 5978489.
[CrossRef]
103. Rodríguez, A.P.; Gil, D.; Nugent, C.; Quero, J.M. In-Bed Posture Classification from Pressure Mat Sensors for the Prevention of
Pressure Ulcers Using Convolutional Neural Networks. Bioinform. Biomed. Eng. 2020, 8, 338–349.
104. Bourahmoune, K.; Amagasa, T. AI-Powered Posture Training: Application of Machine Learning in Sitting Posture Recognition
Using the LifeChair Smart Cushion. In Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
AI for Improving Human Well-Being, Macao, China, 10–16 August 2019; pp. 5808–5814.
105. Zhong, H.; Fu, R.; Chen, S.; Zhou, Z.; Zhang, Y.; Yin, X.; He, B. Large-Area Flexible MWCNT/PDMS Pressure Sensor for
Ergonomic Design with Aid of Deep Learning. Nanotechnology 2022, 33, 345502. [CrossRef]
106. Green, C.; Bouchard, M.; Goubran, R.; Robillard, R.; Higginson, C.; Lee, E.; Knoefel, F. Sleep-Wake and Body Position Classification
with Deep Learning Using Pressure Sensor Mat Measurements. In Proceedings of the 2023 IEEE International Symposium on
Medical Measurements and Applications (MeMeA), Jeju, Republic of Korea, 14–16 June 2023; pp. 1–6.
107. Huang, K.-H.; Tan, F.; Wang, T.-D.; Yang, Y.-J. A Highly Sensitive Pressure-Sensing Array for Blood Pressure Estimation Assisted
by Machine-Learning Techniques. Sensors 2019, 19, 848. [CrossRef]
108. Gudiño-Ochoa, A.; García-Rodríguez, J.A.; Ochoa-Ornelas, R.; Cuevas-Chávez, J.I.; Sánchez-Arias, D.A. Noninvasive Diabetes
Detection through Human Breath Using TinyML-Powered E-Nose. Sensors 2024, 24, 1294. [CrossRef]
109. Roberts, M.; Driggs, D.; Thorpe, M.; Gilbey, J.; Yeung, M.; Ursprung, S.; Aviles-Rivero, A.I.; Etmann, C.; McCague, C.; Beer, L.;
et al. Common pitfalls and recommendations for using machine learning to detect and prognosticate for COVID-19 using chest
radiographs and CT scans. Nat. Mach. Intell. 2021, 3, 199–217. [CrossRef]
110. Zhou, Y.; Shen, M.; Cui, X.; Shao, Y.; Li, L.; Zhang, Y. Triboelectric Nanogenerator Based Self-Powered Sensor for Artificial
Intelligence. Nano Energy 2021, 84, 105887. [CrossRef]
111. Wu, C.; Ding, W.; Liu, R.; Wang, J.; Wang, A.; Wang, J.; Li, S.; Zi, Y.; Wang, Z.L. Keystroke Dynamics Enabled Authentication and
Identification using Triboelectric Nanogenerator Array. Mater. Today 2018, 21, 216–222. [CrossRef]
112. Chen, J.; Zhu, G.; Yang, J.; Jing, Q.; Bai, P.; Yang, W.; Qi, X.; Su, Y.; Wang, Z.L. Personalized Keystroke Dynamics for Self-Powered
Human–Machine Interfacing. ACS Nano 2015, 9, 105–116. [CrossRef]
113. Zhang, W.; Deng, L.; Yang, L.; Yang, P.; Diao, D.; Wang, P.; Wang, Z.L. Multilanguage-handwriting self-powered recognition
based on triboelectric nanogenerator enabled machine learning. Nano Energy 2020, 77, 105174. [CrossRef]
114. Shi, Q.; Zhang, Z.; He, T.; Sun, Z.; Wang, B.; Feng, Y.; Shan, X.; Salam, B.; Lee, C. Deep Learning Enabled Smart Mats as a Scalable
Floor Monitoring System. Nat. Commun. 2020, 11, 4609. [CrossRef]
115. Han, J.H.; Min, B.K.; Hong, S.K.; Park, H.; Kwak, J.; Wang, H.S.; Joe, D.J.; Park, J.H.; Jung, Y.H.; Hur, S.; et al. Machine
Learning-Based Self-Powered Acoustic Sensor for Speaker Recognition. Nano Energy 2018, 53, 658–665. [CrossRef]
116. Zhuo, W.; Ziyang, C.; Lin, M.; Qi, W.; Heng, W.; Leal-Junior, A.; Xiaoli, L.; Carlos, M.; Rui, M. Optical Microfiber Intelligent
Sensor: Wearable Cardiorespiratory and Behavior Monitoring with a Flexible Wave-Shaped Polymer Optical Microfiber. ACS
Appl. Mater. Interfaces 2024, 16, 8333–8345.
117. Li, C.; Sánchez, R.-V.; Zurita, G.; Cerrada, M.; Cabrera, D. Fault Diagnosis for Rotating Machinery Using Vibration Measurement
Deep Statistical Feature Learning. Sensors 2016, 16, 895. [CrossRef]
118. Tao, J.; Liu, Y.; Yang, D. Bearing Fault Diagnosis Based on Deep Belief Network and Multisensor Information Fusion. Shock Vib.
2016, 2016, 9306205. [CrossRef]
119. Yun, J.; Lee, S.-S. Human Movement Detection and Identification Using Pyroelectric Infrared Sensors. Sensors 2014, 14, 8057–8081.
[CrossRef]
120. Gao, J.; Li, Z.; Chen, Z. Dual-Mode Pressure Sensor Integrated with Deep Learning Algorithm for Joint State Monitoring in Tennis
Motion. J. Sens. 2023, 2023, 5079256. [CrossRef]
121. Wen, L.; Nie, M.; Chen, P.; Zhao, Y.N.; Shen, J.; Wang, C.; Xiong, Y.; Yin, K.; Sun, L. Wearable Multimode Sensor with a Seamless
Integrated Structure for Recognition of Different Joint Motion States with the Assistance of a Deep Learning Algorithm. Microsyst.
Nanoeng. 2022, 8, 24. [CrossRef]
122. Polat, B.; Becerra, L.L.; Hsu, P.; Kaipu, V.; Mercier, P.P.; Cheng, C.; Lipomi, D.J. Epidermal Graphene Sensors and Machine
Learning for Estimating Swallowed Volume. ACS Appl. Nano Mater. 2021, 4, 8126–8134. [CrossRef]
123. Orii, H.; Tsuji, S.; Kouda, T.; Kohama, T. Tactile Texture Recognition Using Convolutional Neural Networks for Time-Series Data
of Pressure and 6-Axis Acceleration Sensor. In Proceedings of the 2017 IEEE International Conference on Industrial Technology
(ICIT), Toronto, ON, Canada, 22–25 March 2017; pp. 1076–1080.
124. Tsuji, S.; Kohama, T. Using a Convolutional Neural Network to Construct a Pen-Type Tactile Sensor System for Roughness
Recognition. Sens. Actuators A Phys. 2019, 291, 7–12. [CrossRef]
125. Thuruthel, T.G.; Iida, F. Multimodel Sensor Fusion for Learning Rich Models for Interacting Soft Robots. In Proceedings of the
2023 IEEE International Conference on Soft Robotics (RoboSoft), Singapore, 3–7 April 2023; pp. 1–6.
126. Luo, Y.; Xiao, X.; Chen, J.; Li, Q.; Fu, H. Machine-Learning-Assisted Recognition on Bioinspired Soft Sensor Arrays. ACS Nano
2022, 16, 6734–6743. [CrossRef]
127. Sun, Z.; Wang, S.; Zhao, Y.; Zhong, Z.; Zuo, L. Discriminating Soft Actuators’ Thermal Stimuli and Mechanical Deformation by
Hydrogel Sensors and Machine Learning. Adv. Intell. Syst. 2022, 4, 2200089. [CrossRef]
128. Sohn, K.S.; Chung, J.; Cho, M.Y.; Timilsina, S.; Park, W.B.; Pyo, M.; Shin, N.; Sohn, K.; Kim, J.S. An Extremely Simple Macroscale
Electronic Skin Realized by Deep Machine Learning. Sci. Rep. 2017, 7, 11061. [CrossRef]
129. Luo, S.; Mou, W.; Li, M.; Althoefer, K.; Liu, H. Rotation and Translation Invariant Object Recognition with a Tactile Sensor. In
Proceedings of the 2014 IEEE SENSORS, Valencia, Spain, 2014; pp. 1030–1033.
130. Wang, J.; Xie, J.; Zhao, R.; Zhang, L.; Duan, L. Multisensory Fusion Based Virtual Tool Wear Sensing for Ubiquitous Manufacturing.
Robot. Comput. Integr. Manuf. 2017, 45, 47–58. [CrossRef]
131. Suzuki, S.; Kondoh, J. Cantilever Damage Evaluation Using Impedance-Loaded SAW Sensor with Continuous Wavelet Analysis
and Machine Learning. Jpn. J. Appl. Phys. 2021, 60, SDDC09. [CrossRef]
132. Rente, B.; Fabian, M.; Vidakovic, M.; Liu, X.; Li, X.; Li, K.; Sun, T.; Grattan, K.T.V. Lithium-Ion Battery State-of-Charge Estimator
Based on FBG-Based Strain Sensor and Employing Machine Learning. IEEE Sens. J. 2021, 21, 1453–1460. [CrossRef]
133. Kuwahara, N.; Wada, K. Bed-Leaving Prediction Using a Sheet-Type Pressure-Sensitive Sensor Base with Deep-Learning. J. Fiber
Sci. Technol. 2017, 73, 343–347. [CrossRef]
134. Moore, S.R.; Kranzinger, C.; Fritz, J.; Stöggl, T.; Kröll, J.; Schwameder, H. Foot Strike Angle Prediction and Pattern Classification
Using LoadsolTM Wearable Sensors: A Comparison of Machine Learning Techniques. Sensors 2020, 20, 6737. [CrossRef]
135. Choffin, Z.; Jeong, N.; Callihan, M.; Olmstead, S.; Sazonov, E.; Thakral, S.; Getchell, C.; Lombardi, V. Ankle Angle Prediction
Using a Footwear Pressure Sensor and a Machine Learning Technique. Sensors 2021, 21, 3790. [CrossRef]
136. Agrawal, D.K.; Usaha, W.; Pojprapai, S.; Wattanapan, P. Fall Risk Prediction Using Wireless Sensor Insoles With Machine Learning.
IEEE Access 2023, 11, 23119–23126. [CrossRef]
137. Bilgera, C.; Yamamoto, A.; Sawano, M.; Matsukura, H.; Ishida, H. Application of Convolutional Long Short-Term Memory Neural
Networks to Signals Collected from a Sensor Network for Autonomous Gas Source Localization in Outdoor Environments.
Sensors 2018, 18, 4484. [CrossRef]
138. Zhao, R.; Yan, R.; Wang, J.; Mao, K. Learning to Monitor Machine Health with Convolutional Bi-Directional LSTM Networks.
Sensors 2017, 17, 273. [CrossRef]
139. Lepora, N.F.; Church, A.; De Kerckhove, C.; Hadsell, R.; Lloyd, J. From Pixels to Percepts: Highly Robust Edge Perception and
Contour Following Using Deep Learning and an Optical Biomimetic Tactile Sensor. IEEE Robot. Autom. Lett. 2019, 4, 2101–2107.
[CrossRef]
140. Guo, W.; Wang, Y.; Chen, X.; Jiang, P. Federated transfer learning for auxiliary classifier generative adversarial networks:
Framework and industrial application. J. Intell. Manuf. 2024, 35, 1439–1454. [CrossRef]
141. Tsuboi, Y.; Sakai, Y.; Shimizu, R.; Goto, M. Multiple treatment effect estimation for business analytics using observational data.
Cogent Eng. 2024, 11, 2300557. [CrossRef]
142. He, Y.; Lin, J.; Liu, Z.; Wang, H.; Li, L.; Han, S. AMC: AutoML for Model Compression and Acceleration on Mobile Devices. In
Computer Vision–ECCV 2018, Proceedings of the 15th European Conference, Munich, Germany, 8–14 September 2018; Springer: Cham,
Switzerland, 2018; pp. 815–832.
143. Satyanarayanan, M. The Emergence of Edge Computing. Computer 2017, 50, 30–39. [CrossRef]
144. Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Physics-informed neural networks: A deep learning framework for solving forward
and inverse problems involving nonlinear partial differential equations. J. Comput. Phys. 2019, 378, 686–707. [CrossRef]
145. Chai, Y. Silicon photodiodes that multiply. Nat. Electron. 2022, 5, 483–484. [CrossRef]
146. Wan, T.; Shao, B.; Ma, S.; Zhou, Y.; Li, Q.; Chai, Y. In-Sensor Computing: Materials, Devices, and Integration Technologies. Adv.
Mater. 2023, 35, 2203830. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.