Metaheuristic Algorithms For Parameter Estimation of DC Servo Motors With Quantized Sensor Measurements.
Corresponding Author:
Dr. Debani Prasad Mishra
Head of the Electronics and Electrical Department, International Institute of Information Technology,
Bhubaneswar, Odisha, India
Email: [email protected]
1. INTRODUCTION
Servo motors are used extensively in robotics, CNC equipment, printing presses, packaging equipment, and many other applications. A servo motor is a highly accurate actuator that offers precise control of torque, velocity, and angular position. In robotics, servo motors regulate the movement of robot joints and limbs so that they can execute precise and sophisticated motions [1]. In CNC machines, servo motors precisely control the movement of the cutting tool, producing accurate machining operations. In printing presses and packaging equipment, servo motors manage the printing and packaging process, producing output that is reliable and of high quality. Servo motors are also commonly used in thrust vector control systems to move the nozzle or other control surfaces, allowing a rocket to alter its flight path and trajectory [2]. Modern industrial control systems therefore rely heavily on servo motors, and accurate parameter estimation is essential for optimal performance. As stated in [3], system identification entails a number of phases in order to produce an accurate model that successfully reflects the behavior of a system. The experimental setup for this approach, which involves planning, carrying out, and validating experiments, is shown in Figure 1. The resulting models can be applied in research projects, as shown in [4], or used to construct adaptive control loops, as shown in [5].
Mathematical models are an essential component of many disciplines, including physics. Many kinds of models can be found in the literature, each designed for a particular use. They can be roughly divided into two categories:
(a) theoretical models and
(b) experimental models.
In experimental model system identification, a distinction is made between [6]:
● Non-parametric models, which have no fixed structure and an unbounded number of parameters [7]. They are typically represented in graphical or tabular form, such as a step response plot.
● Parametric models, which have a well-defined structure and a finite number of parameters [8]. To reflect the system behavior, these models are usually specified by equations such as transfer functions or differential equations.
To determine the model parameters of a simple DC motor while taking sensor quantization into account, this paper compares three distinct population-based optimization strategies. Most conventional optimization algorithms are gradient-based, which means they can converge prematurely to local optima while taking a long time to reach the global optimum [9]. Non-convex and multimodal optimization problems, which may have numerous local minima, can therefore be challenging for conventional techniques [10]. Metaheuristics, on the other hand, are stochastic optimization algorithms that explore the search space for the best solution using heuristics and random search rather than gradients. These approaches can be effective for non-convex and multimodal optimization problems [11].
2.1. Fundamentals of a DC Servo Motor Using a 2-Phase Incremental Optical Rotary Encoder
In this article, the transfer function of an armature-controlled DC servo motor is simulated, and the floor function is employed to quantize the continuous rotation data. The physical behavior of a DC servo motor can be represented mathematically by a set of differential equations [12]. As shown in Figure 2.1, the fundamental working principle of a DC servo motor is that a current flowing through a coil creates a magnetic field that interacts with a permanent magnet to rotate the motor shaft [13]. The electrical and mechanical equations are written separately and then coupled through the electromechanical relationships [14].
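The quantization itself is easy to reproduce. The short Python sketch below applies the floor function to a continuous angle signal in the same way as the simulated sensor; the encoder resolution value used here is only an illustrative assumption, not a value taken from the experimental setup.

```python
import numpy as np

def quantize_angle(theta_deg, resolution_deg):
    """Quantize a continuous shaft angle (degrees) to the encoder grid
    using the floor function, as done in the simulated sensor model."""
    return np.floor(theta_deg / resolution_deg) * resolution_deg

# Assumed resolution for illustration only (e.g. 360/1024 degrees per count).
resolution = 360.0 / 1024.0
t = np.linspace(0.0, 1.0, 1000)
theta_true = 45.0 * t**2                     # a smooth continuous rotation
theta_meas = quantize_angle(theta_true, resolution)
```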
The system's input is armature voltage, and its output is the measured shaft angle in degrees.
Consider the inputs $e_a(t)$ and $e_b(t)$, and the output $i_a(t)$. Applying Kirchhoff's voltage law around the armature circuit gives:

$$e_a(t) = R_a\, i_a(t) + L_a \frac{d i_a(t)}{dt} + e_b(t) \qquad (1)$$

The mechanical dynamics are:

$$T(t) = J_m \frac{d\omega_m(t)}{dt} + B_m\, \omega_m(t) \qquad (2)$$

The electromechanical equations are:

$$e_b(t) = K_E\, \omega_m(t) \qquad (3)$$

$$T(t) = K_T\, i_a(t) \qquad (4)$$

Taking the Laplace transform of (1) and solving for the armature current:

$$I_a(s) = \frac{1}{L_a s + R_a}\,\bigl[E_a(s) - E_b(s)\bigr] \qquad (6)$$
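The intermediate steps linking (6) to the angle transfer function follow the standard armature-controlled DC motor derivation and are sketched compactly here. Combining (2)-(4) in the Laplace domain and eliminating $E_b(s)$ and $I_a(s)$ gives the speed transfer function:

$$\Omega_m(s) = \frac{K_T\, I_a(s)}{J_m s + B_m}, \qquad E_b(s) = K_E\, \Omega_m(s)$$

$$\frac{\Omega_m(s)}{E_a(s)} = \frac{K_T}{(L_a s + R_a)(J_m s + B_m) + K_T K_E}$$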
Since $\Theta_m(s) = \dfrac{1}{s}\,\Omega_m(s)$, the transfer function from armature voltage to shaft angle is:

$$\frac{\Theta_m(s)}{E_a(s)} = \frac{K_T}{L_a J_m s^3 + (L_a B_m + R_a J_m)\, s^2 + (K_T K_E + R_a B_m)\, s} \qquad (12)$$
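For simulation purposes, the transfer function in (12) can be assembled directly from a candidate parameter set. The sketch below is a minimal Python version using scipy.signal; the parameter values are placeholders for illustration, not the identified ones.

```python
import numpy as np
from scipy import signal

def motor_tf(La, Ra, Kt, Ke, Jm, Bm):
    """Transfer function Theta_m(s)/E_a(s) of the armature-controlled
    DC servo motor, as given in equation (12)."""
    num = [Kt]
    den = [La * Jm, La * Bm + Ra * Jm, Kt * Ke + Ra * Bm, 0.0]
    return signal.TransferFunction(num, den)

# Placeholder values purely for illustration.
sys_angle = motor_tf(La=0.5, Ra=1.0, Kt=0.05, Ke=0.05, Jm=0.01, Bm=0.1)
t, theta = signal.step(sys_angle, T=np.linspace(0.0, 5.0, 2000))
```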
Figure 2.2 Actual or modeled block diagram of the DC-servo motor along with the rotary encoder
Fig. 3.2. Magnified portion of Fig. 3.1 showing the quantized output in degrees. The figure shows the quantized sensor readings when the armature voltage follows a variable stair function.
To reduce computational complexity, the Integral Absolute Error (IAE) was used as the cost function to evaluate the performance of the optimization methods [15]:

$$\mathrm{IAE} = \int_0^{\infty} |e(t)|\, dt$$

The IAE is the objective/cost function that must be minimized by the heuristics. The six decision variables are the parameters $L_a$, $R_a$, $K_t$, $K_b$, $J$, and $f_o$; each set of these six variables is a candidate solution for predicting the DC servo motor transfer function.
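A minimal numerical version of this cost is sketched below; it assumes that the measured (quantized) response and the candidate model's response are available on a common time grid, which is how the comparison against the simulated model is described.

```python
import numpy as np
from scipy.integrate import trapezoid

def iae_cost(t, y_measured, y_model):
    """Integral Absolute Error between the measured (quantized) output and
    the candidate model output, approximated by the trapezoid rule."""
    e = np.abs(np.asarray(y_measured) - np.asarray(y_model))
    return trapezoid(e, t)
```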
The genetic algorithm is a type of optimization algorithm that is inspired by the process of natural selection
and genetic evolution in biology. It is used to find the best solution to a problem by iteratively evolving a
population of candidate solutions using genetic operations such as selection, crossover, and mutation. John Holland first introduced genetic algorithms (GA) in 1975 [16].
The fundamental steps of a genetic algorithm, outlined in Figure 5.1, are as follows [17]:
1. Initialization: The algorithm starts by creating an initial population of potential solutions. This is
typically done by generating a set of random individuals that represent different solutions to the
problem at hand. The size of the population is determined based on the specific problem and the
available computational resources.
2. Evaluation: Each individual in the population is evaluated based on its fitness, which measures how
well it solves the problem. The fitness function is problem-specific and defines the criteria for
determining the quality of a solution. The fitness function assigns a score or value to each
individual, indicating its fitness level. The evaluation process involves applying the fitness function
to each individual and calculating its fitness score.
3. Selection: After evaluating the fitness of all individuals, a selection process is applied to choose the
fittest individuals from the population. The selection methods aim to favor individuals with higher
fitness scores, increasing the likelihood of their genetic material being passed to the next generation.
Common selection methods include roulette wheel selection, tournament selection, and rank-based
selection.
4. Crossover: Once the fittest individuals are selected, pairs of individuals are combined to create new
offspring through a process called crossover or recombination. In crossover, genetic material from
the selected individuals is exchanged or mixed to produce new solutions. The specific crossover
method depends on the representation of individuals and the problem domain. For example, in
genetic algorithms, crossover may involve exchanging segments of binary strings or combining
numerical values.
5. Mutation: After crossover, the new offspring may undergo mutation, which introduces small random
changes to their genetic makeup. Mutation helps to introduce new genetic variations into the
population and prevent premature convergence to suboptimal solutions. The mutation process
typically involves randomly altering or modifying certain genes or attributes of the individuals.
6. Repeat: The new generation, consisting of the offspring from crossover and possibly mutated
individuals, replaces the old population. The steps of evaluation, selection, crossover, and mutation
are repeated iteratively for a certain number of generations or until a stopping criterion is met. The
stopping criterion could be reaching a maximum number of generations, finding a solution that
meets a predefined threshold of fitness, or detecting a convergence condition.
The genetic algorithm is widely used in optimization problems such as finding the optimal solution to a mathematical equation, designing optimal engineering structures, and optimizing financial portfolios [18].
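For reference, the sketch below gives a minimal real-coded genetic algorithm for a bounded six-parameter search of this kind. It is an illustrative outline (tournament selection, arithmetic crossover, Gaussian mutation, simple elitism) with assumed default settings, not the implementation or configuration used in this work; `cost` stands for the IAE evaluation of a candidate parameter set.

```python
import numpy as np

def genetic_algorithm(cost, lb, ub, pop_size=20, generations=300,
                      crossover_rate=0.9, mutation_rate=0.1, seed=0):
    """Minimal real-coded GA: selection, crossover, mutation, elitism."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    pop = rng.uniform(lb, ub, size=(pop_size, dim))        # initialization
    fit = np.array([cost(x) for x in pop])                 # evaluation

    for _ in range(generations):
        # Tournament selection of parents (lower cost wins).
        idx = rng.integers(pop_size, size=(pop_size, 2))
        parents = pop[np.where(fit[idx[:, 0]] < fit[idx[:, 1]],
                               idx[:, 0], idx[:, 1])]
        # Arithmetic crossover between consecutive parents.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < crossover_rate:
                a = rng.random(dim)
                children[i]     = a * parents[i] + (1 - a) * parents[i + 1]
                children[i + 1] = a * parents[i + 1] + (1 - a) * parents[i]
        # Gaussian mutation, clipped to the search bounds.
        mask = rng.random(children.shape) < mutation_rate
        children += mask * rng.normal(0.0, 0.1 * (ub - lb), children.shape)
        children = np.clip(children, lb, ub)
        # Evaluate children and preserve the previous best (elitism).
        child_fit = np.array([cost(x) for x in children])
        best, worst = np.argmin(fit), np.argmax(child_fit)
        children[worst], child_fit[worst] = pop[best], fit[best]
        pop, fit = children, child_fit

    best = np.argmin(fit)
    return pop[best], fit[best]
```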
Particle Swarm Optimization (PSO) is a population-based optimization technique inspired by the social behavior of bird flocking or fish schooling. Kennedy and Eberhart first introduced PSO in 1995 [19]. It solves optimization problems by simulating the social behavior of a swarm of particles moving in a multidimensional search space. The particles move through the search space, and their positions and velocities are updated based on their own best position, the best position found by any particle in the swarm, and their current position.
The steps of Particle Swarm Optimization, outlined in Figure 5.2, are as follows [19]:
1. Initialization: The algorithm starts by creating a swarm of particles, where each particle represents a
potential solution to the problem. Each particle is randomly assigned a position within the search
space, which corresponds to a particular solution, and a random velocity vector that determines its
movement.
2. Evaluation: Each particle in the swarm is evaluated based on its fitness, which quantifies how well it
solves the problem. The fitness function calculates a fitness value for each particle based on its
position in the search space. The fitness function assesses how well the particle's position meets the
problem's requirements or objectives.
3. Update particle's best position: Each particle remembers its own best position that it has encountered
so far. This is the position where the particle achieved the highest fitness value. This information is
stored as the particle's personal best or local best position.
4. Update swarm's best position: The swarm keeps track of the particle with the best fitness value
among all the particles. This particle's position is considered the global best position for the swarm.
The swarm's best position is updated whenever a particle achieves a better fitness value than the
current global best.
5. Update velocity: The velocity of each particle is updated based on its current velocity, its distance
from its own best position, and its distance from the swarm's best position. The velocity update is
influenced by two factors: cognitive component and social component. The cognitive component
guides the particle towards its personal best position, while the social component attracts the particle
towards the swarm's best position. These components are weighted and combined to update the
particle's velocity.
6. Update position: Once the velocity is updated, the position of each particle is updated accordingly.
The particle's position is adjusted by adding the updated velocity vector to its current position. This
update allows the particle to move towards more promising regions in the search space.
7. Repeat: Steps 2 through 6 are repeated iteratively until a stopping criterion is met. This can be
reaching a maximum number of iterations, finding a solution that meets a predefined threshold of
fitness, or detecting a convergence condition. The iterative process allows the particles to explore
and exploit the search space, gradually converging towards better solutions.
The Particle Swarm Optimization algorithm can be used to solve a wide range of optimization problems, such as finding the optimal solution to a mathematical equation, designing optimal engineering structures, and optimizing financial portfolios. One of the advantages of PSO is its ability to find the global optimum in a multi-modal search space [20].
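The velocity and position updates of steps 5 and 6 can be condensed into a short sketch. The version below is a minimal global-best PSO with commonly used (assumed) inertia and acceleration coefficients; it is an illustrative outline, not the exact configuration used in this paper.

```python
import numpy as np

def pso(cost, lb, ub, swarm_size=20, iterations=300,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    x = rng.uniform(lb, ub, size=(swarm_size, lb.size))    # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    g = np.argmin(pbest_cost)
    gbest, gbest_cost = pbest[g].copy(), pbest_cost[g]

    for _ in range(iterations):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Inertia + cognitive pull (own best) + social pull (swarm best).
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lb, ub)
        c = np.array([cost(p) for p in x])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        g = np.argmin(pbest_cost)
        if pbest_cost[g] < gbest_cost:
            gbest, gbest_cost = pbest[g].copy(), pbest_cost[g]

    return gbest, gbest_cost
```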
The Firefly Algorithm is a nature-inspired optimization algorithm developed by Xin-She Yang in 2008 [21]; it is based on the flashing patterns and attraction behavior of fireflies. In the Firefly Algorithm, each firefly represents a potential solution to an optimization problem, and the objective is to find the optimal solution by simulating the flashing and attraction behavior of fireflies. The algorithm is particularly effective for optimization problems that involve multiple local optima.
The steps of the firefly algorithm, outlined in Figure 5.2, are as follows [22]:
1. Initialization: The algorithm starts by creating a swarm of fireflies, typically represented as points in
a search space. Each firefly is assigned a random position within the search space and a fitness value
that represents how well it solves the problem being optimized. This fitness value is initially
calculated based on the current position of the firefly.
2. Evaluation: The fitness of each firefly is evaluated using a fitness function that quantifies the quality
of its solution. The fitness function measures how well the firefly's position satisfies the problem's
requirements or objectives. It assigns a score or value to each firefly based on its performance,
indicating its fitness level.
3. Attraction: The attractiveness of a firefly is determined by its brightness, which is proportional to its
fitness value. Brighter fireflies are more attractive to other fireflies in the swarm. The attractiveness
between two fireflies is influenced by their distance and the brightness of the firefly. Fireflies that
are closer together are more likely to be attracted to each other, and brighter fireflies have a stronger
pull on others.
4. Movement: Each firefly moves towards the most attractive firefly in its vicinity. The movement is
guided by the attractiveness value, which takes into account the brightness and distance between
fireflies. The fireflies adjust their positions to get closer to the more attractive fireflies, mimicking
the behavior of fireflies in nature. Additionally, fireflies may also adjust their brightness, aiming to
increase their attractiveness to other fireflies.
5. Repeat: Steps 2 to 4 are repeated iteratively until a stopping criterion is met. This could be reaching
a maximum number of iterations, finding a solution that meets a predetermined threshold of fitness,
or when the algorithm has converged to a satisfactory solution. By continuously evaluating the
fitness, updating the attractiveness, and adjusting the positions and brightness of the fireflies, the
swarm collectively explores the search space in search of better solutions.
The Firefly Algorithm is highly effective in solving a wide range of optimization problems, such as finding
the optimal solution to a mathematical equation, optimizing engineering designs[23], and improving financial
portfolios. One of the advantages of the Firefly Algorithm is its ability to find the global optimum solution in
a multi-modal search space[24].
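A minimal sketch of this scheme is given below, using the standard attractiveness model $\beta = \beta_0 e^{-\gamma r^2}$ and a random step. The coefficient values are assumptions for illustration, not the settings used in this work, and `cost` again stands for the IAE evaluation of a candidate parameter set.

```python
import numpy as np

def firefly(cost, lb, ub, n_fireflies=20, iterations=300,
            beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Minimal firefly algorithm: dimmer fireflies move toward brighter
    ones (lower cost = brighter), with distance-dependent attractiveness."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    x = rng.uniform(lb, ub, size=(n_fireflies, dim))
    intensity = np.array([cost(p) for p in x])   # lower cost = brighter

    for _ in range(iterations):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if intensity[j] < intensity[i]:          # j outshines i
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # attractiveness
                    step = alpha * (rng.random(dim) - 0.5) * (ub - lb)
                    x[i] = np.clip(x[i] + beta * (x[j] - x[i]) + step, lb, ub)
                    intensity[i] = cost(x[i])

    best = np.argmin(intensity)
    return x[best], intensity[best]
```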
6. STATISTICAL ANALYSIS
In this paper, all three optimization algorithms were initialized with 5 sets of solutions randomly distributed over the search space, with lower and upper bounds of [0.0001, 0.0001, 0.0001, 0.0001, 0.0001, 0.0001] and [1.5, 1.5, 1.5, 1.5, 1.5, 1.5], respectively, for La, Ra, Kt, J, fo, and Kb. The integral absolute error is used as the cost function to be minimized by the three algorithms. The Simulink model used to evaluate each candidate solution is identical for all three algorithms. The algorithm parameters were defined, after which the simulations were performed.
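As a sketch of this common setup, the bounds and the random initial population of five candidate solutions can be written as below; `iae_cost_of_params`, which would run the Simulink (or equivalent) model for a candidate parameter set and return its IAE, is a hypothetical placeholder, and `pso` refers to the illustrative routine sketched earlier.

```python
import numpy as np

lb = np.array([0.0001] * 6)   # lower bounds for La, Ra, Kt, J, fo, Kb
ub = np.array([1.5] * 6)      # upper bounds

rng = np.random.default_rng(0)
initial_population = rng.uniform(lb, ub, size=(5, 6))  # 5 random solutions

# Each optimizer then minimizes the same cost, e.g.:
# best_params, best_cost = pso(iae_cost_of_params, lb, ub, swarm_size=5)
```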
The evolution of the best cost over the iterations for the three optimization algorithms is plotted in Figure 6.1.
Fig. 6.1. Evolution of the cost value over iterations for (a) the genetic algorithm, (b) PSO, and (c) the firefly algorithm
From Figure 6.1(b) and (c), PSO and the firefly algorithm converge to similar solutions, whereas GA performed the worst in terms of both best cost and time. PSO also evolves the best cost faster: as seen in the figure, PSO reached the lowest cost value at around the 270th iteration, while the firefly algorithm and GA were slower. PSO is known for its fast convergence, owing to its ability to efficiently explore the search space and converge towards the optimal solution, whereas the firefly algorithm may require more iterations to converge, especially for complex problems. The genetic algorithm, in turn, relies on genetic operators such as selection, crossover, mutation, elitism, and replacement, which make it computationally more complex and slower. Table 6.1 provides a comparison of the algorithms based on the global best cost, the estimated DC motor parameter values, and the gain and phase margins obtained by frequency-response estimation of the three predicted models.
From Figure 6.2 it can be concluded that, although all four DC servo motor models produce the same time-domain response, they do not share the same frequency response. Comparing the gain and phase margins shows that all of the models are closed-loop stable. Table 6.1 compares the algorithms in terms of the best cost obtained, the estimated DC motor parameter values, and the frequency-response gain margins of the three predicted models along with the actual system.
7. CONCLUSION
In this paper, the genetic algorithm, particle swarm optimization, and the firefly algorithm were independently implemented to predict the parameters of a simple DC servo motor, taking sensor quantization into account, by minimizing the integral absolute error between the actual system and the predicted system for every solution of an iteration. It can be concluded that estimating all of the parameters simultaneously yields values that differ considerably from those of the actual system: although the predicted models reproduce the response to the test input satisfactorily, the frequency responses of the systems predicted by the three algorithms differ. The predicted systems are closed-loop stable, as shown by the Bode plots. PSO was faster than the firefly algorithm in finding the minimum, whereas GA yielded a higher cost value and was slower because of its genetic operators. This approach is therefore suited to controller design or system identification rather than parameter measurement, since the predicted parameters differ even though the time responses are the same. Further research is needed to find the actual parameters by constraining the number of free variables to no more than two or so.
Despite the differences among the algorithms, all three delivered closed-loop stable responses, as evidenced by the Bode plots in Figure 6.2 and Table 6.1. The particle swarm optimization (PSO) algorithm was faster than the firefly algorithm in finding the minimum, whereas the genetic algorithm (GA) had the highest cost function value and was slower due to the genetic operators involved in its implementation [25]. The conclusion drawn from this is that the approach of predicting the parameters of the DC servo motor using these algorithms is more suitable for controller design and simulation purposes than for parameter estimation, because the predicted parameters deviate from those of the actual DC servo motor even though the time responses of the predicted models are very similar. The authors recommend that further research be conducted to determine the actual motor parameters with these optimization algorithms by restricting the number of estimated parameters to no more than two, or by a comparable limitation, so that each estimated parameter converges to its actual value when fewer parameters are estimated and the frequency responses therefore become indistinguishable.
REFERENCES
[1] Decy Nataliana,R Syafruddin, Givy Devira Ramady, Yakob Liklikwatil and Andrew Ghea Mahardika et al 2019 J. Phys.: Conf.
Ser. 1424 012040. DOI 10.1088/1742-6596/1424/1/012040
[2] X. Mu, F. Cai, R. Zheng, D. Zhang and D. Gu,"A Predictive Current Control for Aerospace Servo Motor," 2021 3rd International
Conference on Applied Machine Learning (ICAML), Changsha, China, 2021, pp. 366-369,
doi:10.1109/ICAML54311.2021.00084.
[3] L. Ljung, System Identification: Theory for the User, Prentice Hall, 1987.
[4] J. Miroslav, O. Lucie, J. Boril and J. Rudolf, "Parameter identification for pilot behavior model using the MATLAB system identification toolbox," in International Conference on Military Technologies, Brno, 2017. DOI: 10.1109/MILTECHS.2017.7988824
[5] R. Sandeep, Y. Shih-Yu and T. Tsu-Chin, "Wind Turbine System Identification and Individual Pitch Control," in American Control Conference, Seattle, 2017. DOI: 10.23919/ACC.2017.7962144
[6] R. Isermann and M. Münchhof, Identification of Dynamic Systems: An Introduction with Applications, Springer, 2012. DOI: 10.1007/978-3-540-78879-9
[7] J. Wang and A. Boukerche, "Non-parametric models with optimized training strategy for vehicles traffic flow prediction," Computer Networks, vol. 187, 107791, 2021, ISSN 1389-1286. DOI: 10.1016/j.comnet.2020.107791
[8] A. M. Humada, S. Y. Darweesh, K. G. Mohammed, M. Kamil, S. F. Mohammed, N. K. Kasim, T. A. Tahseen, O. I. Awad and S. Mekhilef, "Modeling of PV system and parameter extraction based on experimental data: Review and investigation," Solar Energy, vol. 199, pp. 742-760, 2020, ISSN 0038-092X. DOI: 10.1016/j.solener.2020.02.068
[9] Altbawi, S.M.A.; Khalid, S.B.A.; Mokhtar, A.S.B.; Shareef, H.; Husain, N.; Yahya, A.; Haider, S.A.; Moin, L.; Alsisi, R.H. An
Improved Gradient-Based Optimization Algorithm for Solving Complex Optimization Problems. Processes 2023, 11, 498.
DOI:10.3390/pr11020498
[10] A. Kumar, G. Wu, M. Z. Ali, R. Mallipeddi, P. N. Suganthan and S. Das, "A test-suite of non-convex constrained optimization problems from the real-world and some baseline results," Swarm and Evolutionary Computation, vol. 56, 100693, 2020, ISSN 2210-6502. DOI: 10.1016/j.swevo.2020.100693
[11] M. S. Fakhar et al., "Conventional and Metaheuristic Optimization Algorithms for Solving Short Term Hydrothermal Scheduling
Problem: A Review," in IEEE Access, vol. 9, pp. 25993-26025, 2021, doi: 10.1109/ACCESS.2021.3055292.
[12] Dipraj, Dr AK. "Speed Control of DC Servo Motor By Fuzzy Controller." International Journal of Scientific & Technology
Research 1.8 (2012).
[13] S. S. Sami et al., "Detailed modelling and simulation of different DC motor types for research and educational purposes," International Journal of Power Electronics and Drive Systems, vol. 12, no. 2, pp. 703-714, 2021. DOI: 10.11591/ijpeds.v12.i2.pp703-714
[14] https://www.engr.siu.edu/staff/spezia/Web438A/Lecture%20Notes/lesson14et438a.pdf
[15] V. Veerasamy et al., "A Hankel Matrix Based Reduced Order Model for Stability Analysis of Hybrid Power System Using PSO-
GSA Optimized Cascade PI-PD Controller for Automatic Load Frequency Control," in IEEE Access, vol. 8, pp. 71422-71446,
2020, doi: 10.1109/ACCESS.2020.2987387.
[16] C. Sankar Rao, S. Santosh and Dhanya Ram V., "Tuning optimal PID controllers for open loop unstable first order plus time delay systems by minimizing ITAE criterion." DOI: 10.1016/j.ifacol.2020.06.021
[17] M. A. Alawan and O. J. M. Al-Furaiji, "Numerous speeds loads controller for DC shunt motor based on PID controller with online parameters tuning supported by genetic algorithm," Indonesian Journal of Electrical Engineering and Computer Science, vol. 21, no. 1, pp. 64-73, 2021. DOI: 10.11591/ijeecs.v21.i1.pp64-73
[18] Z. X. Loke, S. L. Goh, G. Kendall, S. Abdullah and N. R. Sabar, "Portfolio Optimization Problem: A Taxonomic Review of
Solution Methodologies," in IEEE Access, vol. 11, pp. 33100-33120, 2023, doi: 10.1109/ACCESS.2023.3263198.
[19] S. ur Rehman, S. Asghar and S. Fong, "Optimized and Frequent Subgraphs: How Are They Related?," IEEE Access, pp. 1-1, 2018. DOI: 10.1109/ACCESS.2018.2846604
[20] Gad, A.G. Particle Swarm Optimization Algorithm and Its Applications: A Systematic Review. Arch Computat Methods Eng 29,
2531–2561 (2022). https://doi.org/10.1007/s11831-021-09694-4
[21] J. Wu, Y.-G. Wang, K. Burrage, Y.-C. Tian, B. Lawson and Z. Ding, "An improved firefly algorithm for global continuous optimization problems," Expert Systems with Applications, vol. 149, 113340, 2020, ISSN 0957-4174. DOI: 10.1016/j.eswa.2020.113340
[22] F. Wahid et al., "An Enhanced Firefly Algorithm Using Pattern Search for Solving Optimization Problems," in IEEE Access, vol.
8, pp. 148264-148288, 2020, doi: 10.1109/ACCESS.2020.3015206.
[23] Bacanin, N.; Tuba, M. Firefly Algorithm for Cardinality Constrained Mean-Variance Portfolio Optimization Problem with
Entropy Diversity Constraint. Sci. World J. Spec. Issue Comput. Intell. Metaheuristic Algorithms Appl. 2014, 2014,
721521.DOI:10.1155/2014/721521
[24] Bacanin, N.; Stoean, R.; Zivkovic, M.; Petrovic, A.; Rashid, T.A.; Bezdan, T. Performance of a Novel Chaotic Firefly Algorithm
with Enhanced Exploration for Tackling Global Optimization Problems: Application for Dropout Regularization. Mathematics
2021, 9, 2705. https://doi.org/10.3390/math9212705
[25] D. Saha and S. Sur-Kolay, "Guided GA-Based Multiobjective Optimization of Placement and Assignment of TSVs in 3-D ICs," IEEE Transactions on Very Large Scale Integration (VLSI) Systems, vol. 27, no. 8, August 2019. DOI: 10.1109/TVLSI.2019.2908087
Sriram Swain
(https://orcid.org/0009-0001-8923-8773)
(https://scholar.google.com/citations?hl=en&authuser=1&user=RKWmYpcAAAAJ)
(https://www.webofscience.com/wos/author/record/ITT-0015-2023)
is currently pursuing a B.Tech. degree in Electrical and Electronics Engineering at International
Institute of Information Technology, Bhubaneswar, Odisha, India (Batch 2020-2024). He is a
versatile professional with a deep passion for DSP, control systems, electronics, and aerospace
engineering. With a strong background in electrical engineering, he has conducted extensive
research and undertaken practical projects in signal processing, focusing on areas such as
signal filtering, noise reduction, and spectral analysis. He possesses a keen understanding of
control systems, having designed, implemented, and optimized PID controllers and control
strategies for various applications.
He can be contacted by email: [email protected]
Anubhav Gaur, a dedicated B. Tech student in Electrical and Electronics Engineering from
International Institute of Information Technology, Bhubaneswar, Odisha, India (Batch 2020-
2024), possesses a notable role as a core member of TARS, a prominent society within the
institute. In this capacity, Anubhav actively contributes to the organization's activities and
initiatives, showcasing his leadership abilities and unwavering commitment to fostering a
thriving community. Simultaneously, he engages in a rewarding internship at Ciena in
Gurugram, gaining valuable industry experience. Anubhav's technical acumen extends across
diverse domains, including DSP control systems, C/C++ programming, web development,
software engineering, and embedded systems. Furthermore, his achievements include
authoring a noteworthy conference paper on particle swarm optimization and genetic
algorithms for PID controller tuning. Anubhav's multifaceted involvement exemplifies his
passion for technology and his determination to excel both academically and professionally.
He can be contacted by email: [email protected]
Suraj Chouhan
(https://orcid.org/0009-0007-5898-4435)
(https://scholar.google.com/citations?
hl=en&user=nc5em9oAAAAJ&scilu=&scisig=ACseELIAAAAAZIqKwpwmGQpbsMvJb-
b3fPDyIrk&gmla=AHoSzlWShINJgjPKWGRju3k_OIyu_Uw7meI2zqg3_7kCDOXYhiFsUb
C8AIh__fpJTN46kuhdhldFsDSfJ7_wn1CjfhSSKLcigcj7k8B-
jCU&sciund=15455261892091481985)
(https://www.webofscience.com/wos/author/record/ITT-0332-2023)
is currently pursuing a B.Tech. degree in Electrical and Electronics Engineering at International
Institute of Information Technology, Bhubaneswar, Odisha, India (Batch 2020-2024). With a
strong passion for electrical engineering, Suraj has acquired extensive knowledge in various
subjects including Control System Engineering, Signal & System, Digital Signal Processing,
Electrical Machines, Digital Electronics, Analog and Power Electronics, and 8085
Microprocessor. His commitment to learning and his enthusiasm for the field make him a
promising engineer in the making. He can be contacted at email:
[email protected].