Chat Bot
By
I hereby declare that the work presented in this report entitled “Android Chatbot”, in
partial fulfilment of the requirements for the award of the degree of Bachelor of
Technology in Computer Science and Engineering/Information Technology, submitted in
the Department of Computer Science & Engineering and Information Technology, Jaypee
University of Information Technology, Waknaghat, is an authentic record of my own work
carried out over the period from February 2023 to May 2023 under the supervision of
Dr. Nishant Sharma, Assistant Professor (Grade-2), Department of Computer Science and
Engineering. I also certify that I have carried out the above-mentioned project work under
the proficiency stream Data Science.
The matter embodied in the report has not been submitted for the award of any other
degree or diploma.
Arpit Sood (191513)
This is to certify that the above statement made by the candidate is true to the best of my
knowledge.
(Supervisor Signature)
Dr. Nishant Sharma
Assistant Professor (Grade-2)
Department of Computer Science and Engineering
Dated:
PLAGIARISM CERTIFICATE
ACKNOWLEDGMENT
Firstly, I express my heartiest thanks and gratitude to almighty God, whose divine
blessing made it possible to complete this project work successfully.

I would also like to thank all those individuals who have helped me, directly or
indirectly, in making this project a success. In this context, I want to thank the various
staff members, both teaching and non-teaching, who extended their timely help and
facilitated my undertaking.

Finally, I must acknowledge with due respect the constant support and patience of my
parents.
Arpit Sood
TABLE OF CONTENTS

1  Candidate’s Declaration
2  Plagiarism Certificate
3  Acknowledgement
4  Table of Contents
6  List of Figures
7  List of Tables
8  Abstract
14 References
List of Figures

Fig. No.   Title
1.1        Incremental Model
4.3        Conversion flow
List of Tables
Abstract
Today, users face many problems when booking hotels through an Android application:
in most cases the user either receives far more results than expected or receives results
that do not suit their preferences. This report focuses on automating the process of
communication through a chatbot and on providing customised results to the user, which
makes the process of hotel booking convenient and user friendly. Extensive research on
existing systems gave us an insight into their shortcomings, which this system attempts to
overcome by creating a chatbot using Artificial Intelligence Markup Language (AIML) and
algorithms such as keyword matching, string similarity, spell checking and natural
language parsing. The implementation of this system has resulted in better resource
utilisation and increased responsiveness to user behaviour. The system has been designed
to integrate with any hotel management Android application to ease the process of hotel
booking.
CHAPTER 01
1.1 INTRODUCTION
database. When a user submits a message to a chatbot programme, the programme
formulates a response according to the AIML pattern that corresponds to the message and
sends it back to the user. Under the terms of the GNU General Public Licence, it may be
installed directly on a local server. Internet-based chatbots can be programmed to respond
to text, speech and emotional cues. For this project, user input was taken in the form of
text and voice. Text input and output is comparatively efficient because the user can check
the input for mistakes; text entry, however, takes time. The introduction of a voice
interface with speech recognition technology is the answer. Together, these techniques
greatly enhance the chatbot application's ability to communicate with the user.
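The AIML rule base itself is not reproduced in this report. Purely as a rough illustration of
the pattern-to-response idea described above, the following standalone Java sketch
matches a normalised user message against a small hand-written rule table; the patterns,
replies and class name are invented for illustration only.

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Scanner;

// A toy AIML-style responder: each entry maps a wildcard pattern to a canned reply.
public class PatternResponder {

    // Hypothetical patterns; a real AIML brain would hold thousands of <category> rules.
    private static final Map<String, String> RULES = new LinkedHashMap<>();
    static {
        RULES.put("HELLO *", "Hello! How can I help you with your booking?");
        RULES.put("* BOOK * ROOM *", "Sure, which city would you like to stay in?");
        RULES.put("* PRICE *", "Prices depend on the hotel and date. Which hotel are you interested in?");
    }

    // Normalise the input the way AIML interpreters do: upper-case, strip punctuation.
    static String normalise(String input) {
        return input.toUpperCase().replaceAll("[^A-Z0-9 ]", "").trim();
    }

    // Convert an AIML-like pattern ("*" wildcard) into a regular expression and test it.
    static boolean matches(String pattern, String message) {
        String regex = pattern.replace("*", ".*");
        return (" " + message + " ").matches("\\s*" + regex + "\\s*");
    }

    static String reply(String message) {
        String normalised = normalise(message);
        for (Map.Entry<String, String> rule : RULES.entrySet()) {
            if (matches(rule.getKey(), normalised)) {
                return rule.getValue();
            }
        }
        return "Sorry, I didn't quite get that, can you repeat that please?"; // fallback reply
    }

    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        System.out.println("Bot: " + reply(in.nextLine()));
    }
}

A real AIML interpreter would load its categories from XML files rather than a hard-coded
map, but the matching principle is the same.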
The way people engage with one another and with enterprises has been completely
transformed by digitisation and the advent of mobile and internet-connected devices
(Eeuwen, M.V., 2017). As technology companies integrate artificial intelligence (AI) into
the products they sell, such as Google Assistant, Google Home and Amazon Alexa,
millennials are adopting new technologies into their everyday routines. Businesses
anticipate that the current and forthcoming generations will be crucial, game-changing
clients. They demand smooth interactions, responses in a matter of seconds rather than
minutes, and more intelligent self-service alternatives (Teller Vision, 2017). Among the
first businesses to use the technology was the banking and financial services sector. This
integration has expanded significantly and helps banks connect with a larger consumer
base.
to develop a chatbot that can process subsequent queries in light of earlier ones. This
feature will enhance the chatbot's capacity to process input in context. Approach: the
chatbot was redesigned using a relational database model in an effort to extend the
conventional chatbot processing mechanism.
1.3 Objectives:
• Understanding the user's needs and responding with the
appropriate information is a chatbot's main goal.
• Chatbots allow businesses to engage in personal client interaction
without needing to hire human staff.
• They work as online assistants who support curriculum changes,
paper grading, student and graduate data retrieval, and
management of the admissions process.
1.4 Methodology:
To guarantee that a realistic time period is created for each step of the
project and that requirements are properly specified, choosing a suitable
methodology is crucial to the overall development of any software
programme. The creation and design of this programme will take into
account a variety of development approaches. The development
methodology that is most suited for this project will be highlighted in this
section.
1.4.1 Waterfall Model:

When you first learn about software development, you are typically introduced to this
fairly traditional methodology. The waterfall model is a highly predictable method for
developing software made up of five phases: requirements gathering, analysis, design,
implementation and testing. The stages are finished one after another. The fundamental
drawback of the waterfall approach is that, because the project is broken up into phases,
it is quite rigid: to adhere to the overall project schedule, each phase has a deadline by
which it must produce an output. Project deliverables, design documentation and test
plans are used to gauge the success and progress of a project. The approach is challenging
because each project phase and its goals are specified early in the project life cycle.
1.4.2 Incremental Model:
Bugs are found early in short, controllable cycles, making testing and
troubleshooting simpler. As new requirements are easily incorporated into
each build and an updated version is produced, this process is adaptable
during implementation (Khan et al., 2011).
stages of development, and since each iterative build makes it simpler to implement new
requirements throughout the development process, the incremental model's flexibility
makes it the best choice for this project.

Figure 1.1 Incremental Model
• ELIZA speaks in a language she does not comprehend; it only offers outcomes that
follow pre-established rules.
• ALICE can only retrieve information from its database because it lacks the ability to
learn.
• Natasha is solely used to converse with users and provide them with information that
is available online. Making hotel reservations is not an objective of that system.
This system interacts with users using a chat application that offers
insightful comments and guidance for obtaining the data required to
reserve hotel rooms. The user receives precise output from the system,
even in the case of small spelling mistakes. Additionally, parsing prevents
transmitting words to the system that don't fit into patterns. Frequent
travellers can maximise the system's potential.
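The report lists a spell checker and a string-similarity measure among its algorithms but
does not reproduce their code. The following minimal Java sketch shows how small
spelling mistakes could be tolerated by matching a user token against known keywords
with a Levenshtein edit-distance threshold; the keyword list and the threshold of two edits
are illustrative assumptions.

import java.util.Arrays;
import java.util.List;

// Matches a possibly misspelt user token against known keywords using Levenshtein distance.
public class FuzzyKeywordMatcher {

    // Classic dynamic-programming edit distance between two strings.
    static int levenshtein(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                                   d[i - 1][j - 1] + cost);
            }
        }
        return d[a.length()][b.length()];
    }

    // Returns the closest keyword, or null if nothing is within the allowed number of typos.
    static String closestKeyword(String token, List<String> keywords, int maxTypos) {
        String best = null;
        int bestDistance = maxTypos + 1;
        for (String keyword : keywords) {
            int distance = levenshtein(token.toLowerCase(), keyword.toLowerCase());
            if (distance < bestDistance) {
                bestDistance = distance;
                best = keyword;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Illustrative domain keywords; the real system's vocabulary is not reproduced in the report.
        List<String> keywords = Arrays.asList("book", "hotel", "room", "price", "cancel");
        System.out.println(closestKeyword("hotell", keywords, 2)); // prints "hotel"
    }
}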
When reserving a hotel room in the same city or when the user forgets to
specify where they want the accommodation, this location is helpful.
• Enable users who are not logged in to register and have their data stored in the
database.
• Once the user's credentials have been saved to the database, a QR code for Google's
two-factor authentication will be displayed, and a special code will be generated and
delivered to the user's mobile device (a minimal sketch of this one-time-code check
appears after this list).
• Users must be able to examine information about the accounts they have
access to, such as savings, loans, and checking accounts.
• The chatbot will send users a transaction statement via email so they may view their
transactions.
• Users will be able to communicate with the chatbot via voice or text instructions, and
the chatbot will be able to understand them thanks to the natural language understanding
provided by the Dialogflow API integration.
• The chatbot should be able to retain conversational state where the context from earlier
messages and chats might otherwise be unclear.
• Give accurate feedback and responses for the given input.
• If feasible, handle unexpected inputs effectively and alert the user in an appropriate
way.
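The QR-code requirement above relies on Google's two-factor authentication. As a generic
illustration only, and not the project's actual implementation, the following self-contained
Java sketch verifies a time-based one-time code in the style of RFC 6238, which is what
authenticator apps generate after scanning such a QR code.

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.ByteBuffer;

public class TotpVerifier {

    // Generates a 6-digit TOTP code for the given secret and 30-second time step.
    static String generateCode(byte[] secret, long timeStepIndex) throws Exception {
        byte[] counter = ByteBuffer.allocate(8).putLong(timeStepIndex).array();
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secret, "HmacSHA1"));
        byte[] hash = mac.doFinal(counter);
        int offset = hash[hash.length - 1] & 0x0f;          // dynamic truncation (RFC 4226)
        int binary = ((hash[offset] & 0x7f) << 24)
                   | ((hash[offset + 1] & 0xff) << 16)
                   | ((hash[offset + 2] & 0xff) << 8)
                   |  (hash[offset + 3] & 0xff);
        return String.format("%06d", binary % 1_000_000);
    }

    // Checks a user-supplied code against the current 30-second window.
    static boolean verify(byte[] secret, String userCode) throws Exception {
        long step = System.currentTimeMillis() / 1000L / 30L;
        return generateCode(secret, step).equals(userCode);
    }

    public static void main(String[] args) throws Exception {
        byte[] demoSecret = "12345678901234567890".getBytes(); // RFC 6238 test secret
        System.out.println("Current code: "
                + generateCode(demoSecret, System.currentTimeMillis() / 1000L / 30L));
    }
}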
• Tramp
• Laravel Homestead
• Nginx server
• MySQL database
• PHP 7.1
• JavaScript
• Oracle VM VirtualBox
• Dialogflow
• Laravel 5.6
• Sublime Text 2
• Ngrok
A collection of project milestones and deliverables was developed to show how the
project is broken down, so that a management plan could be created and followed
throughout the project life cycle. Minor adjustments could still happen in each iteration,
because some requirements might need to be modified. To ensure that all project
deliverables are met, the steps required in each iteration are specified. A risk management
strategy and a project schedule were also prepared, with the project tasks for each
semester displayed on a Gantt chart.
Figure 1.2 Gantt Chart
CHAPTER 02
primarily influenced by "service quality, web design and content, security, privacy, speed,
and convenience" (Ling et al., 2016). This indicates that technology is not being used
sufficiently to raise client satisfaction. A chatbot could be integrated into the online
banking experience to offer quick, convenient and customised services.
The following figure illustrates the system architecture. The system operates in both text
and voice modes. The first mode is used when the user enters data in text format: the
middleware API receives the user input and returns a response. The second mode is
engaged when the user speaks. In this voice mode, the speech is first transformed to text
and the text is then submitted to the middleware API.
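As a hedged sketch of the voice mode's speech-to-text step, the following Android activity
uses the platform's RecognizerIntent to capture an utterance and hands the best
transcription to a placeholder sendToMiddleware() method; the placeholder name and
the way the text is forwarded are assumptions, not the report's actual code.

import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.widget.Toast;
import androidx.appcompat.app.AppCompatActivity;
import java.util.ArrayList;
import java.util.Locale;

// Captures a voice utterance, converts it to text and hands it to the same
// send path used for typed messages.
public class VoiceInputActivity extends AppCompatActivity {

    private static final int SPEECH_REQUEST_CODE = 101;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        startVoiceInput();
    }

    private void startVoiceInput() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, Locale.getDefault().toLanguageTag());
        // startActivityForResult is deprecated on newer APIs but kept here for brevity.
        startActivityForResult(intent, SPEECH_REQUEST_CODE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == SPEECH_REQUEST_CODE && resultCode == RESULT_OK && data != null) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()) {
                sendToMiddleware(results.get(0)); // best transcription goes to the chatbot API
            }
        }
    }

    // Placeholder: in the real application this would post the text to the middleware API.
    private void sendToMiddleware(String recognisedText) {
        Toast.makeText(this, "Heard: " + recognisedText, Toast.LENGTH_SHORT).show();
    }
}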
CHAPTER 03
SYSTEM DEVELOPMENT
Design
Figure 3.1
Figure 3.2
Figure 3.3
From the web client and the Google Assistant app for Android devices, users can
communicate with the chatbot using natural language text or voice phrases. The Google
Assistant integration makes the chatbot simpler to use and interact with, because voice
requests and rich responses such as graphics and cards require less typing and effort. The
chatbot can also easily be extended to Google Home thanks to the Google Assistant
integration. The client side of the application is created with Laravel Blade, HTML5, CSS
and JavaScript. The Laravel framework has a templating engine that adheres to the MVC
architectural design pattern; Blade compiles the templates and caches them.
Additionally, the Laravel framework is used to implement a web client and a webhook
that receive JSON data from the NLU via HTTPS POST requests. The webhook is created
as a controller class, with an associated web route through which it receives the payload
from the NLU in real time. When the NLU determines that a user has activated an intent
from the web client, a Google Assistant device or a Google Home device, the webhook
receives data from the NLU.
An API follows a RESTful design approach and requests data when it needs it, whereas a
webhook accepts data pushed to it from another service whenever an update or event
occurs (elastic.io, 2018).
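The project's own webhook is a Laravel controller. Purely as a language-neutral
illustration of the push-style contract just described, the following Java sketch uses the
JDK's built-in HTTP server to accept a POSTed payload and answer with a small JSON
body; the port, path and reply text are assumptions.

import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// A push-style endpoint: the NLU POSTs a payload here whenever an intent fires,
// rather than this service polling a REST API for data.
public class WebhookServer {

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/webhook", exchange -> {
            // Read the JSON body pushed by the NLU (contents not parsed in this sketch).
            try (InputStream body = exchange.getRequestBody()) {
                String payload = new String(body.readAllBytes(), StandardCharsets.UTF_8);
                System.out.println("Received payload: " + payload);
            }

            // Reply immediately; a real fulfilment webhook would build its JSON response here.
            byte[] reply = "{\"fulfillmentText\":\"OK\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, reply.length);
            exchange.getResponseBody().write(reply);
            exchange.close();
        });

        server.start();
        System.out.println("Listening for webhook events on http://localhost:8080/webhook");
    }
}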
The chatbot can be trained to recognise entities and intents in user utterances using
Dialogflow's natural language understanding (NLU) engine. Intent mapping on the
Dialogflow console is used in the design of the chatbot to direct user utterances to a set of
training phrases.
The diagram shows the data flow when a user invokes an intent. Because Dialogflow has
been trained to recognise a set of common user expressions, it can determine whether an
intent has been activated; this data is then passed to the fulfilment webhook. After parsing
and validating the JSON in the received payload, the webhook extracts the required
entities, parameters and action, generates a custom response and returns it to the user,
either visually or verbally, as an encoded JSON response for text and voice interaction.
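As an illustration of this parse-and-respond step, the following hedged Java sketch (using
the org.json classes that ship with Android) reads the intent name and parameters from a
Dialogflow-style payload and builds a fulfillmentText reply. The field names follow
Dialogflow's v2 webhook format, but the intent and parameter names are invented for
illustration.

import org.json.JSONObject;

// Turns the JSON pushed by the NLU into a fulfilment reply.
public class FulfilmentBuilder {

    static String buildReply(String requestJson) {
        JSONObject queryResult = new JSONObject(requestJson).getJSONObject("queryResult");
        String intentName = queryResult.getJSONObject("intent").getString("displayName");
        JSONObject parameters = queryResult.getJSONObject("parameters");

        String text;
        if ("check_balance".equals(intentName)) {                     // hypothetical intent name
            String account = parameters.optString("account", "current");
            text = "Your " + account + " account balance is being fetched.";
        } else {
            text = "Sorry, I didn't quite get that, can you repeat that please?";
        }

        // The webhook answers with a JSON body containing the text to speak or display.
        return new JSONObject().put("fulfillmentText", text).toString();
    }

    public static void main(String[] args) {
        String sample = "{\"queryResult\":{\"intent\":{\"displayName\":\"check_balance\"},"
                + "\"parameters\":{\"account\":\"savings\"}}}";
        System.out.println(buildReply(sample));
    }
}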
The figure below depicts the database model for the proposed chatbot and defines the
multiplicities. The users table contains details about application users, including whether
or not they have Google two-factor authentication enabled. The appointments table
contains information on a user's appointments, including a 'topic' field that holds details
provided by the user about the purpose of the meeting; whether the user already has an
appointment is indicated by the boolean 'booked' field. Data relating to a user's bank
accounts are contained in the Accounts and Transactions tables; for instance, a user may
have several different types of accounts with a single bank, such as checking and savings.
Figure 3.6 UML
applications use the HTML5 Speech Recognition API for speech recognition and
communication. Dialogflow then transforms the recognised voice input into JSON objects
that can be read as input. A match score, sometimes referred to as the confidence score, is
included in the JSON response. According to Dialogflow (2015), this rating indicates how
well the NLU engine matched the user input to the intent defined in the console; scores
range from 0 to 1, where 1 is an exact match. The JSON response is described in more
detail below.
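The report's detailed description of that JSON response is not reproduced in this extract.
As a hedged sketch of how such a match score might be consumed, the following Java
snippet (again using org.json) reads an intentDetectionConfidence value in Dialogflow's
v2 format and falls back when the score is below an assumed threshold; the field names
and the 0.6 threshold are assumptions.

import org.json.JSONObject;

// Reads the NLU's match (confidence) score and decides whether to trust the matched intent.
public class ConfidenceGate {

    static final double MIN_CONFIDENCE = 0.6;   // assumed threshold, not from the report

    static String routeResponse(String detectIntentJson) {
        JSONObject queryResult = new JSONObject(detectIntentJson).getJSONObject("queryResult");
        double confidence = queryResult.optDouble("intentDetectionConfidence", 0.0);

        if (confidence >= MIN_CONFIDENCE) {
            return queryResult.optString("fulfillmentText", "");
        }
        // Low score: fall back instead of acting on a poorly matched intent.
        return "Sorry, I'm having trouble understanding what you said, can you say it again?";
    }

    public static void main(String[] args) {
        String sample = "{\"queryResult\":{\"intentDetectionConfidence\":0.92,"
                + "\"fulfillmentText\":\"Your balance is on its way.\"}}";
        System.out.println(routeResponse(sample));
    }
}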
dealing with chatbots and how the bots react to those behaviours through defined
procedures; these interact with the chatbot's many features. The logic used by the chatbot
when a user asks for a currency conversion is shown in the activity diagram below. The
user statement is sent to the fulfilment webhook once the NLU analyses the user input and
establishes the context. The webhook accepts the NLU chatbot's JSON responses via POST
endpoints. The intended action, which was predefined in the Dialogflow console during
training, is then determined by the webhook and mapped to the user-entered statement.
Figure 3.8 UML activity diagram
The figure below depicts the data flow when a user asks for their balance. The NLU takes
the user's input into account while determining the intended action, and the user's input
is then sent to the webhook in JSON format. The webhook provides users with appropriate
custom fulfilment by extracting the action from the intent it receives and checking whether
it is one it supports; depending on the current action, it then calls the relevant function.
The getBalance() function is invoked if that action is active. This method uses REST
endpoints to access the TrueLayer API. The response, encoded in JSON format, is then
delivered to the NLU chatbot.

An authentication package is included with the Laravel framework to handle application
authentication. The procedure for logging in and registering is shown in the figure below.
Users must first register in the application in order to create an account; a user who
attempts to log in before registering will not be permitted access to the site.
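The report's getBalance() helper belongs to its Laravel code base and is not reproduced
here. The following Java sketch illustrates the same idea of calling a bank-data REST
endpoint with a bearer token and handing the raw JSON back to the webhook layer; the
URL pattern is an assumption modelled on TrueLayer's public Data API, and the
identifiers in main() are placeholders.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of a getBalance-style helper: GET the balance with a bearer token,
// return the JSON body so the webhook can encode its own reply from it.
public class BalanceClient {

    static String getBalance(String accountId, String accessToken) throws Exception {
        URL url = new URL("https://api.truelayer.com/data/v1/accounts/" + accountId + "/balance");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.setRequestProperty("Authorization", "Bearer " + accessToken);

        if (connection.getResponseCode() != 200) {
            throw new IllegalStateException("Balance request failed: " + connection.getResponseCode());
        }

        StringBuilder body = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
        }
        return body.toString(); // raw JSON; the caller decides how to present it
    }

    public static void main(String[] args) throws Exception {
        // Placeholder identifiers; real values come from the bank's OAuth flow.
        System.out.println(getBalance("demo-account-id", "demo-access-token"));
    }
}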
3.5 Sequence Diagrams:
webhook in JSON format. By interpreting the JSON it receives, the webhook decides what
to do with the intent published by the NLU. If the action corresponds to the defined bank
procedure, the webhook calls the appropriate method in the
TrueLayerClientStarlingHelper class, which in turn calls the TrueLayer Starling API and
returns data to the webhook to generate a response. When the webhook receives the
information returned by the API, it encodes it into JSON, which is returned as the
response to the NLU.
Figure 3.11
Figure 3.12
Figure 3.13
The audience that will utilise the chatbot must be carefully considered in
order to implement it successfully. Because mobile banking is important to
a wide range of age groups, the chatbot's target audience is made up of
individuals from various age groups and technological backgrounds. The
chatbot will provide simple, direct responses in order to appeal to all age
groups. Developers can choose a voice for their bot that fits its personality
and use rich answers with Google Assistant (Actions on Google, 2018).
However, as the majority of users of a site like this currently engage online
primarily through messaging services, using a platform like this should be
simple (Interactions.acm.org, 2017).
The dialogue will take place inside the chatbot using natural language, and the
conversational user interface (CUI) will be designed with this interaction in mind.
According to Actions on Google (2018), "Conversation as a method of interaction is
frequently referred to as the new user interface." A set of design guidelines for
conversational user interfaces is established in cs.hmc.edu (2018).
Table 3.1 CUI Design Principles
Table 3.2 CUI Design Patterns
The design of the dialogue is an important factor to take into account when building a
chatbot, because the interaction between the user and the chatbot occurs mostly through
natural language. To achieve this, the appropriate dialogue will be created using follow-up
and fallback intents in the Dialogflow console. There will also be a predefined intent that,
when invoked by the user, ends the conversation entirely, regardless of the circumstances.
A dialogue is built from groups of phrases and words that react to user input.
The chatbot uses follow-up intents to ask the user for further information, such as the
date, time and phone number, because it knows it needs this information to function
properly. As this chatbot is a task-based interactive chatbot with banking domain
knowledge, Dialogflow recommends designing a linear dialogue for interactions, in which
the data present in the conversation is extracted to help users achieve their goal. A user
statement may be expressed in a variety of ways; to return the response to the user in this
example, the chatbot will extract the necessary "balance" and "account" parameters
(Dialogflow, 2018).
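As a small illustration of this linear, slot-filling style of dialogue, the following
self-contained Java sketch checks which required details are still missing and returns the
next prompt to ask; the slot names and prompt texts are invented for illustration, not
taken from the project's Dialogflow configuration.

import java.util.LinkedHashMap;
import java.util.Map;

// A linear slot-filling pass: the bot asks for the next missing detail, mirroring
// the follow-up-intent behaviour described above.
public class SlotFiller {

    private static final Map<String, String> PROMPTS = new LinkedHashMap<>();
    static {
        PROMPTS.put("date", "What date would you like the appointment?");
        PROMPTS.put("time", "What time suits you?");
        PROMPTS.put("phone", "What phone number should we use to confirm?");
    }

    // Returns the next question to ask, or null once every required slot is filled.
    static String nextPrompt(Map<String, String> collected) {
        for (Map.Entry<String, String> slot : PROMPTS.entrySet()) {
            String value = collected.get(slot.getKey());
            if (value == null || value.isEmpty()) {
                return slot.getValue();
            }
        }
        return null; // all details gathered; the intent can be fulfilled
    }

    public static void main(String[] args) {
        Map<String, String> collected = new LinkedHashMap<>();
        collected.put("date", "2023-05-02");
        System.out.println(nextPrompt(collected)); // asks for the time next
    }
}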
In the design of the web-based chatbot view, different colours are used for the chatbot's
responses and for user input. Users will be able to discover excessive spending by seeing,
in a graphical format, how much they spend and what they spend their money on; for
instance, they may realise they spend too much on coffee each week. Because each message
is given a name and a colour depending on whether it was sent by the user or the bot, it is
visually obvious to the user which messages are the bot's responses.
Users can use an authenticator app on their device to scan the QR code in the image, and
they will be required to enter a special code after each login, which prevents them from
seeing the chatbot view until they have verified their account. After the QR code is
scanned, a special six-digit code is generated.

Users won't need to switch between several websites to routinely see their transactions:
after they request to examine their transactions, an email is sent to the user's registered
email address.
Implementation:
Step-by-step execution
Go to Gradle Scripts > build.gradle and add the required dependency to the dependencies
section.

Go to app > res > layout and add the following code to activity_main.xml. The code for the
activity_main.xml file is shown below. (XML)
Step 5: Create a model class to store our messages.

Go to app > java > your app's package name, right-click on it, select New > Java Class and
name it "MessageModal", then add the following code. Comments are added to make the
code easier to understand. (JAVA)
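The original MessageModal listing is not reproduced in this extract; below is a minimal
sketch of what such a message model class typically contains. The field names (message
text and sender) are assumptions.

// Holds one chat message together with who sent it.
public class MessageModal {

    private String message;   // the text shown in the chat bubble
    private String sender;    // "user" or "bot", used to pick the row layout

    public MessageModal(String message, String sender) {
        this.message = message;
        this.sender = sender;
    }

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }

    public String getSender() {
        return sender;
    }

    public void setSender(String sender) {
        this.sender = sender;
    }
}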
Step 6: Create a user message layout file.

The drawable folder contains the icons used in this file. Navigate to app > res > layout,
right-click on it, choose New > Layout Resource File, name the file "user_msg" and enter
the following code there.

Step 7: Create a bot message layout file.

The drawable folder contains the icons used in this file. In the same way, create a new
layout file, name it "bot_msg" and enter the following code.
Step 8: Working with the Adapter class.

We must create an Adapter class in order to set the data for our chat RecyclerView items.
Navigate to app > java > your app's package name, right-click it, select New > Java Class,
name the class MessageRVAdapter and add the following code to it. Comments are added
to make the code easier to understand.
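The original adapter listing is likewise not reproduced here. The following hedged sketch
shows one common way to bind the two row layouts in a RecyclerView adapter; the layout
names follow the steps above, but the view IDs are assumptions.

import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import androidx.annotation.NonNull;
import androidx.recyclerview.widget.RecyclerView;
import java.util.ArrayList;

// Binds MessageModal items to either the user_msg or bot_msg row layout depending on the sender.
public class MessageRVAdapter extends RecyclerView.Adapter<RecyclerView.ViewHolder> {

    private static final int VIEW_TYPE_USER = 0;
    private static final int VIEW_TYPE_BOT = 1;

    private final ArrayList<MessageModal> messages;

    public MessageRVAdapter(ArrayList<MessageModal> messages) {
        this.messages = messages;
    }

    @Override
    public int getItemViewType(int position) {
        return "user".equals(messages.get(position).getSender()) ? VIEW_TYPE_USER : VIEW_TYPE_BOT;
    }

    @NonNull
    @Override
    public RecyclerView.ViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
        int layout = (viewType == VIEW_TYPE_USER) ? R.layout.user_msg : R.layout.bot_msg;
        View view = LayoutInflater.from(parent.getContext()).inflate(layout, parent, false);
        return new MessageViewHolder(view);
    }

    @Override
    public void onBindViewHolder(@NonNull RecyclerView.ViewHolder holder, int position) {
        ((MessageViewHolder) holder).messageText.setText(messages.get(position).getMessage());
    }

    @Override
    public int getItemCount() {
        return messages.size();
    }

    // Both row layouts are assumed to expose a TextView with the id "idTVMessage".
    static class MessageViewHolder extends RecyclerView.ViewHolder {
        final TextView messageText;

        MessageViewHolder(@NonNull View itemView) {
            super(itemView);
            messageText = itemView.findViewById(R.id.idTVMessage);
        }
    }
}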
Step 9: Create an API key to access the chatbot service.

Go to Brainshop.ai and register for an account with your username and password. The
screen shown below is what you will see after creating a new account. After creating the
account, enter your email address and use the Require password option to set a new
password for the account. We can now begin writing the API code.
Figure 3.19
Figure 3.20
Step 10: Working with the MainActivity.java file.

Add the code below to the MainActivity.java file. The code for the MainActivity.java file is
shown below. (JAVA)
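The original MainActivity listing is not reproduced in this extract; the following hedged
sketch shows only the core send-and-respond flow. The endpoint URL, query parameters
and the "cnt" response field are assumptions modelled on Brainshop-style services, and
the view IDs are assumptions as well.

import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.widget.EditText;
import android.widget.ImageButton;
import androidx.appcompat.app.AppCompatActivity;
import androidx.recyclerview.widget.LinearLayoutManager;
import androidx.recyclerview.widget.RecyclerView;
import org.json.JSONObject;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.concurrent.Executors;

// Appends the user's message, calls the chatbot API on a background thread,
// then appends the bot's reply on the main thread.
public class MainActivity extends AppCompatActivity {

    private final ArrayList<MessageModal> messages = new ArrayList<>();
    private MessageRVAdapter adapter;
    private final Handler mainHandler = new Handler(Looper.getMainLooper());

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        RecyclerView chatRV = findViewById(R.id.idRVChats);
        EditText messageEdt = findViewById(R.id.idEdtMessage);
        ImageButton sendBtn = findViewById(R.id.idIBSend);

        adapter = new MessageRVAdapter(messages);
        chatRV.setLayoutManager(new LinearLayoutManager(this));
        chatRV.setAdapter(adapter);

        sendBtn.setOnClickListener(v -> {
            String text = messageEdt.getText().toString().trim();
            if (text.isEmpty()) return;
            addMessage(text, "user");
            messageEdt.setText("");
            fetchBotReply(text);
        });
    }

    private void addMessage(String text, String sender) {
        messages.add(new MessageModal(text, sender));
        adapter.notifyItemInserted(messages.size() - 1);
    }

    private void fetchBotReply(String userMessage) {
        Executors.newSingleThreadExecutor().execute(() -> {
            String reply;
            try {
                // Hypothetical chatbot endpoint; bid/key are the account values from step 9.
                // NOTE: plain-HTTP endpoints also require cleartext traffic to be permitted in the manifest.
                String query = "bid=YOUR_BID&key=YOUR_API_KEY&uid=1&msg="
                        + URLEncoder.encode(userMessage, "UTF-8");
                URL url = new URL("http://api.brainshop.ai/get?" + query);
                HttpURLConnection connection = (HttpURLConnection) url.openConnection();

                StringBuilder body = new StringBuilder();
                try (BufferedReader reader = new BufferedReader(
                        new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
                    String line;
                    while ((line = reader.readLine()) != null) body.append(line);
                }
                reply = new JSONObject(body.toString()).optString("cnt", "No response");
            } catch (Exception e) {
                reply = "Sorry, something went wrong. Please try again.";
            }
            String botReply = reply;
            mainHandler.post(() -> addMessage(botReply, "bot"));
        });
    }
}

The request assumes the INTERNET permission is declared in the manifest; the original
code may well have used a networking library such as Volley instead of HttpURLConnection.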
CHAPTER 04
4.1 Testing:
As part of evaluating the dialogue's efficacy, the chatbot's comprehension of user
utterances was examined for both voice and text interactions, including utterances that
were spoken rather than typed. The results record whether the intent was understood,
along with the average response times for voice and text exchanges. Performance data for
the chatbot were tracked through these testing procedures.
Metrics were recorded for each simulated interaction, which was used to mimic how users
would communicate with the chatbot. To begin a simulated interaction for testing, the tool
instantiates the Dialogflow API on the command line, fires its welcome event and then
shows the returned response as the command-line result. Each type of use case was tested
from the command line, and JSON-formatted results were produced. Twenty simulated
utterances in total were evaluated, and the results are shown in Table 4.2. Each utterance
tested was specifically worded, in some cases with a deliberate typo.
Table 4.2 Simulated User Phrases: Test Results
Both the Google Assistant and Google Home agents were evaluated through the Actions
on Google (AoG) simulator in the AoG interface. Google Home was tested using voice
interaction on the AoG simulator, while the Google Assistant integration was tested using
text input through the simulator. If the utterance matched an intent phrase, the simulator
delivered a JSON object with information on the chatbot's level of understanding. The
same phrases were used on all platforms so that any differences in how the NLU
understood them could be observed.
Table 4.3 Simulated Device testing
As can be seen from the findings in Table 4.3, the Google Home device was the most
effective means of communication with the chatbot, since it accurately matched the intent
of every delivered utterance, earning a score of 100%. Conversational interfaces using
spoken natural language performed better than text-based interactions, with Google
Assistant scoring 57.14% on text input. The gap between the two conversational interfaces
indicates that speaking is far more effective when interacting with natural-language
conversational interfaces. With an average speech recognition rate of 85.71%, Google
Assistant's speech recognition also outperforms the HTML5 Speech API, suggesting that
Google's platforms will perform better during real user interactions.
The table shows how many attempts the chatbot made to understand the meaning of the
misspelt phrase before terminating the conversation. The chatbot provided fallback
responses at every opportunity, including "Sorry, I didn't quite get that, can you repeat
that please?" and "Sorry, I'm having trouble understanding what you said, can you say it
again?". This functionality handles cases where the chatbot does not understand a specific
statement or receives unexpected input. Even though the user statement was identical
each time, different fallback responses were used. Because the chatbot repeatedly asked
for an acceptable phrase and never received one, it eventually left the conversation.
Because Postman is used to verify the operation of the webhook in isolation from the
context in which it is deployed, bugs can be fixed before the agent is exposed to channels
such as Google Assistant, Google Home or the web-based chatbot. Each returned response
is logged against its event, which helps identify and record network issues. The picture
below displays an example of a JSON response returned by the webhook.
The integration of the other APIs used in the project was also tested, to determine
whether the webhook could successfully contact the TrueLayer API and to identify any
parameters or authentication headers that had to be supplied with each subsequent
request.

Google Assistant, Google Home and the web chatbot were chosen as the three
communication channels for user testing. Judged by the user's interaction experience, this
sheds light on the chatbot's actual quality and general usefulness. A questionnaire was
developed to assess the chatbot during user testing and was given to a group of users; it
can be found in the report's appendix.
4.5 Evaluation:
Subjective metrics      Users (n)    Objective metrics     Users (n)
Naturalness             44.00%       Speech recognition    86.67%
Likeability             93.33%       Response accuracy     86.67%
Ease of use             86.66%       Response rate         73.33%
Speech recognition      80.00%

Table 4.6 Subjective & objective measurement
As stated in Section 1.2, the majority of banks have trouble encouraging their consumers
to adopt the technology, owing to low user satisfaction when using their services. The
findings in Table 4.6 suggest that incorporating a chatbot into their financial services
would greatly raise the degree of user satisfaction. This is clear from the fact that 73.33%
of all users rated the chatbot as "extremely" fast, considerably quicker than the existing
technologies their bank offers, such as an online banking app. The results of the user
testing phase confirmed that chatbots can successfully engage users and boost satisfaction:
86.66% of all respondents said that the chatbot was either "extremely easy" or "moderately
easy" to talk to, and the chatbot achieved an overall likeability of 93.33%, as indicated in
Table 4.6. The questionnaire results also measure the performance and quality of the
chatbot in actual use. Overall, the chatbot's voice recognition rate was quite good: 80.00%
of users said the chatbot understood them "very well". This mirrors the efficacy observed
in the simulated testing, where the chatbot achieved an overall understanding score of
0.8894 (89%), a result towards the upper end of the understanding scale.
CHAPTER 05
5.1 CONCLUSIONS
process of reviewing how other applications integrate with Messenger, so this integration
was not developed, because Facebook had stopped allowing developers to create new apps
or chatbots through the service while the review was under way. Facebook has since
reopened its app review process and is again allowing developers to integrate with
Messenger, which makes it possible to develop this integration feature for future
prototypes (Facebook, 2018). Google account authentication would be another beneficial
feature: it would allow users to link their Google account across the multiple devices on
which the agent is distributed, such as Google Assistant, improving the cross-platform
experience for users.
A user can find out financial information about their account, such as how
much they spent in a week. Based on this feature, it was thought that during
the later stages of development, users should be able to directly query the
chatbot to find out where and what they are spending their money on.
Although users can view this information in a graphical format in a web
application, it would also be useful if they could discover it through
interaction with a chatbot.
From observing user interaction with the chatbot during user testing, there were many
suggestions to integrate the chatbot into other popular platforms such as the Amazon Echo
or Echo Dot, which would increase its reach across platforms.
REFERENCES
[1] Ling, G., Fern, Y., Boon, L. and Huat, T. (2016). Understanding Customer Satisfaction
of Internet Banking: A Case Study in Malacca. Procedia Economics and Finance, 37,
pp. 80-85. (Accessed 11/10/2017)

[3] Aburub, F., Odeh, M. and Beeson, I. (2007). Modelling non-functional requirements of
business processes. Information and Software Technology, 49(11-12), pp. 1162-1171.
Available at: http://dx.doi.org/10.1016/j.infsof.2006.12.002 (Accessed 10/10/2017)

[4] Ling, G. M., Fern, Y. S., Boon, L. K. and Huat, T. S. (2016). Understanding customer
satisfaction of internet banking: A case study in Malacca. Available at:
https://doi.org/10.1016/S2212-5671(16)30096-X

[5] Khan, A. I., Qurashi, R. J. and Khan, U. A. (2011). A Comprehensive Study of Commonly
Practiced Heavy and Light Weight Software Methodologies. IJCSI International Journal
of Computer Science Issues, 8(4). Available at:
https://arxiv.org/ftp/arxiv/papers/1111/1111.3001.pdf (Accessed 08/11/2017)