Important Points in AI

The AI project cycle consists of six stages: Problem Scoping, Data Acquisition, Data Exploration, Modeling, Evaluation, and Deployment. Each stage involves specific tasks such as defining the problem, gathering and analyzing data, creating models, and ensuring ethical considerations. Additionally, understanding data types, data literacy, and the importance of math in AI are essential for successful AI project implementation.


AI Project Cycle
The AI project cycle includes the following stages:

• Problem Scoping
• Data Acquisition
• Data Exploration
• Modeling
• Evaluation
• Deployment

Problem Scoping
Problem Scoping is the first stage of the AI project cycle. It involves defining the problem you want to solve with your project. The 4Ws canvas is used to help with problem scoping. The 4 Ws are:

• Who are the stakeholders, and what do you know about them?
• What is the problem, and how do you know it is a problem?
• Where do the stakeholders experience the problem, i.e. in what context, situation, or location does it occur?
• Why is it important to solve the problem, and how will the solution benefit the stakeholders and society?

A problem statement template can be used to put all the details together in one place: "[Stakeholders] has/have a problem that [issue, problem, need] when/while [context, situation]. An ideal situation would be [benefit of solution for them]."

Data Acquisition
Data acquisition is the second stage of the AI project cycle, and involves gathering data for the project. Data can be facts or statistics collected for analysis. For an AI project to be efficient, the training data should be authentic and relevant to the problem statement. Data features are the characteristics or properties of the data you need to collect. There are various sources for data, but it is important to ensure the data is authentic and reliable. Data acquisition methods should also be authentic, to avoid conflicts over ownership of the data.

• Data can be collected from various sources, such as the internet, surveys, and sensors.
• Data must be accurate and reliable, as this ensures the efficiency of the system.
• Data is the base on which the AI project is built.
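
As one concrete acquisition path, tabular data can be loaded from a CSV file with Python's standard library; a minimal sketch (the column names and values are hypothetical, standing in for a real survey file):

```python
import csv
import io

# Hypothetical survey data; in practice this string would come from a
# real source, e.g. open("survey.csv"), a web API, or a sensor log.
raw = io.StringIO("age,height_cm\n14,150\n15,162\n15,158\n")

rows = list(csv.DictReader(raw))            # each row becomes a dict
heights = [float(r["height_cm"]) for r in rows]
print(rows[0])                              # first acquired record
print(heights)                              # the "height" data feature
```

Here `age` and `height_cm` play the role of data features: the properties chosen for collection because they are relevant to the problem statement.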

Data Exploration
Data exploration is the third stage of the AI project cycle, and involves interpreting useful information from the data. Data exploration is also used to understand the trends, relationships, and patterns in the data. Data visualization is a method used to comprehend information quickly and communicate it to others.

• Data visualization uses graphs and charts to show information.
• Data exploration helps in better deciding which model/models to use.
• Data exploration can help to identify patterns and trends.
• Data should be presented in a way that is understandable for humans.
• Data visualization is important in AI.
• Data visualization can be done using tools like Tableau, Excel, or Datawrapper.
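
Before charting anything, a quick numeric summary of a feature already reveals its range and spread; a minimal exploration sketch using Python's statistics module (the temperature readings are made up):

```python
import statistics

# Hypothetical daily temperature readings gathered in the acquisition stage.
temps = [21.5, 23.0, 22.1, 25.4, 24.8, 22.9, 23.3]

print("min  :", min(temps))                          # smallest reading
print("max  :", max(temps))                          # largest reading
print("mean :", round(statistics.mean(temps), 2))    # central value
print("stdev:", round(statistics.stdev(temps), 2))   # spread around the mean
```

Summaries like these help with the "which model to use" decision above, e.g. a very large spread may suggest the data needs cleaning before modeling.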
Modeling
Modeling involves creating an AI model, which is a mathematical approach to analyzing data. The ability to mathematically describe the relationship between parameters is the heart of every AI model. AI models can be classified as Machine Learning (ML) or Deep Learning (DL). Deep Learning enables software to train itself to perform tasks by feeding large amounts of data through multi-layered neural networks.

Evaluation
Evaluation is the process of understanding the reliability of any AI model, based on the outputs obtained by feeding a test dataset into the model and comparing them with the actual answers. It is not recommended to evaluate the model with the same data that was used to build it, because the model will simply remember the training data, leading to overfitting.
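
Comparing a model's outputs on held-out test data with the actual answers can be sketched in a few lines; the labels and predictions below are hypothetical:

```python
# Evaluate a model's predictions on held-out test data by accuracy.
# Both lists are hypothetical; the predictions would come from a model
# trained on a *separate* training set, never on these test rows.
actual    = ["cat", "dog", "cat", "dog", "cat"]
predicted = ["cat", "dog", "dog", "dog", "cat"]

correct = sum(a == p for a, p in zip(actual, predicted))
accuracy = correct / len(actual)
print(f"accuracy = {accuracy:.0%}")   # 4 of 5 test answers match
```

Accuracy is only the simplest evaluation measure; the key point from the text is that `actual` must come from data the model never saw during training.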

Deployment
Deployment is the stage where the AI solution is made ready to be used. AI can be used in mobile apps and on websites.

Types of Data
There are different types of data used in AI.

• Qualitative Data
  o Made up of words and phrases.
  o Used for Natural Language Processing (NLP).
  o Examples include search queries on the internet.
  o Describes qualities like color or type.
  o Textual data is a type of qualitative data.
• Quantitative Data
  o Made up of numbers.
  o Used for statistical analysis.
  o Examples include measurements, readings, or values.
  o Measurable, like height or weight.
  o Numeric data is a type of quantitative data.
  o Continuous data is numeric data that can take any value in a range. Examples include height, weight, and temperature.
  o Discrete data is numeric data that contains only whole numbers and cannot be fractional. Examples include the number of students in a class.
• Categorical data assigns each item to one of a fixed set of categories, such as blood group or grade.

Data can also be classified by how it is structured:

• Structured data is organized into a tabular format with rows and columns and follows a predefined schema. Examples include data in databases, spreadsheets, and CSV files.
• Unstructured data has no predefined schema. Examples include images, audio, and free-form text.

Data Literacy
Data literacy means being able to understand, explain, and work with data: the ability to explore, understand, and communicate with data in a meaningful way. Data literacy is important for making smart choices and avoiding false information. Data is raw facts and figures, while information is processed data. Data in its raw form is not very useful; it is processed to give us information. Information about the world leads to knowledge of how things are happening, and wisdom allows us to understand why things are happening in a particular way. Every dataset tells a story, but one must be careful before believing the story. A data-literate person can interact with data to understand the world around them. The Data Literacy Process Framework is an iterative process that includes planning, communication, assessment, culture development, prescriptive learning, and evaluation.

• Data literacy involves collecting, analyzing, and showing data in ways that make sense.
• Data literacy is essential because it enables individuals to make informed decisions.

AI Domains
The three main domains of AI are:

• Computer Vision (CV) uses visual data like images and videos.
• Natural Language Processing (NLP) uses textual data like documents and PDF files.
• Statistical Data (SD) uses numeric data such as tables and Excel sheets.

AI Ethics
AI ethics involves the guiding principles that help to decide what is good or bad, and these principles guide the creation of better and safer AI solutions. It is important to ensure that AI solutions follow ethical principles such as human rights, and avoid bias, privacy violations, and exclusion.

• AI must not discriminate against a particular group of the population or cause them any kind of disadvantage.
• AI should not take away freedom or deprive people of jobs.
• AI systems can influence decision making, calling for ethical principles that govern AI.
• Personal data should be protected.
• Data privacy is the practice of protecting digital information from unauthorized access, corruption, or theft throughout its entire lifecycle.
• Data breaches at governments, corporations, and hospitals can put sensitive information into the wrong hands.

Math for AI
Math is necessary for designing AI projects.

• Statistics is used for collecting, exploring, and analyzing data. It is used to describe and understand data.
• Probability tells us how likely something is to happen. It is crucial in predicting outcomes and analyzing uncertainties in real-world applications.
  o Events can be described as certain, likely, unlikely, impossible, or equally likely.
  o A certain event has a probability of 1, and an impossible event has a probability of 0.
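
The 0-to-1 scale can be checked with a small sketch, using probability of an event = favourable outcomes / total outcomes (a fair six-sided die is assumed, so all outcomes are equally likely):

```python
from fractions import Fraction

# Probability of an event = favourable outcomes / total outcomes.
# Assumes a fair six-sided die, so all 6 outcomes are equally likely.
outcomes = {1, 2, 3, 4, 5, 6}

def probability(event):
    return Fraction(len(event & outcomes), len(outcomes))

print(probability({2, 4, 6}))   # rolling an even number -> 1/2
print(probability(outcomes))    # certain event          -> 1
print(probability(set()))       # impossible event       -> 0
```

The certain event (all outcomes) gives exactly 1 and the impossible event (no outcomes) gives exactly 0, matching the points above.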

Other Terms

• System maps help to find relationships between different elements of a problem. System maps can represent relationships through arrows, where a longer arrow represents a longer time for a change to happen.
• Training data is the data used to train an AI model.
• Testing data is the data used to test an AI model.
• Overfitting is when the model simply remembers the whole training set instead of learning the general pattern.
• Data processing helps computers understand raw data.
• Data analysis is examining each component of the data to draw conclusions.
• Data interpretation is the process of making sense of processed data.

Examples of AI applications:

• Self-driving cars.
• Self-service kiosks.
• Games. For example:
  o Quick, Draw!, a game based on Computer Vision that challenges players to draw a picture of an object or idea and uses AI to guess what the drawings represent.
  o Semantris, a game based on Natural Language Processing: a word-association game in which the AI chooses the words most related to a clue provided by the player.
• Systems to detect people not wearing masks.
• AI-based essay grading systems.
• Customized recommendations for products, songs, and videos.
• AI tools that can correct essays.
• AI can be used to transform the world into a better place.
