Machine Learning (ML)
Yusuf Uzun
Aksaray University Ph.D. Student
Seminar Presentation
Machine Learning Introduction
• Humankind has long used machines to support its workforce.
• Nowadays, humankind uses machines to support brain power,
owing to the development of Machine Learning!
Machine Learning
Machine Learning Definition
Machine Learning explores the study and
construction of algorithms that
• can learn from past data
• can make future predictions based on the learnt data
• can improve their performance through experience,
using training data
• Machine Learning is a set of Computational
Programs used to develop statistical models
• Typically, an algorithm has a number of
parameters whose values are learnt from the
data
• The ability to perform a task in a situation
which has never been encountered before
(Learning = Generalization)
Machine Learning Evolution
• ML is really about getting computers to program themselves
• If programming is the bottleneck, then let the data be used to construct a model that
would do the work
Machine Learning Importance
• “Machine learning is going to result in a real revolution” (Greg
Papadopoulos, CTO, Sun)
• “Machine learning is the next Internet” (Tony Tether,
Former Director, DARPA)
• “A breakthrough in machine learning would be worth ten Microsofts” (Bill Gates,
Microsoft)
• “Web rankings today are mostly a matter of machine learning” (Prabhakar
Raghavan, Former Dir. Research, Yahoo)
• “Machine learning is the hot new thing” (John
Hennessy, President, Stanford)
• “Machine learning today is one of the hottest aspects of computer science”
(Steve Ballmer, CEO, Microsoft)
• “Machine learning is today’s discontinuity”
(Jerry Yang, Founder, Yahoo)
Machine Learning Mission
• Needed and developed to solve complicated problems that are very hard or
impossible to program by hand
• We don’t know what program to write because we don’t know how our brain does
it.
• Even if we had a good idea about how to do it, the program might be horrendously
complicated.
• Instead of writing a program by hand, we collect lots of examples that specify the
correct output for a given input
• A machine learning algorithm then takes these examples and produces a program
that does the job
• Discover new useful knowledge from large databases (data mining)
• Ability to mimic humans and replace certain monotonous tasks that require some
intelligence
• Develop systems that can automatically adapt and customize themselves to
individual users
Why now?
• Flood of available big data (especially with the advent
of the Internet)
• Increasing computational power
• Growing progress in available algorithms and theory
developed by researchers
• Increasing support from industries
Sample Applications
• Pattern recognition
• Speech recognition, Natural language processing
• Web search engines
• Recommendation systems
• Medical Diagnosis
• Computational biology
• Finance & Credit
• Fraud detection
• E-commerce
• Space exploration
• Robotics, unassisted control of a vehicle
• Information extraction
• Social networks
• …………………
The machine learning framework
• Apply a prediction function to a feature representation of the
image to get the desired output:
f(image of an apple) = “apple”
f(image of a tomato) = “tomato”
f(image of a cow) = “cow”
The machine learning framework
y = f(x)
• Training: given a training set of labeled examples {(x1,y1), …, (xN,yN)},
estimate the prediction function f by minimizing the prediction error
on the training set
• Testing: apply f to a never-before-seen test example x and output the
predicted value y = f(x)
(In y = f(x): y is the output, f is the prediction function, and x is the image feature.)
Steps
• Training: training images and training labels → image features → learned model
• Testing: test image → image features → learned model → prediction
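As a concrete illustration of this pipeline, here is a minimal sketch in Python. It stands in for the diagram above, using a nearest-centroid classifier as the "learned model"; the feature vectors, labels, and choice of classifier are assumptions for illustration, not taken from the slides.

```python
# Minimal sketch of the training/testing pipeline (y = f(x)), with a
# nearest-centroid classifier standing in for the "learned model".
import numpy as np

# Training: labeled examples {(x1, y1), ..., (xN, yN)} (toy feature vectors)
X_train = np.array([[1.0, 1.2], [0.9, 1.1], [3.0, 3.1], [3.2, 2.9]])
y_train = np.array([0, 0, 1, 1])          # e.g. 0 = "apple", 1 = "tomato"

# "Learning" here is just estimating one centroid per class.
classes = np.unique(y_train)
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])

def f(x):
    """Prediction function: assign x to the class of the nearest centroid."""
    distances = np.linalg.norm(centroids - x, axis=1)
    return classes[np.argmin(distances)]

# Testing: apply f to a never-before-seen example.
x_test = np.array([2.8, 3.0])
print(f(x_test))   # -> 1
```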
Using a validation set
Divide the total dataset into three subsets:
• Training data is used for learning the parameters of the model.
• Validation data is used for deciding which model (or which hyperparameter
settings) to use; it is not used for learning the parameters.
• Test data is used to get a final, unbiased estimate of how well the
model works. We expect this estimate to be worse than on the
validation data.
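A minimal sketch of such a three-way split is shown below; the 60/20/20 proportions and the toy data are assumptions for illustration.

```python
# Minimal sketch of a training / validation / test split (assumed 60/20/20).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))     # toy feature matrix
y = rng.integers(0, 2, size=100)  # toy labels

idx = rng.permutation(len(X))     # shuffle before splitting
n_train, n_val = 60, 20
train_idx = idx[:n_train]
val_idx   = idx[n_train:n_train + n_val]
test_idx  = idx[n_train + n_val:]

X_train, y_train = X[train_idx], y[train_idx]   # learn parameters here
X_val,   y_val   = X[val_idx],   y[val_idx]     # choose model / hyperparameters here
X_test,  y_test  = X[test_idx],  y[test_idx]    # final, unbiased estimate (used once)
```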
Algorithm Stages
Every machine learning algorithm has three
components:
– Representation
– Evaluation
– Optimization
Generalization
• Underfitting: model is too “simple” to
represent all the relevant class
characteristics
• Overfitting: model is too “complex” and fits
irrelevant characteristics (noise) in the data
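A minimal sketch of the two failure modes: polynomials of different degrees fit to noisy data sampled from a quadratic. The data and the specific degrees are assumptions chosen for illustration.

```python
# Underfitting vs. overfitting: fit polynomials of increasing degree to
# noisy samples of a quadratic and watch the training error.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 20)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.2, size=x.shape)

for degree in (1, 2, 9):
    coeffs = np.polyfit(x, y, degree)     # least-squares fit on the training points
    y_hat = np.polyval(coeffs, x)
    train_err = np.mean((y - y_hat) ** 2)
    # degree 1 is too simple (underfits); degree 9 is flexible enough to chase
    # the noise (overfits). Training error keeps shrinking with degree, which is
    # exactly why a separate validation set is needed to pick the model.
    print(degree, round(train_err, 4))
```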
Types of Learning
• Supervised (inductive) learning
– Training data includes desired outputs
• Unsupervised learning
– Training data does not include desired outputs
• Reinforcement learning
- The learner interacts with the world via “actions” and tries to
find an optimal policy of behavior with respect to “rewards” it
receives from the environment
• Learning to predict a discrete value from a predefined set of values (classification)
• Learning to predict a continuous/real value (regression)
• Group similar data points together (clustering)
• Find lower-dimensional manifold preserving some properties of the data
(dimension reduction)
Linear Classification
Find a linear function (a separating line or hyperplane) to separate the classes, as in the sketch below:
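One classic way to find such a linear function is the perceptron algorithm; the slides do not name a specific method, so this sketch with toy, linearly separable data is only an illustration.

```python
# Minimal perceptron sketch: learn a separating function w·x + b.
import numpy as np

X = np.array([[2.0, 2.0], [1.5, 2.5], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])            # class labels in {-1, +1}

w = np.zeros(2)
b = 0.0
for _ in range(20):                      # a few passes suffice on this toy data
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:       # misclassified point -> update
            w += yi * xi
            b += yi

print(w, b)                              # parameters of the separating line
print(np.sign(X @ w + b))                # -> [ 1.  1. -1. -1.]
```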
Classification
• Decision rules divide input space into decision regions
separated by decision boundaries
Classification
• Example: Credit scoring
• Differentiating between low-risk and high-risk customers from their income and savings
• Discriminant (model): IF income > θ1 AND savings > θ2 THEN low-risk ELSE high-risk
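The discriminant from the slide, written as a function; the threshold values θ1 and θ2 below are assumed example numbers, whereas in practice they would be learned from labeled customer data.

```python
# The credit-scoring decision rule from the slide as code.
THETA1 = 30_000   # income threshold (assumed example value)
THETA2 = 10_000   # savings threshold (assumed example value)

def credit_risk(income: float, savings: float) -> str:
    """IF income > theta1 AND savings > theta2 THEN low-risk ELSE high-risk."""
    if income > THETA1 and savings > THETA2:
        return "low-risk"
    return "high-risk"

print(credit_risk(45_000, 20_000))   # -> low-risk
print(credit_risk(45_000, 5_000))    # -> high-risk
```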
Classification: Applications
• Credit risk assessment
• Pattern recognition
• Email Spam Filtering
• Face recognition
• Optical Character Recognition: Different handwriting styles
• Speech recognition
• Medical disease diagnosis: From symptoms to illnesses
Prediction: Regression
• Example: Price of a used car
• x: car attributes, y: price
• Model: y = g(x); for a linear model, y = wx + w₀
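A minimal sketch of fitting the linear model y = wx + w₀ by least squares; the car-price-style numbers are made up for illustration, and least squares is an assumed choice (the slide does not specify the fitting method).

```python
# Fit y = w*x + w0 by ordinary least squares on toy data.
import numpy as np

x = np.array([1.0, 3.0, 5.0, 7.0, 9.0])      # e.g. car age in years
y = np.array([9.0, 7.5, 6.1, 4.4, 3.2])      # e.g. price (arbitrary units)

A = np.column_stack([x, np.ones_like(x)])    # design matrix [x, 1]
(w, w0), *_ = np.linalg.lstsq(A, y, rcond=None)

print(w, w0)                 # learned slope and intercept
print(w * 4.0 + w0)          # predicted price for a 4-year-old car
```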
Regression: Applications
• Predicting Gold & Stock Future Prices
• Predict the Price of a Used Car
• Automatic Steering
Unsupervised Learning
• The challenging part here is to identify whether the data can be
separated into groups that are relatively distinct
• Training data contains inputs but does not include outputs
• Learning “what happens in normal life”
• Clustering: Grouping similar instances
• Example applications
• Customer segmentation in CRM
• Individualised marketing
• Image compression
Clustering: Discovering Structure in Data
Similarity Determination
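As a concrete example of grouping similar instances, here is a minimal k-means sketch; the slides do not prescribe an algorithm, and the toy 2-D data and k = 2 are assumptions.

```python
# Minimal k-means sketch: alternate assignment and mean-update steps.
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.5, size=(20, 2)),   # one toy cluster near (0, 0)
               rng.normal(4, 0.5, size=(20, 2))])  # another near (4, 4)

k = 2
centers = X[rng.choice(len(X), k, replace=False)]  # initialize from the data
for _ in range(10):
    # assignment step: each point goes to its nearest center
    labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
    # update step: recompute each center as the mean of its assigned points
    centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])

print(centers)   # roughly the two cluster means, near (0, 0) and (4, 4)
```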
Collaborative Filtering
Predict how well a user will like an item that they have not rated, given a
set of historical preference judgments from a community of users
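A minimal user-based collaborative filtering sketch: the missing rating is predicted as a similarity-weighted average of other users' ratings. The tiny rating matrix and the cosine-similarity choice are assumptions for illustration.

```python
# Predict an unrated item from similar users' ratings (0 = unrated).
import numpy as np

# rows = users, columns = items (toy data)
R = np.array([[5., 4., 0., 1.],
              [4., 5., 1., 1.],
              [1., 1., 5., 4.],
              [1., 2., 4., 0.]])

def cosine(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

def predict(user, item):
    """Average other users' ratings for this item, weighted by similarity."""
    sims, ratings = [], []
    for other in range(len(R)):
        if other != user and R[other, item] > 0:
            sims.append(cosine(R[user], R[other]))
            ratings.append(R[other, item])
    sims, ratings = np.array(sims), np.array(ratings)
    return float(sims @ ratings / (sims.sum() + 1e-9))

print(predict(user=0, item=2))   # user 0 has not rated item 2
```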
Reinforcement Learning
• No supervised output but delayed reward
• Applications:
• Game playing
• Robot in a maze
• Multiple agents
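To make "no supervised output but delayed reward" concrete, here is a minimal tabular Q-learning sketch on a one-dimensional corridor where the reward appears only at the goal. The environment, learning rates, and Q-learning itself are assumptions for illustration; the slides do not specify an algorithm.

```python
# Tabular Q-learning on a tiny corridor: states 0..4, reward only at state 4.
import numpy as np

n_states, n_actions = 5, 2              # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.1       # assumed hyperparameters
rng = np.random.default_rng(3)

for episode in range(200):
    s = 0
    while s != n_states - 1:            # episode ends at the goal state
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# Greedy policy: go right toward the goal (last entry is the unused terminal state).
print(Q.argmax(axis=1))   # -> [1 1 1 1 0]
```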
Decision Trees
• Decision tree: a tree with nodes representing condition tests and leaves representing classes
• Decision list: if condition 1 then class 1, elseif condition 2 then class 2, elseif …
• Sample algorithms: ID3, C4.5
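A minimal sketch of the decision-list form described above: an ordered list of (condition, class) rules evaluated top to bottom. The example rules are made up; algorithms such as ID3 and C4.5 would learn trees/rules like these from data rather than having them hand-written.

```python
# Evaluate a decision list: the first matching condition determines the class.
def decision_list(example, rules, default):
    for condition, label in rules:
        if condition(example):        # first matching condition wins
            return label
    return default

rules = [
    (lambda e: e["outlook"] == "sunny" and e["humidity"] > 75, "don't play"),
    (lambda e: e["outlook"] == "rainy" and e["windy"],          "don't play"),
]

print(decision_list({"outlook": "sunny", "humidity": 80, "windy": False},
                    rules, default="play"))    # -> don't play
print(decision_list({"outlook": "overcast", "humidity": 60, "windy": False},
                    rules, default="play"))    # -> play
```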
Thank you