CSE 6363 – Machine Learning
Homework 1 – Spring 2019
Due Date: Feb. 8 2019, 11:59 pm
MLE and MAP
- In class we covered the derivation of basic learning algorithms to derive a model for a coin flip. Consider a similar problem where we monitor the time of occurrence of a severe computer failure (which requires a system reboot) and which occurs according to a Poisson process (i.e. it is equally likely to happen at any point in time, with an arrival rate of λ). For a Poisson process the probability of the first event occurring at time x after a restart is described by an exponential distribution:
p_\lambda(x) = \lambda e^{-\lambda x}
We are assuming here that the different data points we measured are independent, i.e. nothing changes between reboots.
- Derive the performance function and the optimization result for the analytic MLE optimization for a model learning algorithm that returns the MLE for the parameter λ of the model, given a data set D = {k_1, …, k_n}. Make sure you show your work; a brief sketch of the expected form of the result is included after these tasks.
- Apply the learning algorithm from a) to the following dataset:
D = {1.5, 3, 2.5, 2.75, 2.9, 3} .
- Derive the optimization for a MAP approach using the conjugate prior, the Gamma distribution.
The Gamma distribution is:
p_{\alpha,\beta}(\lambda) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \, \lambda^{\alpha-1} e^{-\beta\lambda}
Note that α and β are constants and that there still is only one parameter, λ, to be learned. Show your derivation and the result for the data in part b) and values for α and β of 5 and 10, respectively.
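For orientation only, here is a minimal sketch of the standard MLE derivation for the exponential model above; it shows the expected shape of the answer and is not a substitute for the full worked derivation:

\log L(\lambda; D) = \sum_{i=1}^{n} \log\left(\lambda e^{-\lambda k_i}\right) = n \log\lambda - \lambda \sum_{i=1}^{n} k_i

\frac{\partial}{\partial\lambda} \log L(\lambda; D) = \frac{n}{\lambda} - \sum_{i=1}^{n} k_i = 0
\quad\Rightarrow\quad
\hat{\lambda}_{\mathrm{MLE}} = \frac{n}{\sum_{i=1}^{n} k_i}

For the data in b), this would evaluate to n = 6 and \sum_i k_i = 15.65, giving \hat{\lambda}_{\mathrm{MLE}} = 6/15.65 \approx 0.383.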
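Similarly, a minimal sketch of the MAP optimization with the Gamma conjugate prior (again only the expected shape of the result):

p(\lambda \mid D) \propto \left(\prod_{i=1}^{n} \lambda e^{-\lambda k_i}\right) \frac{\beta^{\alpha}}{\Gamma(\alpha)} \lambda^{\alpha-1} e^{-\beta\lambda}
\propto \lambda^{\,n+\alpha-1} e^{-\lambda\left(\beta + \sum_{i} k_i\right)}

\frac{\partial}{\partial\lambda} \log p(\lambda \mid D) = \frac{n+\alpha-1}{\lambda} - \left(\beta + \sum_{i} k_i\right) = 0
\quad\Rightarrow\quad
\hat{\lambda}_{\mathrm{MAP}} = \frac{n+\alpha-1}{\beta + \sum_{i} k_i}

With α = 5, β = 10 and the data from b), this would evaluate to 10/25.65 \approx 0.390.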
K Nearest Neighbor
- Consider the problem where we want to predict the gender of a person from a set of input parameters, namely height, weight, and age. Assume our training data is given as follows:
D = { ((170, 57, 32), W),
      ((192, 95, 28), M),
      ((150, 45, 30), W),
      ((170, 65, 29), M),
      ((175, 78, 35), M),
      ((185, 90, 32), M),
      ((170, 65, 28), W),
      ((155, 48, 31), W),
      ((160, 55, 30), W),
      ((182, 80, 30), M),
      ((175, 69, 28), W),
      ((180, 80, 27), M),
      ((160, 50, 31), W),
      ((175, 72, 30), M) }
- Using Cartesian distance as the similarity measure, show the results of the gender prediction for the following data items for values of K of 1, 3, and 5. Include the intermediate steps (i.e. distance calculation, neighbor selection, prediction).
(155, 40, 35), (170, 70, 32), (175, 70, 35), (180, 90, 20)
- Implement the KNN algorithm for this problem. Your implementation should work with different training data sets and allow the input of a data point for the prediction (a minimal sketch of one possible structure follows this list).
- Repeat the prediction using KNN when the age data is removed. Try to determine (using multiple target values) which data gives you better predictions. Show your intermediate steps.
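One possible structure for the KNN implementation is sketched below in Python. The function names (euclidean_distance, knn_predict) are illustrative choices, and the sketch assumes that "Cartesian distance" refers to the ordinary Euclidean distance; dropping the third (age) component of each tuple reproduces the experiment without age.

import math
from collections import Counter

def euclidean_distance(a, b):
    # Straight-line (Euclidean) distance between two equal-length feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k):
    # train: list of (features, label) pairs; query: a feature tuple; k: number of neighbors.
    # Sort the training points by distance to the query and take a majority vote among the k nearest.
    neighbors = sorted(train, key=lambda pair: euclidean_distance(pair[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

if __name__ == "__main__":
    train = [((170, 57, 32), "W"), ((192, 95, 28), "M"), ((150, 45, 30), "W"),
             ((170, 65, 29), "M"), ((175, 78, 35), "M"), ((185, 90, 32), "M"),
             ((170, 65, 28), "W"), ((155, 48, 31), "W"), ((160, 55, 30), "W"),
             ((182, 80, 30), "M"), ((175, 69, 28), "W"), ((180, 80, 27), "M"),
             ((160, 50, 31), "W"), ((175, 72, 30), "M")]
    for query in [(155, 40, 35), (170, 70, 32), (175, 70, 35), (180, 90, 20)]:
        for k in (1, 3, 5):
            print(query, "k =", k, "->", knn_predict(train, query, k))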
Gaussian Naïve Bayes Classification
- Using the data from Problem 2, build a Gaussian Naïve Bayes classifier for this problem. For this you have to learn Gaussian distribution parameters for each input data feature, i.e. for p(height|W), p(height|M), p(weight|W), p(weight|M), p(age|W), p(age|M).
- Learn/derive the parameters for the Gaussian Naïve Bayes Classifier and apply them to the same target as in problem 2b). Show your intermediate steps.
- Implement the Gaussian Naïve Bayes Classifier for this problem (a minimal sketch follows this list).
- Repeat the experiment in part 2c) with the Gaussian Naïve Bayes Classifier.
- Compare the results of the two classifiers and discuss reasons why one might perform better than the other.
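A minimal Python sketch of a Gaussian Naïve Bayes classifier for the same (height, weight, age) data is given below; fit_gnb and gnb_predict are illustrative names, and the use of the sample variance (n − 1 denominator) is an assumption rather than a requirement of the assignment.

import math
from collections import defaultdict

def fit_gnb(train):
    # Estimate per-class feature means, variances, and class priors from (features, label) pairs.
    grouped = defaultdict(list)
    for features, label in train:
        grouped[label].append(features)
    model = {}
    for label, rows in grouped.items():
        n = len(rows)
        columns = list(zip(*rows))
        means = [sum(col) / n for col in columns]
        variances = [sum((x - m) ** 2 for x in col) / (n - 1)
                     for col, m in zip(columns, means)]
        model[label] = (means, variances, n / len(train))
    return model

def gaussian_log_pdf(x, mean, var):
    # Log of the univariate Gaussian density, used to avoid numerical underflow.
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

def gnb_predict(model, query):
    # Choose the class with the largest log prior plus summed log feature likelihoods.
    def log_posterior(label):
        means, variances, prior = model[label]
        return math.log(prior) + sum(gaussian_log_pdf(x, m, v)
                                     for x, m, v in zip(query, means, variances))
    return max(model, key=log_posterior)

Fitting this on the training data from Problem 2 and calling gnb_predict on the same four query points produces the predictions needed for the comparison in the last item.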