Posted: Saturday, 2018-02-03 08:00

Machine Learning Theory (CSC 482A/581A)

 

 

Problem Set 1 Due on Friday, February 2nd, 5pm

 

 

 

Instructions:

 

• You must write up your solutions individually.

 

• You may have high-level discussions with up to 2 other students registered in the course. If you discuss problems with others, include at the top of your submission: their names, V#’s, and the problems discussed.

 

• Your solutions should be clear and concise (and legible!). You are strongly encouraged to type up your solutions in LaTeX. For any problems where you only have a partial solution, be clear about any parts of your solution for which you have low confidence.

 

• If submitted through conneX, the due date is Friday, February 2nd, 7pm. This is a hard deadline. If submitted in writing, submit at the beginning of class (12:30pm) Friday February 2nd. You are thus incentivized to submit online (using LaTeX) as you get more time!

 

 

 

Questions:

 

1. Let X = R^2 and take C to be the class of concentric circles C = {c_r : r ≥ 0}, where for each nonnegative real number r ≥ 0 we have c_r(x) = 1[‖x‖_2 ≤ r]. Prove that C is PAC learnable.

In particular, exhibit a PAC learning algorithm which, given a training sample of size

n ≥ (1/ε) log(1/δ),

finds with probability at least 1 − δ a hypothesis f̂ ∈ C for which R(f̂) ≤ ε.
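A natural consistent learner for this class is the tightest fit: output the smallest circle containing every positive training example. A minimal Python sketch (the function name `learn_circle` and the toy target radius r* = 1.0 are our own choices, not part of the problem statement):

```python
import numpy as np

def learn_circle(X, y):
    """Tightest-fit learner for concentric circles.

    X : (n, 2) array of points in R^2
    y : (n,) array of 0/1 labels generated as y_i = 1[||x_i||_2 <= r*]

    Returns the smallest radius covering all positive examples, which is
    always at most the true radius r*.
    """
    norms = np.linalg.norm(X, axis=1)
    pos = norms[y == 1]
    # With no positive examples, the empty circle (r = 0) is consistent.
    return pos.max() if pos.size > 0 else 0.0

# Toy check: labels generated by an assumed true radius r* = 1.0.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 2))
y = (np.linalg.norm(X, axis=1) <= 1.0).astype(int)
r_hat = learn_circle(X, y)
```

Because the returned radius never exceeds the true one, the learned circle can only err on the annulus between r̂ and r*, which is the region the sample-size bound above controls.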

 

2. Devise an efficient mistake-bound learner for the concept class of k-term DNF over X = {0, 1}^d. The runtime and mistake bound of your algorithm should both be polynomial in d; you may treat k as a constant.
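One classical route (a sketch, not necessarily the intended solution) is the reduction from k-term DNF to k-CNF: every k-term DNF can be rewritten as a conjunction of clauses of at most k literals, and a conjunction over the polynomially many candidate clauses can be learned with the elimination algorithm. The class and function names below are our own:

```python
from itertools import combinations, product

def all_clauses(d, k):
    """Every disjunction of at most k literals over d Boolean variables.
    A literal (i, b) is true on x when x[i] == b; a clause (a frozenset
    of literals) is true when any of its literals is."""
    lits = [(i, b) for i in range(d) for b in (0, 1)]
    return {frozenset(c) for size in range(1, k + 1)
            for c in combinations(lits, size)}

class KCNFEliminator:
    """Mistake-bound learner for k-term DNF via the k-CNF reduction:
    start with all candidate clauses, predict 1 iff every surviving
    clause is satisfied, and on a false negative delete the clauses the
    positive example violates. Clauses of the target k-CNF are true on
    every positive example and hence never deleted, so the learner makes
    no false positives and at most |candidate clauses| mistakes."""
    def __init__(self, d, k):
        self.clauses = all_clauses(d, k)

    @staticmethod
    def _clause_true(clause, x):
        return any(x[i] == b for (i, b) in clause)

    def predict(self, x):
        return int(all(self._clause_true(c, x) for c in self.clauses))

    def update(self, x, y):
        if y == 1 and self.predict(x) == 0:
            self.clauses = {c for c in self.clauses
                            if self._clause_true(c, x)}

# Demo on the 2-term DNF (x0 AND x1) OR x2, which equals the 2-CNF
# (x0 OR x2) AND (x1 OR x2), over d = 3 variables.
target = lambda x: int((x[0] and x[1]) or x[2])
learner = KCNFEliminator(d=3, k=2)
mistakes = 0
for _ in range(3):  # a few passes over the Boolean cube
    for x in product((0, 1), repeat=3):
        if learner.predict(x) != target(x):
            mistakes += 1
        learner.update(x, target(x))
```

The number of candidate clauses is at most (2d)^k (plus smaller clauses), so for constant k both the mistake bound and the per-round runtime are polynomial in d.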

 

 

3. Let X = {0, 1}^d and consider PAC learning a finite concept class C. Assume that the inputs are drawn i.i.d. from an unknown distribution P over X, and the labels are generated via the rule Y = c(X) for some c ∈ C.

 

Let’s call this problem the “clean” problem; so, in the clean problem, the training sample consists of random examples of the form (X, Y) for X ∼ P and Y = c(X).

 

Next, consider the following “corrupted” problem: each time we request a random example (X, Y), with probability α(X) ∈ [0, 1] the value of the label Y is flipped. Call the resulting label Ỹ. Thus,

Ỹ = { 1 − Y   with probability α(X)
    { Y       with probability 1 − α(X)

 

In the corrupted problem, the examples are of the form (X, Ỹ), and so the labels are noisy.


 


(a) Using c and α, derive an expression for the Bayes classifier for the corrupted problem.

 

(b) For the remaining questions, assume that α(x) = 1/4 for all x ∈ X. What is the Bayes classifier for the corrupted problem?

 

(c) What is the Bayes risk for the corrupted problem?

 

(d) Let c_ε ∈ C be a hypothesis for which Pr(c_ε(X) ≠ c(X)) = ε > 0. What is the risk (expected zero-one loss) of c_ε for the corrupted problem?

 

 

(e) Design an algorithm for PAC learning C given access only to corrupted labeled examples (X_1, Ỹ_1), . . . , (X_n, Ỹ_n). That is, your algorithm should, with probability at least 1 − δ, output a concept f̂ ∈ C for which Pr_{X∼P}[f̂(X) ≠ c(X)] ≤ ε. Your algorithm should be statistically efficient (you should mention the sample size n required, and n should be polynomial in 1/ε and 1/δ), but it need not be computationally efficient. Please explain why your algorithm is correct.
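As an empirical sanity check for this setting (a sketch only, not necessarily the intended construction): with a constant flip probability of 1/4, minimizing the empirical error against the corrupted labels over a finite class still tends to recover the target concept. The toy class of single-coordinate “dictator” concepts below is a stand-in of our own; any finite C works the same way.

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny finite class over X = {0,1}^d: one-coordinate "dictator"
# concepts c_i(x) = x_i (an illustrative stand-in for C).
d = 4
concepts = [lambda X, i=i: X[:, i] for i in range(d)]
c_true = 2  # index of the target concept in the class

# Draw a clean sample, then corrupt each label independently
# with probability alpha(x) = 1/4.
n = 2000
X = rng.integers(0, 2, size=(n, d))
Y = concepts[c_true](X)
flip = rng.random(n) < 0.25
Y_tilde = np.where(flip, 1 - Y, Y)  # corrupted labels

# ERM on the corrupted sample: pick the concept with the fewest
# disagreements with the noisy labels.
errs = [np.mean(f(X) != Y_tilde) for f in concepts]
f_hat = int(np.argmin(errs))
```

With n this large, the empirical corrupted error of the target concentrates near its (strictly smallest) population value, so the minimizer coincides with the target with overwhelming probability.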
