IE 468: Optimization Methods and Models (Spring 2019)
Project (40%) Uday V. Shanbhag
Project requirements: The project is an integral part of IE 468, focusing on the development of several optimization algorithms in the context of continuous optimization. The deliverable is a typed report (with m-files and iteration logs in the appendix) that contains the following.
Outline of report.
Section I: Unconstrained optimization.
(i) A description of each of the four algorithms: (a) Gradient method with exact line search; (b) Gradient method with backtracking line search; (c) Diagonally scaled gradient method with exact/backtracking line search; (d) Hybrid Newton method (most of this is already done).
(ii) Each description should briefly summarize the method and provide statements regarding both the convergence of the iterates and the associated rate of convergence (if any).
(iii) Finally, each algorithm should be applied to Exercise 4.3 (quadratic minimization) and Example 5.8 (Rosenbrock's function), and the report should contain a log of iterations in which f(x_k) and ∇f(x_k) are tabulated. In addition, the termination criteria, the parameters in the line search, etc. should be specified. (Two illustrative MATLAB sketches follow this list.)
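As a starting point for item (a), the following is a minimal MATLAB sketch of the gradient method with exact line search on a quadratic. For f(x) = (1/2)x'Qx − b'x with Q symmetric positive definite and g_k = ∇f(x_k) = Qx_k − b, the exact stepsize along −g_k is t_k = (g_k'g_k)/(g_k'Qg_k). The matrix Q, vector b, starting point, and tolerance below are assumed placeholders, not the data of Exercise 4.3; substitute the exercise data and report the choices actually used.

% Gradient method with exact line search for f(x) = 0.5*x'*Q*x - b'*x.
% Q, b, x0, and tol are assumed placeholders; replace with the data of Exercise 4.3.
Q = [5 2; 2 1];          % example symmetric positive definite matrix (assumed)
b = [1; 3];              % example right-hand side (assumed)
f     = @(x) 0.5*x'*Q*x - b'*x;
gradf = @(x) Q*x - b;

x = zeros(2,1);  tol = 1e-6;  maxit = 1000;
fprintf('%6s %16s %16s\n', 'k', 'f(x_k)', '||grad f(x_k)||');
for k = 0:maxit
    g = gradf(x);
    fprintf('%6d %16.8e %16.8e\n', k, f(x), norm(g));   % iteration log
    if norm(g) <= tol, break; end
    t = (g'*g)/(g'*Q*g);   % exact minimizer of f(x - t*g) over t >= 0
    x = x - t*g;
end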
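For items (b) and (iii), a similar sketch applies the gradient method with backtracking (Armijo) line search to Example 5.8 (Rosenbrock's function). The line-search parameters s, alpha, and beta, the tolerance, and the starting point are assumed values; whatever values are actually used must be stated in the report along with the termination criterion.

% Gradient method with backtracking (Armijo) line search on Rosenbrock's function.
% s, alpha, beta, tol, and the starting point are assumed values; report your own choices.
f     = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
gradf = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); 200*(x(2) - x(1)^2)];

x = [-1.2; 1];  s = 1;  alpha = 0.25;  beta = 0.5;  tol = 1e-5;  maxit = 20000;
fprintf('%6s %16s %16s\n', 'k', 'f(x_k)', '||grad f(x_k)||');
for k = 0:maxit
    g = gradf(x);
    fprintf('%6d %16.8e %16.8e\n', k, f(x), norm(g));   % iteration log
    if norm(g) <= tol, break; end
    t = s;
    while f(x - t*g) > f(x) - alpha*t*(g'*g)   % backtrack until sufficient decrease holds
        t = beta*t;
    end
    x = x - t*g;
end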
Section II: Constrained convex optimization.
(i) A description of the projected gradient method (Pg. 175, Eq. (9.15)).
(ii) The description should briefly summarize the method and provide statements regarding both the convergence of the iterates and the associated rate of convergence (if any).
(iii) Finally, this algorithm should be applied to three problems: 8.4(i), 8.4(ii), and 8.5. The report should contain a log of iterations in which f(x_k) and x_k − Π_C(x_k − γ∇f(x_k)) (for any γ > 0) are tabulated. In addition, the termination criteria, the parameters in the line search, etc. should be specified. (An illustrative MATLAB sketch follows this list.)
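As a starting point for Section II, a minimal MATLAB sketch of the projected gradient iteration x_{k+1} = Π_C(x_k − t∇f(x_k)) is given below. The quadratic objective, the box constraint set C = [0,1]^2, the constant stepsize t, and γ are assumed placeholders and are not the data of Problems 8.4(i), 8.4(ii), or 8.5; only the structure of the loop and of the tabulated residual x_k − Π_C(x_k − γ∇f(x_k)) carries over, and the stepsize rule should follow Eq. (9.15).

% Projected gradient method: x_{k+1} = P_C(x_k - t*grad f(x_k)) with a constant stepsize.
% The objective, set C, stepsize t, and gamma below are assumed placeholders; substitute
% the data of Problems 8.4(i), 8.4(ii), and 8.5 and the stepsize rule of Eq. (9.15).
Q = [2 0; 0 4];  b = [1; 1];
f     = @(x) 0.5*x'*Q*x - b'*x;
gradf = @(x) Q*x - b;
projC = @(x) min(max(x, 0), 1);    % projection onto the assumed box C = [0,1]^2

x = [1; 0];  t = 0.2;  gamma = 1;  tol = 1e-8;  maxit = 1000;
fprintf('%6s %16s %16s\n', 'k', 'f(x_k)', 'residual');
for k = 0:maxit
    g   = gradf(x);
    res = norm(x - projC(x - gamma*g));   % norm of the tabulated optimality residual
    fprintf('%6d %16.8e %16.8e\n', k, f(x), res);
    if res <= tol, break; end
    x = projC(x - t*g);                   % projected gradient step
end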
Submission deadline: December 13th.