
MATH60005/70005: Optimisation (Autumn 23-24)

Coursework 1: Unconstrained Optimisation

Instructions: read this first!

  • This coursework has a total of 20 marks and accounts for 10% of the module.

  • Students who want to take the final exam (90%) must submit this coursework.

  • Submission deadline: November 17th, 13:00 UK time, via Blackboard drop box.

  • Submit a single file, ideally a PDF typeset in LaTeX or similar.

  • Marking criteria: Full marks will be awarded for work that 1) is mathematically correct, 2) shows an understanding of material presented in lectures, 3) gives details of all calculations and reasoning, and 4) is presented in a logical and clear manner.

  • Do not discuss your answers publicly via our forum. If you have any queries regarding your interpretation of the questions, please contact the lecturer at dkaliseb@imperial.ac.uk

  • Beware of plagiarism regulations. This is a group-based assessment with groups of 1 to 3 students. Make a single submission for the coursework, indicating on the front page the CID of every group member. Do not include your name.

Questions

Part I: Unconstrained Optimisation (6 marks)

Let f(x, y) = x^2 − 2xy^2 + 12y^4.

i) [3 marks] Is the function f coercive? Explain your answer.

ii) [3 marks] Find the stationary points of f and classify them according to whether they are saddle points, strict/nonstrict local/global minimum/maximum points.

Part II: Linear Least Squares - Denoising (8 marks)

You are given the noisy signal s ∈ R^1000 shown in Figure 1 (corresponding to the file signal.mat), and the goal is to use linear least squares to denoise it.

[Figure 1: A noisy signal composed of piecewise constant components.]

i) [3 marks] Write a regularised least squares formulation for the denoising problem of the form

    min_{x ∈ R^1000} ∥Ax − b∥^2 + λ∥Lx∥^2,    λ > 0,

where ∥Lx∥^2 = Σ_{i=1}^{999} (x_i − x_{i+1})^2. Give precise definitions for A, b, and L. Cast an equivalent ordinary least squares problem of the form

    min_{x ∈ R^1000} ∥Âx − b̂∥^2,

giving precise definitions for Â and b̂. Plot the solution of the regularised least squares problem for values of λ = 0.1, 1, 10, 100, 1000. Describe what you observe in terms of denoising level.
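As a sanity check for this part, a minimal Python sketch is given below. It assumes that signal.mat stores the signal under the variable name s, and it uses one natural identification (A = I, b = s, and L the 999 × 1000 difference matrix); check both assumptions against your own definitions from part i).

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import loadmat

# Assumption: signal.mat stores the noisy signal under the variable name "s".
s = np.ravel(loadmat("signal.mat")["s"])
n = s.size  # 1000

# Assumed identification: A = I, b = s, and L the 999 x 1000 difference matrix
# with rows (..., 1, -1, ...), so that (Lx)_i = x_i - x_{i+1}.
L = np.eye(n - 1, n) - np.eye(n - 1, n, k=1)

for lam in [0.1, 1, 10, 100, 1000]:
    # Equivalent ordinary least squares: stack A on top of sqrt(lambda) * L
    # and pad b with zeros, then solve min ||A_hat x - b_hat||^2.
    A_hat = np.vstack([np.eye(n), np.sqrt(lam) * L])
    b_hat = np.concatenate([s, np.zeros(n - 1)])
    x, *_ = np.linalg.lstsq(A_hat, b_hat, rcond=None)
    plt.plot(x, label=f"lambda = {lam}")

plt.plot(s, alpha=0.3, label="noisy signal")
plt.legend()
plt.show()
```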

ii) We will now develop an alternative solution to the denoising problem. We consider instead the problem

    min_{x ∈ R^1000} ∥Âx − b̂∥_1,

with the same Â and b̂ as before, and where ∥x∥_1 = Σ_i |x_i|. Note that this problem is nonsmooth and we cannot directly apply least squares or gradient-based techniques.

iia) [3 marks] We will construct an algorithm to approximate the solution of the problem above. For this, we consider the weighted least squares problem

    min_{x ∈ R^1000} Σ_{i=1}^{1999} w_i |â_i^T x − b̂_i|^2,

where â_i denotes the i-th row of the matrix Â and w ∈ R^1999 is a given weight vector with positive entries. Proceeding similarly as in the linear least squares problem, derive an expression for the solution of this problem in terms of Â, b̂, and W, which is a diagonal matrix with w on the diagonal. We call this solution x* = x*(Â, b̂, W).
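For reference, the same reasoning as in the ordinary least squares derivation (write the objective as a quadratic form and set its gradient to zero) leads to the weighted normal equations; the following is a sketch of the kind of expression expected here, assuming Â^T W Â is invertible:

    Σ_{i=1}^{1999} w_i |â_i^T x − b̂_i|^2 = (Âx − b̂)^T W (Âx − b̂)   ⟹   Â^T W Â x = Â^T W b̂,

so that x*(Â, b̂, W) = (Â^T W Â)^{-1} Â^T W b̂.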


iib) [2 marks] We now construct the following algorithm:

Step 0: Set a vector w^0 = 1 ∈ R^1999, a vector full of 1's. Set k = 0, and a tolerance δ.
Step 1: Solve x^k = x*(Â, b̂, W_k).
Step 2: Update the weights according to

    w_i^{k+1} = 1 / max{ δ, |â_i^T x^k − b̂_i| },    i = 1, ..., 1999.

Step 3: Set k = k + 1 and go back to Step 1.

Plot the final solution x* of this algorithm for δ = 10^{-5}, 100 iterations, and values of λ = 0.1, 1, 10, 100, 1000. This iterative procedure approximates the solution of the problem

    min_{x ∈ R^1000} ∥Âx − b̂∥_1.

Discuss the differences you observe in your results with respect to the regularised least squares from part i).
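A minimal Python sketch of the iteration from part iib) is given below, assuming Â and b̂ have been built as in part i) and that Â^T W Â stays invertible (as is the case for the stacked construction sketched after part i)). The solution for each λ is then obtained by calling irls on the corresponding Â and b̂.

```python
import numpy as np

def irls(A_hat, b_hat, delta=1e-5, iters=100):
    """Iteratively reweighted least squares for min ||A_hat x - b_hat||_1,
    following Steps 0-3 of part iib). A sketch, not a reference implementation."""
    m = A_hat.shape[0]
    w = np.ones(m)                       # Step 0: w^0 = 1, k = 0
    for _ in range(iters):
        W = np.diag(w)
        # Step 1: weighted least squares solution x^k = x*(A_hat, b_hat, W_k)
        x = np.linalg.solve(A_hat.T @ W @ A_hat, A_hat.T @ W @ b_hat)
        # Step 2: reweight using the residuals |a_i^T x^k - b_i|
        r = np.abs(A_hat @ x - b_hat)
        w = 1.0 / np.maximum(delta, r)   # Step 3: k <- k + 1 and repeat
    return x
```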

Part III: Gradient Descent (6 marks)

Consider the problem

    (MQ)    min_{x ∈ R^n} f(x) := Σ_{i=1}^{m} √( (a_i^T x − b_i)^2 + η^2 ),

where a_i ∈ R^n, b_i ∈ R, i = 1, 2, ..., m, and η > 0. If we define A ∈ R^{m×n} as the matrix with rows a_1^T, ..., a_m^T, the vector b ∈ R^m with entries b_1, ..., b_m, and the function g : R^m → R as g(y) = Σ_{i=1}^{m} √( y_i^2 + η^2 ), then

    ∇f(x) = A^T ∇g(Ax − b),  and  ∇²f(x) = A^T ∇²g(Ax − b) A.
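For concreteness, and assuming g as written above, the componentwise derivatives of g (worth re-deriving as part of your answer) are

    (∇g(y))_i = y_i / √( y_i^2 + η^2 ),    ∇²g(y) = diag( η^2 / (y_i^2 + η^2)^{3/2} ),    i = 1, ..., m,

and each diagonal entry of ∇²g(y) lies in the interval (0, 1/η].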

i) [2 marks] Show that ∇²f(x) ⪰ 0 for all x ∈ R^n.

Choose only one of the following two questions:

iiA) [4 marks] Show that if A has full column rank, then there exists a unique global minimiser.

Hint: for existence of a minimiser, you can use that for any z ∈ R^m, it holds that ∥z∥_1 ≥ ∥z∥_2. For uniqueness, use the following result: if a function f is such that ∇²f(x) ≻ 0 for all x ∈ R^n, then a stationary point of f is necessarily a strict global minimum.

iiB) [4 marks] Show that f ∈ C^{1,1}_L with L = ∥A∥^2 / η.

Hint: you can use that for two matrices A and B of compatible dimensions, it holds that ∥AB∥ ≤ ∥A∥ ∥B∥.
