
Combinatorial Optimisation 2022-2023 Computer Assignment 2: Set covering problem and Lagrangian relaxation



Important information: read carefully!

Requirements
• An understanding of Lagrangian relaxation and subgradient optimisation.
• This assignment requires programming in Matlab or Octave.

Additional instructions for your code
• Your code should be valid Matlab 2022 or Octave 4.4.1 code.
• Your code should include (brief) comments such that it is clear what each part does.
• Each file should contain exactly one function and should call functions of other parts of the exercise when appropriate.

Codegrade scores for theory questions
• Check the Codegrade output by clicking on your submission score.
• For the theory questions, your score is either 0% or 100% until the official grading.
• A score of 100% indicates that your answers are parsed correctly. After the deadline, your answers will be graded.
• A score of 0% indicates that your answers cannot be parsed correctly. If the issue cannot be resolved, contact the staff member responsible for grading the assignment.

Codegrade scores for code
• Check the Codegrade output by clicking on your submission score.
• For your Matlab/Octave code, your score is a preliminary assessment of your code and will be between 0% and 100%. After the deadline, your code will be fully graded.
• A score lower than 100% implies that your code contains flaws.
• A score of 100% does not imply that your answer is flawless; your final grade may be lower.

Exercise 1: Deriving lower and upper bounds for the set covering problem by using Lagrangian relaxation

Consider the set covering problem (for more details we refer to pages 17–19 of the slides on Formulations). The problem can be formulated as

minimise    c_1 x_1 + … + c_n x_n
subject to  a_i1 x_1 + … + a_in x_n ≥ 1   for all i ∈ {1, …, m},   (1)
            x_j ∈ {0, 1}                  for all j ∈ {1, …, n}.

As follows from the formulation, a problem instance is given by
• the dimensions n, m ∈ N_{≥1},
• a strictly positive cost vector c ∈ R^n_{>0},
• an adjacency matrix A ∈ B^{m×n} with elements a_ij for i ∈ {1, …, m}, j ∈ {1, …, n}.

Furthermore, a solution is given by the vector x ∈ B^n and the optimal objective value is z* ∈ R.

We will apply Lagrangian relaxation by dualizing the constraints (1). In this assignment we solve the Lagrangian dual problem by subgradient optimisation, resulting in a lower bound on the optimal objective value z*. Moreover, we derive an upper bound on z* by using a Lagrangian heuristic.
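Concretely (a standard derivation, assuming the covering constraints (1) read Σ_j a_ij x_j ≥ 1), dualizing with multipliers λ ∈ R^m, λ ≥ 0, gives

```latex
L(\lambda) \;=\; \min_{x \in \{0,1\}^n} \sum_{j=1}^{n} c_j x_j
  + \sum_{i=1}^{m} \lambda_i \Bigl(1 - \sum_{j=1}^{n} a_{ij} x_j\Bigr)
  \;=\; \sum_{i=1}^{m} \lambda_i
  + \min_{x \in \{0,1\}^n} \sum_{j=1}^{n} \Bigl(c_j - \sum_{i=1}^{m} \lambda_i a_{ij}\Bigr) x_j ,
```

so L(λ) ≤ z* for every λ ≥ 0, and maximising L(λ) over λ ≥ 0 (the Lagrangian dual) yields the best such lower bound. Note that the inner minimisation separates per variable x_j.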

In your implementation, keep the following points in mind:
• Write each function in a separate file.
• Do not use nested functions, anonymous functions, or subfunctions.
• Call functions written for other parts of the exercise when appropriate.
• All vectors that are input or output should be implemented as column vectors.
• All input or output should use double-precision floating-point numbers (not logicals or integers).
• Use the instances stored in small_instance.mat and large_instance.mat found on Canvas to call, test, and debug your functions.

a. Create a function

function [obj_lagrange, x_lagrange] = Lagrangian(c, A, lambda)

with
c : cost vector c of the instance,
A : adjacency matrix A of the instance,
lambda : vector of Lagrangian multipliers,
x_lagrange : optimal solution of the Lagrangian relaxation for lambda (refer to Exercise 4a in Set 3), with any ties set to zero,
obj_lagrange : optimal objective value of the Lagrangian relaxation for lambda.
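The hand-in code must be Matlab/Octave; purely as an illustration of the logic of Part a, here is a hedged Python sketch (the name lagrangian_relax and the use of NumPy are illustrative assumptions, not the required interface). For fixed λ ≥ 0 the relaxation separates per variable: set x_j = 1 exactly when the reduced cost c_j − Σ_i λ_i a_ij is strictly negative, so ties go to zero as required.

```python
import numpy as np

def lagrangian_relax(c, A, lam):
    """Solve the Lagrangian relaxation of set covering for fixed multipliers.

    The relaxed objective is sum_j (c_j - sum_i lam_i a_ij) x_j + sum_i lam_i,
    so each x_j is chosen independently: 1 iff its reduced cost is strictly
    negative (a tie, reduced cost 0, is therefore set to zero as required).
    """
    reduced = c - A.T @ lam            # reduced cost of every column
    x = (reduced < 0).astype(float)    # strict '<' sends ties to zero
    obj = reduced @ x + lam.sum()      # value of the Lagrangian relaxation
    return obj, x
```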

Note: as specified above, of all optimal solutions of the Lagrangian relaxation for lambda, you must construct the optimal solution x_lagrange with as many elements as possible set to zero.

b. Create a function

function [obj_feas, x_feas] = InfeasToFeas(c, A, x_infeas)

with
c : cost vector c of the instance,
A : adjacency matrix A of the instance,
x_infeas : potentially infeasible primal solution,
x_feas : feasible primal solution constructed by applying the Greedy heuristic of Exercise 4c in Set 3 to x_infeas, where the smallest appropriate index is chosen in case of ties,
obj_feas : primal objective value corresponding to x_feas.

Note: as specified above, if during the Greedy heuristic there are multiple choices, choose
from these the one with the smallest index.
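Again as a Python illustration only: since Exercise 4c of Set 3 is not reproduced here, the greedy rule below (minimum cost per newly covered row) is an assumed stand-in, and your hand-in must follow the exercise's rule exactly. The tie rule, however, matches the note above: np.argmin returns the smallest index among ties.

```python
import numpy as np

def infeas_to_feas(c, A, x_infeas):
    """Repair a (possibly infeasible) 0-1 vector into a feasible cover.

    Assumed greedy rule (stand-in for Exercise 4c, Set 3): repeatedly add the
    column with the smallest cost per newly covered row; np.argmin breaks
    ties by the smallest index, as the assignment requires.
    """
    x = x_infeas.copy()
    covered = A @ x >= 1
    while not covered.all():
        # Number of still-uncovered rows each column would newly cover.
        new_rows = A[~covered, :].sum(axis=0)
        ratio = np.where(new_rows > 0, c / np.maximum(new_rows, 1), np.inf)
        x[int(np.argmin(ratio))] = 1.0
        covered = A @ x >= 1
    return c @ x, x
```

An already-feasible x_infeas is returned unchanged, so the heuristic only ever adds columns.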
c. Create a function

function lambda_next = UpdateLambda(A, lambda, LB, UB, x_lagrange, rho)

with
A : adjacency matrix A of the instance,
lambda : vector of Lagrangian multipliers,
LB : lower bound on the optimal primal objective value z*,
UB : upper bound on the optimal primal objective value z*,
x_lagrange : optimal solution of the Lagrangian relaxation for lambda,
rho : scaling factor ρ to be used in the subgradient step; refer to pages 36–38 of the slides on Relaxations and Exercise 7 in Set 3,
lambda_next : new vector of Lagrangian multipliers constructed by applying the subgradient step; refer to pages 36–38 of the slides on Relaxations and Exercise 7 in Set 3.

d. Create a function

function [LB_best, UB_best, x_best, LB_list, UB_list] = SubgradientOpt(c, A, lambda_init, rho_init, k)

with
c : cost vector c of the instance,
A : adjacency matrix A of the instance,
lambda_init : initial vector of Lagrangian multipliers to be used,
rho_init : initial scaling factor ρ to be used in the subgradient step,
k : (strictly positive) number of subgradient iterations to perform,
LB_best : best found lower bound on the optimal primal objective value z*,
UB_best : best found upper bound on the optimal primal objective value z*,
x_best : best found feasible primal solution,
LB_list : vector of found lower bounds on the optimal primal objective value z* for each of the k iterations,
UB_list : vector of found upper bounds on the optimal primal objective value z* for each of the k iterations.

The SubgradientOpt function should use the functions of Parts a, b, and c, so do not replicate code. In particular, in a single iteration first a new lower and upper bound is computed, and then the vector of Lagrangian multipliers is updated. Furthermore, in each iteration UpdateLambda of Part c should be called with
• the current vector of Lagrangian multipliers,
• the current(!) lower bound on the optimal primal objective value z*,
• the best(!) upper bound on the optimal primal objective value z*,
• the current optimal solution of the Lagrangian relaxation for lambda,
• rho_init divided by the current iteration number, where the iteration number runs from 1 up to and including k.

e. Write a function that shows the graph of both the current and the best lower and upper bounds found in each iteration. (You do not hand in this function.)

f. Derive a lower and upper bound for the small problem instance in small_instance.mat by applying the subgradient optimisation of Part d with lambda_init = (1, …, 1), rho_init = 1/2, and k = 50. Plot the corresponding graph using Part e. Observe and interpret the results. (You do not hand in these results; use them for Part h.)

g. Derive a lower and upper bound for the large problem instance in large_instance.mat by applying the subgradient optimisation of Part d with lambda_init = (1, …, 1), rho_init = 100, and k = 1000. Plot the corresponding graph using Part e. Observe and interpret the results. (You do not hand in these results; use them for Part h.)

h. Use the previous parts to answer the multiple-choice questions in the online web-form.
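Part e's plot is not handed in; as an illustrative Python sketch (assuming LB_list and UB_list from Part d, and matplotlib for the actual drawing), the "best bound so far" curves are simply a running maximum of the lower bounds and a running minimum of the upper bounds:

```python
import numpy as np

def best_bound_curves(LB_list, UB_list):
    """Best bound after each iteration: running max of the lower bounds,
    running min of the upper bounds."""
    LB = np.asarray(LB_list)
    UB = np.asarray(UB_list)
    return np.maximum.accumulate(LB), np.minimum.accumulate(UB)

def plot_bounds(LB_list, UB_list):
    """Show current vs best bounds per iteration (matplotlib assumed installed)."""
    import matplotlib.pyplot as plt  # local import: the helper above works without it
    LB_best, UB_best = best_bound_curves(LB_list, UB_list)
    it = np.arange(1, len(LB_list) + 1)
    plt.plot(it, LB_list, label="current LB")
    plt.plot(it, LB_best, label="best LB")
    plt.plot(it, UB_list, label="current UB")
    plt.plot(it, UB_best, label="best UB")
    plt.xlabel("iteration")
    plt.ylabel("bound")
    plt.legend()
    plt.show()
```

In a typical run you should see the best-LB curve climb and flatten while the best-UB curve steps down, with the gap between them bounding how far the heuristic solution can be from z*.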
