# Algorithm Design Laboratory with Applications

## Schedule

- Monday 8:30 - 11:30. Room A1.2 C1.16.
- Thursday 14:30 - 16:30. Room A1.2.

**Office hours:** Thursday 16:30 - 18:30. Please send me an email or ask before/after the lectures.

## Lectures and Material

### Introduction and Prefix Sums

Course overview.

Introduction to laboratory exercises: exercise format; writing, compiling, evaluating, and debugging a solution. Programming tips, assertions.

The STL library: overview, basic types, containers (arrays, vectors, deques, queues, stacks, (multi-)sets, priority queues), iterators, algorithms (heaps and sorting, linear search, binary search, deleting and replacing elements).

The prefix sums technique. Counting the number of contiguous subsequences with even total sum: cubic-, quadratic-, and linear-time solutions.
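The linear-time solution can be sketched as follows (a minimal sketch; the function name is my own). Two prefix sums delimit an even-sum subsequence exactly when they have the same parity, so it suffices to count prefixes of each parity, including the empty prefix:

```cpp
#include <vector>

// Counts contiguous subsequences with even total sum in O(n) time.
// With e prefix sums of even parity (the empty prefix included) and
// o of odd parity, the answer is C(e,2) + C(o,2).
long long countEvenSumSubsequences(const std::vector<int>& a) {
    long long even = 1, odd = 0;   // the empty prefix has sum 0, which is even
    int parity = 0;
    for (int x : a) {
        parity ^= (x & 1);         // parity of the current prefix sum
        (parity == 0 ? even : odd)++;
    }
    return even * (even - 1) / 2 + odd * (odd - 1) / 2;
}
```

For example, on the sequence 1, 2, 3 the even-sum subsequences are (2) and (1, 2, 3), so the function returns 2.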

#### Material

### Sorting, Binary Searching, and Sliding Window

Sorting and binary searching with predicates. Computing good upper bounds using exponential search.
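One possible sketch of exponential search over a monotone predicate (the function name is my own; the predicate is assumed to be false up to some point and true afterwards, and to eventually become true):

```cpp
#include <functional>

// Returns the smallest non-negative n with p(n) == true, assuming p is
// monotone (false ... false true ... true) and eventually true.
// The doubling phase finds an upper bound in O(log answer) steps; a
// binary search then pins down the answer inside [lo, hi].
long long firstTrue(const std::function<bool(long long)>& p) {
    long long hi = 1;
    while (!p(hi)) hi *= 2;            // exponential phase: bracket the answer
    long long lo = hi / 2;             // invariant from here on: p(hi) is true
    while (lo + 1 < hi) {
        long long mid = lo + (hi - lo) / 2;
        (p(mid) ? hi : lo) = mid;
    }
    return p(lo) ? lo : hi;
}
```

For instance, `firstTrue([](long long n) { return n * n >= 50; })` evaluates the predicate only O(log 8) times instead of scanning linearly.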

The sliding window technique. Example with cubic-, quadratic-, almost linear-, and linear-time solutions.
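A minimal sketch of the linear-time sliding window (the concrete problem and function name are my own choices): the longest contiguous run with total at most k, assuming non-negative values. Each index enters and leaves the window at most once, hence O(n) overall:

```cpp
#include <algorithm>
#include <vector>

// Longest contiguous subsequence with sum <= k, for non-negative values.
int longestWindowAtMost(const std::vector<int>& a, long long k) {
    int best = 0, l = 0;
    long long sum = 0;
    for (int r = 0; r < (int)a.size(); ++r) {
        sum += a[r];                    // extend the window to the right
        while (sum > k) sum -= a[l++];  // shrink from the left until feasible
        best = std::max(best, r - l + 1);
    }
    return best;
}
```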

#### Material

### Greedy algorithms

Greedy algorithms. Interval scheduling: problem definition, the earliest finish time algorithm, proof of correctness (greedy stays ahead).
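The earliest finish time algorithm can be sketched as follows (the function name and interval representation are my own):

```cpp
#include <algorithm>
#include <climits>
#include <utility>
#include <vector>

// Earliest-finish-time greedy for interval scheduling: sort by finishing
// time, then repeatedly take the first interval compatible with the last
// one chosen. Returns the maximum number of pairwise disjoint intervals.
// O(n log n) overall, dominated by the sort.
int maxDisjointIntervals(std::vector<std::pair<int, int>> intervals) {
    // Each pair is (start, finish); touching endpoints are compatible.
    std::sort(intervals.begin(), intervals.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });
    int count = 0, lastFinish = INT_MIN;
    for (const auto& [s, f] : intervals)
        if (s >= lastFinish) { ++count; lastFinish = f; }
    return count;
}
```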

Interval partitioning: problem definition, greedy algorithm, proof of correctness (using structural properties).

Scheduling jobs with deadlines to minimize lateness. The Earliest Deadline First algorithm. Proof of correctness through an exchange argument.
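A minimal sketch of Earliest Deadline First (the function name and job representation are my own): schedule the jobs back to back in order of non-decreasing deadline; the exchange argument shows no other order achieves smaller maximum lateness.

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Each job is (processing time, deadline); all jobs start at time 0.
// Returns the minimum achievable maximum lateness (0 if all deadlines
// can be met). O(n log n) from the sort.
long long minMaxLateness(std::vector<std::pair<long long, long long>> jobs) {
    std::sort(jobs.begin(), jobs.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });
    long long time = 0, lateness = 0;
    for (const auto& [proc, deadline] : jobs) {
        time += proc;                                   // this job finishes here
        lateness = std::max(lateness, time - deadline); // its lateness
    }
    return lateness;
}
```

On the six-job instance from Section 4.2 of [KT] (processing times 3, 2, 1, 4, 3, 2 with deadlines 6, 8, 9, 9, 14, 15) the optimal maximum lateness is 1.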

#### Material

- Sections 4.1 and 4.2 of [KT].
- Section 6.1 of [CLRS].
- Slides: greedy algorithms.

### Divide and Conquer, Memoization, and Dynamic Programming

The divide and conquer technique and the polynomial multiplication problem.

Recursion and memoization: computing Fibonacci numbers recursively in linear time.
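A minimal sketch of the memoized recursion (function names are my own): the naive recursion recomputes the same Fibonacci numbers exponentially often, while caching each value the first time it is computed brings this down to O(n) recursive calls.

```cpp
#include <vector>

// Memoized recursion: memo[n] == -1 means F(n) is not yet known.
long long fib(int n, std::vector<long long>& memo) {
    if (n <= 1) return n;
    if (memo[n] != -1) return memo[n];               // already computed: reuse
    return memo[n] = fib(n - 1, memo) + fib(n - 2, memo);
}

long long fib(int n) {
    std::vector<long long> memo(n + 1, -1);
    return fib(n, memo);
}
```

`fib(50)` now returns instantly, whereas the unmemoized recursion would make billions of calls.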

Introduction to dynamic programming. A trivial example: computing Fibonacci numbers iteratively. The Longest Increasing Subsequence problem: an O(n^2) algorithm and an O(n log n) algorithm.
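The O(n log n) algorithm for the Longest Increasing Subsequence problem can be sketched as follows (the function name is my own): `tails[k]` holds the smallest possible last element of an increasing subsequence of length k + 1; since `tails` stays sorted, each element is placed with one binary search.

```cpp
#include <algorithm>
#include <vector>

// Length of the longest strictly increasing subsequence in O(n log n).
int longestIncreasingSubsequence(const std::vector<int>& a) {
    std::vector<int> tails;
    for (int x : a) {
        auto it = std::lower_bound(tails.begin(), tails.end(), x);
        if (it == tails.end()) tails.push_back(x);  // x extends the longest LIS
        else *it = x;                               // x improves an existing tail
    }
    return (int)tails.size();
}
```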

#### Material

- Section 5.5 of [KT], where the divide and conquer technique is applied to integer multiplication.
- Section 6.1 of [KT], where memoization is applied to weighted interval scheduling.
- Exercise 17 in Chapter 6 of [KT] asks to solve the longest increasing subsequence problem.
- Exercises 15.4-5 and 15.4-6 of [CLRS] ask to solve the longest increasing subsequence problem in O(n^2) and O(n log n) time, respectively.
- Section 2 of M. L. Fredman, *On computing the length of longest increasing subsequences*.
- Slides: divide and conquer, memoization, dynamic programming.

### More Dynamic Programming

Maximum-weight independent set on paths (linear-time algorithm), maximum-weight independent set on trees (linear-time algorithm), and maximum-weight independent set on trees with cardinality constraints. Counting the number of ways to distribute a budget with stars and bars. A dynamic programming algorithm to optimally distribute a budget.
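The linear-time algorithm on paths can be sketched as follows (the function name is my own; weights are assumed non-negative): at each vertex either skip it, keeping the best solution up to the previous vertex, or take it, adding its weight to the best solution up to two vertices back.

```cpp
#include <algorithm>
#include <vector>

// Maximum-weight independent set on a path, in O(n) time and O(1) space.
long long maxWeightISOnPath(const std::vector<long long>& w) {
    long long skip = 0, take = 0;  // best excluding / including the last vertex
    for (long long x : w) {
        long long newTake = skip + x;   // taking x forbids the previous vertex
        skip = std::max(skip, take);
        take = newTake;
    }
    return std::max(skip, take);
}
```

On the path with weights 3, 10, 3 the optimum takes the middle vertex alone, for a total of 10.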

The minimum edit distance problem: definition, quadratic algorithm, techniques for reconstructing optimal solutions from the dynamic programming table.
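The quadratic algorithm can be sketched as follows (the function name is my own): dp[i][j] is the minimum number of insertions, deletions, and substitutions turning the first i characters of one string into the first j characters of the other.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Classic O(nm) edit distance via dynamic programming.
int editDistance(const std::string& a, const std::string& b) {
    int n = a.size(), m = b.size();
    std::vector<std::vector<int>> dp(n + 1, std::vector<int>(m + 1));
    for (int i = 0; i <= n; ++i) dp[i][0] = i;  // delete all of a's prefix
    for (int j = 0; j <= m; ++j) dp[0][j] = j;  // insert all of b's prefix
    for (int i = 1; i <= n; ++i)
        for (int j = 1; j <= m; ++j)
            dp[i][j] = std::min({dp[i - 1][j] + 1,  // delete a[i-1]
                                 dp[i][j - 1] + 1,  // insert b[j-1]
                                 dp[i - 1][j - 1] + (a[i - 1] != b[j - 1])});
    return dp[n][m];
}
```

Reconstructing an optimal alignment then amounts to walking back from dp[n][m], at each cell following the term that achieved the minimum.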

#### Material

- Exercise 1 and Solved Exercise 1 in Chapter 6 of [KT].
- Exercise 15-6 of [CLRS] and section "Maximum-Weight Independent Set on Trees" in Chapter 10.2 of [KT].
- Chapter 3.7 of [E].
- Slides: Independent Set and Edit Distance.

### Even More Dynamic Programming and the Split and List Technique

The Subset-Sum problem: definition and a dynamic programming algorithm. Pseudopolynomial-time algorithms.
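The pseudopolynomial dynamic programming algorithm can be sketched as follows (the function name is my own; item values are assumed positive):

```cpp
#include <vector>

// Pseudopolynomial subset-sum: reachable[s] becomes true if some subset
// of the items sums to s. Runs in O(n * target) time: polynomial in the
// numeric value of target, but exponential in its bit length.
bool subsetSum(const std::vector<int>& items, int target) {
    std::vector<bool> reachable(target + 1, false);
    reachable[0] = true;                     // the empty subset sums to 0
    for (int x : items)
        for (int s = target; s >= x; --s)    // backwards: each item used once
            if (reachable[s - x]) reachable[s] = true;
    return reachable[target];
}
```

Iterating s backwards is what prevents an item from being reused within its own update pass.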

See Chapter 3.8 of [E] and Chapter 6.4 of [KT]. The split and list technique: improving the trivial O*(2^n)-time algorithm for Subset-Sum to an O*(2^(n/2))-time algorithm. Techniques for generating all subset sums of a set: a trivial O(n 2^n) algorithm and an O(2^n) algorithm (see also Chapter 17.1 of [CLRS]).
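A minimal sketch of split and list for Subset-Sum (function names are my own): enumerate all subset sums of each half, 2^(n/2) per side, sort one side, and binary search it for the complement of each sum on the other side.

```cpp
#include <algorithm>
#include <vector>

// Generates all 2^|v| subset sums by doubling the list at each item,
// i.e. O(2^|v|) time rather than the trivial O(|v| 2^|v|) recomputation.
std::vector<long long> allSubsetSums(const std::vector<long long>& v) {
    std::vector<long long> sums{0};
    for (long long x : v) {
        size_t sz = sums.size();
        for (size_t i = 0; i < sz; ++i) sums.push_back(sums[i] + x);
    }
    return sums;
}

// Split and list: O*(2^(n/2)) instead of the trivial O*(2^n).
bool subsetSumSplitAndList(const std::vector<long long>& items, long long target) {
    std::vector<long long> left(items.begin(), items.begin() + items.size() / 2);
    std::vector<long long> right(items.begin() + items.size() / 2, items.end());
    std::vector<long long> rs = allSubsetSums(right);
    std::sort(rs.begin(), rs.end());         // sort one side once...
    for (long long s : allSubsetSums(left))  // ...then binary search it
        if (std::binary_search(rs.begin(), rs.end(), target - s)) return true;
    return false;
}
```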

The Knapsack problem: a dynamic programming algorithm with polynomial running time in the number of items and in the maximum weight (see Chapter 6.4 of [KT]). A dynamic programming algorithm with polynomial running time in the number of items and in the optimal value. A split and list algorithm (see also Chapter 9.1 of [F]).

The 1-in-3 Positive SAT problem: definition, improving the trivial O*(2^n)-time algorithm to an O*(2^(n/2))-time split and list algorithm. See Chapter 9.1 of [F].

#### Material

- Chapter 3.8 of [E].
- Chapter 6.4 of [KT].
- Chapter 17.1 of [CLRS].
- Chapter 9.1 of [F].
- Slides: Subset-Sum and 1-in-3 Positive SAT.
- Slides: Knapsack.

## Laboratory Problems

- **Asteroid mining:** statement, test-cases.
- **Number station:** statement, test-cases.
- **A massive bookwork:** statement, test-cases.
- **Mountain trip:** statement, test-cases.
- **Travelling salesman:** statement, test-cases.
- **Postal service:** statement, test-cases.
- **Lazy hikers:** statement, test-cases.
- **Two trees:** statement, test-cases.
- **HDMI cables:** statement, test-cases.
- **Tunnel:** statement, test-cases.
- **Souvenirs:** statement, test-cases.
- **Omicron persei 8:** statement, test-cases.

## References

- **[CLRS]**: Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein. *Introduction to Algorithms (3rd Edition)*. MIT Press. ISBN 978-0-262-53305-8.
- **[KT]**: Jon Kleinberg, Éva Tardos. *Algorithm Design*. Pearson. ISBN 0-321-29535-8.
- **[E]**: Jeff Erickson. *Algorithms*. ISBN 978-1-792-64483-2. Freely available under the Creative Commons Attribution 4.0 International License at the book website.
- **[F]**: Fedor V. Fomin, Dieter Kratsch. *Exact Exponential Algorithms*. Springer. ISBN 978-3-642-16533-7. Freely available for personal use here.