CSUMS2022


9:00 - 11:00
  Monday:    Basics in Continuous Optimization, Part I (Alex Ferrer)
  Tuesday:   Basics in Continuous Optimization, Part II (Alex Ferrer)
  Wednesday: Optimization in Computational Engineering (Alex Ferrer)
  Thursday:  Statistical Learning (Alex Ferrer)
  Friday:    Physics-Informed Neural Networks (PINNs): Introduction I (Mohammad R. Hashemi)

11:00 - 11:30  Coffee Break

11:30 - 13:30
  Monday:    Applications in Machine Learning (Alex Ferrer)
  Tuesday:   Supervised Learning, Part I (Alex Ferrer)
  Wednesday: Supervised Learning, Part II (Alex Ferrer)
  Thursday:  Unsupervised Learning (Alex Ferrer)
  Friday:    Physics-Informed Neural Networks (PINNs): Introduction II (Mohammad R. Hashemi)

13:30 - 14:30  Lunch Break

14:30 - 16:30
  Monday to Thursday: Practical Session (Antonio Darder)



BASICS IN CONTINUOUS OPTIMIZATION I

Description: An introduction to continuous optimization will be presented, followed by basic examples on least squares and linear (and sequential quadratic) programming. Unconstrained optimization will then be discussed together with gradient-based algorithms. The module will end by presenting line-search methods (the optimization counterpart of learning rates) together with illustrative examples.
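
To give a flavor of the gradient-based methods and line searches covered here, below is a minimal sketch (illustrative, not course material) of gradient descent with an Armijo backtracking line search, applied to a made-up least-squares problem; all names and data are our own.

```python
import numpy as np

def grad_descent_backtracking(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                              tol=1e-8, max_iter=500):
    """Gradient descent with Armijo backtracking line search."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha, fx = alpha0, f(x)
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x - alpha * g) > fx - c * alpha * (g @ g):
            alpha *= beta
        x = x - alpha * g
    return x

# Illustrative least-squares problem: min_x 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))
b = rng.normal(size=20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)

x_star = grad_descent_backtracking(f, grad, np.zeros(3))
print(x_star, np.linalg.lstsq(A, b, rcond=None)[0])  # the two should agree
```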


APPLICATIONS IN MACHINE LEARNING

Description: This module will serve as a first taste of machine learning. We will present the similarities and differences between machine learning and classical optimization. First examples on regression will allow us to internalize the optimization concepts explained in the first sessions. Examples will include mean squared error, sparse approximation, lasso problems, ridge regression, and l1-norm optimization.
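
To make the lasso example concrete, here is a minimal sketch of one standard solver for it, the iterative soft-thresholding algorithm (ISTA); the data are synthetic and the routine names are our own.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (element-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=1000):
    """ISTA for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 10))
x_true = np.zeros(10); x_true[[2, 7]] = [1.5, -2.0]   # sparse ground truth
b = A @ x_true + 0.01 * rng.normal(size=50)
print(np.round(lasso_ista(A, b, lam=0.5), 3))          # mostly zeros
```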


BASICS IN CONTINUOUS OPTIMIZATION II

Description: This second excursion into continuous optimization will focus on constrained optimization problems, where duality plays a central role. The Legendre-Fenchel transform, Lagrange multipliers, penalty methods, the KKT conditions, and other concepts in constrained optimization will be described. Useful algorithms such as SQP, the augmented Lagrangian method, trust-region methods, and interior-point methods will be presented. The theory will be supported by several illustrative examples.
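
As one illustration of the SQP family mentioned above, the sketch below solves a small equality-constrained problem with SciPy's SLSQP solver and inspects KKT stationarity at the solution; the problem itself is invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: min  (x0 - 1)^2 + (x1 - 2)^2
#                       s.t. x0 + x1 = 1   (one equality constraint)
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
con = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1.0}

res = minimize(f, x0=np.zeros(2), method='SLSQP', constraints=[con])
print(res.x)   # analytic solution: x = (0, 1)

# KKT stationarity: grad f + lambda * grad g = 0 for some multiplier lambda.
g = np.array([2 * (res.x[0] - 1), 2 * (res.x[1] - 2)])  # grad f at solution
print(g)       # should be proportional to grad g = (1, 1), here lambda = 2
```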


OPTIMIZATION IN COMPUTATIONAL ENGINEERING

Description: This module is a short stop in computational engineering. It will allow us to understand well-established numerical methods and physical problems from an optimization perspective. We will revisit problems such as conjugate gradients, eigenvalue problems, the adjoint method, the Schur complement, iterative methods, the mechanical problem, the Stokes problem, contact, and plasticity. This module will allow participants with a background in computational engineering to attach physical meaning to the optimization concepts presented in the first modules.
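
To show how a classical solver can be read as an optimization method, here is a minimal conjugate gradient sketch: solving A x = b for symmetric positive definite A is equivalent to minimizing the quadratic 0.5 x^T A x - b^T x, and each CG step performs an exact line search along A-conjugate directions. The matrix and right-hand side are synthetic.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """CG for A x = b, A symmetric positive definite."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x           # residual = negative gradient of the quadratic
    p = r.copy()            # first search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)       # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # A-conjugate update of the direction
        rs = rs_new
    return x

M = np.random.default_rng(2).normal(size=(30, 30))
A = M @ M.T + 30 * np.eye(30)       # SPD test matrix
b = np.ones(30)
print(np.linalg.norm(A @ conjugate_gradient(A, b) - b))  # ~0
```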


STATISTICAL LEARNING

Description: Machine learning requires some important concepts from statistics. In this module, the necessary statistical ingredients will be introduced gradually. We will move from a review of basic concepts in probability and statistics to the study of Bayesian inference and some useful parameter estimators. Maximum likelihood estimation and maximum a posteriori estimation will allow us to give statistical insight into regression and classification models.
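
A small sketch of the two estimators named above, for the toy case of estimating a Gaussian mean with known variance; the prior and data are illustrative. The MAP estimate shrinks the MLE toward the prior mean, mirroring how ridge regression shrinks least-squares coefficients.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(loc=2.0, scale=1.0, size=20)   # unknown mean, known sigma=1

# MLE of a Gaussian mean with known variance: the sample average.
mu_mle = data.mean()

# MAP with a Gaussian prior mu ~ N(mu0, tau^2): a precision-weighted average
# that pulls the sample mean toward the prior mean.
mu0, tau2, sigma2, n = 0.0, 1.0, 1.0, len(data)
mu_map = (n / sigma2 * mu_mle + mu0 / tau2) / (n / sigma2 + 1.0 / tau2)

print(mu_mle, mu_map)   # MAP is shifted slightly toward the prior mean 0
```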


SUPERVISED LEARNING

Description: We will start with polynomial regression to understand the capacity of machine learning models. We will analyze the underfitting-overfitting trade-off and the importance of regularization and hyperparameters. In this module, we will also present logistic regression and the main classification models, including multi-class classification. We will end by introducing the support vector machine problem.
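
As a taste of the classification side, here is a minimal sketch of binary logistic regression trained by gradient descent on the log-loss; the data and function names are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Binary logistic regression via gradient descent on the mean log-loss."""
    Xb = np.c_[np.ones(len(X)), X]          # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = sigmoid(Xb @ w)                 # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)   # gradient of the mean log-loss
    return w

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # linearly separable labels
w = fit_logistic(X, y)
acc = ((sigmoid(np.c_[np.ones(200), X] @ w) > 0.5) == y).mean()
print(w, acc)   # accuracy close to 1 on this toy problem
```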


UNSUPERVISED LEARNING

Description: In this module we will present the clustering problem to motivate the importance of unsupervised learning. We will start with the k-means algorithm and the expectation-maximization algorithm. Then, we will present the PCA algorithm for dimensionality reduction. We will end by presenting some more advanced unsupervised learning methods, such as independent component analysis (ICA) and non-negative matrix factorization.
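
A minimal k-means sketch in the spirit of this module: the assignment and update steps below are the hard-assignment analogue of the expectation and maximization steps. Data and names are illustrative.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain k-means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # random init
    for _ in range(n_iter):
        # Assignment step: nearest centroid for every point.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each centroid moves to the mean of its cluster.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in [(0, 0), (3, 3), (0, 3)]])
centers, labels = kmeans(X, k=3)
print(np.round(centers, 2))   # close to the three true cluster centers
```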


PHYSICS-INFORMED NEURAL NETWORKS (PINNS)

Description: To be defined.


PRACTICAL SESSIONS

Description: In these sessions, we will develop a basic Python code from scratch to put into practice the machine learning concepts presented in the theoretical sessions.


PRACTICAL SESSION 1

Description: We will start with linear and polynomial regression, internalizing the importance of the chosen norm, the training/test split, regularization, and hyperparameter selection.
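
A sketch of the kind of exercise this session suggests (our own illustrative code, not the session's material): polynomial least squares with an l2 (ridge) penalty, a train/test split, and a sweep over the regularization hyperparameter.

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.uniform(-1, 1, 40)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=40)   # noisy target

# Train/test split.
idx = rng.permutation(40)
train, test = idx[:30], idx[30:]

def fit_ridge_poly(x, y, degree, lam):
    """Polynomial least squares with an l2 (ridge) penalty."""
    V = np.vander(x, degree + 1)                    # Vandermonde features
    return np.linalg.solve(V.T @ V + lam * np.eye(degree + 1), V.T @ y)

for lam in [0.0, 1e-3, 1e-1]:                       # hyperparameter sweep
    w = fit_ridge_poly(x[train], y[train], degree=9, lam=lam)
    err = np.mean((np.vander(x[test], 10) @ w - y[test]) ** 2)
    print(f"lambda={lam:g}  test MSE={err:.4f}")
```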


PRACTICAL SESSION 2

Description: We will continue with regression models, putting the underfitting and overfitting concepts into practice. Then, we will move on to classification problems by solving our first logistic regression problem.
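
A possible warm-up for the underfitting/overfitting part, sketched with synthetic data: sweeping the polynomial degree and comparing training and test errors.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(-1, 1, 30)
y = np.cos(2 * x) + 0.1 * rng.normal(size=30)
x_test = rng.uniform(-1, 1, 200)
y_test = np.cos(2 * x_test) + 0.1 * rng.normal(size=200)

for degree in [1, 4, 12]:
    w = np.polyfit(x, y, degree)                # least-squares polynomial fit
    train_mse = np.mean((np.polyval(w, x) - y) ** 2)
    test_mse = np.mean((np.polyval(w, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train {train_mse:.4f}  test {test_mse:.4f}")
# Training error keeps falling as the degree grows; test error eventually rises.
```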


PRACTICAL SESSION 3

Description: We will consolidate the classification concepts by solving the support vector machine problem. We will then code our own neural network using backpropagation, automatic differentiation, and stochastic gradient descent. We will experiment with several optimization strategies, such as the choice of learning rate and acceleration.
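
As a rough preview (not the session's actual code), the sketch below trains a one-hidden-layer network on a toy problem with hand-written backpropagation and mini-batch stochastic gradient descent; in the session, automatic differentiation may replace the manual gradients.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy data: XOR-like problem that a linear model cannot solve.
X = rng.uniform(-1, 1, size=(400, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)

# One hidden layer of tanh units; sigmoid output for binary classification.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(2000):
    batch = rng.choice(400, 32, replace=False)    # stochastic mini-batch
    Xb, yb = X[batch], y[batch]
    # Forward pass.
    H = np.tanh(Xb @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))
    # Backward pass (chain rule) for the mean log-loss.
    dz2 = (p - yb) / len(yb)
    dW2, db2 = H.T @ dz2, dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (1 - H ** 2)             # tanh'(u) = 1 - tanh(u)^2
    dW1, db1 = Xb.T @ dz1, dz1.sum(0)
    # SGD update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

H = np.tanh(X @ W1 + b1)
p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))
print("accuracy:", ((p > 0.5) == y).mean())
```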


PRACTICAL SESSION 4

Description: In this last session, we will code the k-means algorithm for clustering. Then, we will use the PCA algorithm to reduce the dimensionality of the features. We will end by experiencing how PCA can be coupled with other machine learning algorithms.
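
A compact PCA sketch via the SVD of the centered data matrix, on synthetic 5-D data that lie near a 2-D plane; the function and variable names are our own.

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the centered data matrix."""
    Xc = X - X.mean(axis=0)                 # center the features
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]          # principal directions
    explained = S[:n_components] ** 2 / np.sum(S ** 2)
    return Xc @ components.T, components, explained

# Illustrative data: 5-D points that really live near a 2-D plane.
rng = np.random.default_rng(9)
Z = rng.normal(size=(100, 2))
B = rng.normal(size=(2, 5))
X = Z @ B + 0.05 * rng.normal(size=(100, 5))

X_reduced, comps, ratio = pca(X, n_components=2)
print(X_reduced.shape, np.round(ratio, 3))  # (100, 2), nearly all the variance
```

The reduced matrix X_reduced can then be fed to any of the earlier algorithms (for instance k-means), which is the coupling the session description refers to.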