DL for PINN


Erfan Hamdi 1401
erfan.hamdi@gmail.com

Me
  • Mechanical Engineering at Sharif University of Technology
  • 3D Vision Head Developer at Opaltech

Headlines
  • Deep Learning
  • PINN

Resources
  • Deep Learning
    • DL with Python, Chollet
    • DL with PyTorch, Stevens
    • PyTorch Docs
    • Tools
  • PINN
    • Maziar Raissi Blog
    • PINN Part I
    • PINN Part II

Mechanics of ML

Tasks 🛠
  • Regression
  • Classification
  • Clustering
Paradigms of ML 🍰
  • Supervised Learning
    • Regression
    • Classification
  • Unsupervised Learning
    • Clustering
    • Dimensionality Reduction
  • Self-Supervised Learning
Building Blocks of ML
  • Data
  • Model
  • Optimizer
  • Loss
  • Metric
  • Evaluation
Data
  • Feature Vector
Model
  • Linear
  • Tree Based
  • Instance Based
  • Probabilistic
  • Kernel Based
  • Neural Networks
No Free Lunch!
Bias and Variance
Train, Validation and Test
Imbalanced Dataset
Target label counts:

  Label   Count
  1       150
  0       10

  • If we predict every sample as class 1:
    • Class 1:
      • Precision: 150/160 ≈ 0.94
      • Recall: 1
    • Class 0:
      • Precision: 0
      • Recall: 0
    • Mean (macro) performance:
      • Precision: ≈ 0.47
      • Recall: 0.5
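These numbers can be checked directly; a small sketch using the 150/10 counts from the table above:

```python
import numpy as np

y_true = np.array([1] * 150 + [0] * 10)
y_pred = np.ones_like(y_true)               # predict every sample as class 1

tp = np.sum((y_pred == 1) & (y_true == 1))  # 150 true positives
fp = np.sum((y_pred == 1) & (y_true == 0))  # 10 false positives
precision_1 = tp / (tp + fp)                # 150/160 ≈ 0.94
recall_1 = tp / np.sum(y_true == 1)         # 150/150 = 1.0
print(precision_1, recall_1)
```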
sklearn

Stratify: for an imbalanced dataset, use a stratified split so the test set has the same class distribution as the training set.
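A minimal sketch with scikit-learn's train_test_split, using toy data that mirrors the 150/10 imbalance above:

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.randn(160, 4)
y = np.array([1] * 150 + [0] * 10)

# stratify=y keeps the 150:10 class ratio in both splits
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
print(np.bincount(y_train), np.bincount(y_test))
```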

PyTorch
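On the PyTorch side, a minimal sketch with torch.utils.data.random_split; note that it draws a plain random split, with no built-in stratification:

```python
import torch
from torch.utils.data import TensorDataset, random_split

features = torch.randn(160, 4)
labels = torch.randint(0, 2, (160,))
dataset = TensorDataset(features, labels)

# 80/20 random (not stratified) partition
train_set, test_set = random_split(dataset, [128, 32])
print(len(train_set), len(test_set))
```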
Cross Validation
  • K-fold cross-validation:
    • Shuffle the dataset and randomly partition the training data into k groups of equal size.
    • Choose one of the k groups and hold it out.
    • Train on the other k-1 groups.
    • Repeat k times, holding out a different group each time.
    • Average the k validation errors.
K Fold Cross Validation
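A minimal sketch of the procedure with scikit-learn's KFold; the model and data here are toy placeholders:

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

X = np.random.randn(100, 4)
y = np.random.randint(0, 2, 100)

kf = KFold(n_splits=5, shuffle=True, random_state=0)
scores = []
for train_idx, val_idx in kf.split(X):
    model = LogisticRegression().fit(X[train_idx], y[train_idx])  # train on k-1 folds
    scores.append(model.score(X[val_idx], y[val_idx]))            # evaluate on the held-out fold
print(np.mean(scores))  # average score over the k folds
```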
Dimensionality Reduction
  • PCA
  • t-SNE
  • Autoencoders
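For instance, PCA in scikit-learn, with toy data as a placeholder:

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.random.randn(100, 10)

pca = PCA(n_components=2)       # keep the 2 directions of largest variance
X_2d = pca.fit_transform(X)     # project 10-D points down to 2-D
print(X_2d.shape, pca.explained_variance_ratio_)
```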
Clustering
Mechanics of DL
Feature Engineering!
Why so deep?
Deeper networks can learn more complex decision boundaries.

TensorFlow Playground

Computational Graph

\[\begin{aligned} e & = (a+b)(b+1) \\ c & = a + b\\ d &= b + 1 \\ e &= c \cdot d \end{aligned} \]

Derivative on an edge (evaluated at b = 1, so \(\partial e / \partial c = d = 2\)):

\[\begin{aligned} \frac{\partial e}{\partial a} &= \frac{\partial e}{\partial c} \cdot \frac{\partial c}{\partial a}= 2\cdot 1 = 2\end{aligned}\]
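The same graph in PyTorch: autograd records the operations as they run, and backward() reproduces the edge derivatives (here a = 2, b = 1):

```python
import torch

a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)

c = a + b      # c = 3
d = b + 1      # d = 2
e = c * d      # e = 6; the graph is recorded as the ops execute

e.backward()   # reverse-mode differentiation over the recorded graph
print(a.grad)  # de/da = d = 2
print(b.grad)  # de/db = c + d = 5 (b reaches e through both c and d)
```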

Building Blocks of DL
  • Tensor
  • Model
  • Optimizer
  • Loss
  • Metric
What is a Tensor?

Everything is a Tensor
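A few PyTorch tensors, from a scalar up to a batch of images:

```python
import torch

scalar = torch.tensor(3.14)          # rank 0
vector = torch.tensor([1.0, 2.0])    # rank 1
matrix = torch.rand(3, 3)            # rank 2
batch = torch.rand(16, 3, 32, 32)    # rank 4: a batch of 16 RGB 32x32 images
print(batch.shape, batch.dtype, batch.device)
```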

Modeling Blocks
  • Linear, Convolutional, etc...
  • Batch Normalization
  • Activation Functions
  • Dropout
Batch Normalization
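For each feature, the activations are normalized over the mini-batch and then rescaled with learnable parameters \(\gamma\) and \(\beta\):

\[\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}, \qquad y_i = \gamma \hat{x}_i + \beta\]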
Benefits
  • Less Sensitive to initialization
  • Use higher Learning Rates
  • Regularization Effects
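A minimal PyTorch sketch; at initialization \(\gamma = 1\) and \(\beta = 0\), so the output is just the normalized batch:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(16)   # one learnable gamma and beta per feature
x = torch.randn(32, 16)   # batch of 32 samples, 16 features
y = bn(x)
print(y.mean(dim=0))      # ≈ 0 per feature
print(y.std(dim=0))       # ≈ 1 per feature
```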
Activation Function
Add Nonlinearity to the Model
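Without one, a stack of linear layers collapses to a single linear map. Two common activations:

```python
import torch

x = torch.linspace(-2.0, 2.0, 5)
print(torch.relu(x))   # max(0, x)
print(torch.tanh(x))   # squashes values into (-1, 1)
```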
Dropout

To reduce overfitting, we can randomly drop out some of the neurons in a layer during training.

PyTorch
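A minimal sketch of nn.Dropout; it is active only in training mode and becomes a no-op in eval mode:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)  # each element is zeroed with probability 0.5
x = torch.ones(4, 8)

drop.train()              # training mode: survivors are scaled by 1/(1-p)
print(drop(x))

drop.eval()               # eval mode: identity, dropout disabled
print(drop(x))
```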
Loss Function (Cost Function, Error Function, Criterion)

Most used Loss Functions:

  • Mean Squared Error (MSE)
  • Mean Absolute Error (MAE)
  • Cross Entropy
  • A differential equation! (the PINN idea)
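A minimal sketch of that last idea, using the residual of a toy ODE \(u'(x) = u(x)\) as the loss; the network and the collocation points are illustrative choices, not a fixed recipe:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

# Collocation points where the ODE u'(x) = u(x) should hold
x = torch.linspace(0.0, 1.0, 50).reshape(-1, 1).requires_grad_(True)
u = net(x)
du_dx = torch.autograd.grad(
    u, x, grad_outputs=torch.ones_like(u), create_graph=True
)[0]

residual = du_dx - u            # the differential equation itself
loss = (residual ** 2).mean()   # drive the residual to zero
print(loss)
```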
Optimizers
  • SGD
  • Adam
  • RMSProp
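A minimal training loop showing where the optimizer fits (Adam here, with toy data):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x, y = torch.randn(32, 4), torch.randn(32, 1)
for step in range(100):
    optimizer.zero_grad()              # clear old gradients
    loss = loss_fn(model(x), y)
    loss.backward()                    # backpropagate
    optimizer.step()                   # update the parameters
```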