Transformer for solving the Fokker-Planck problem in Stochastic Local Volatility
- Basic Component
- All the files can be found here
- The problem is defined in a Stochastic Local Volatility model, with an SDE of 2 dimensions in space and 1 dimension in time.
- Stages of the lab
- Experiment_0: sampling for the Fokker-Planck PDE with the help of QuantLib (a minimal sampling sketch follows this list).
- Experiment_1: the FNO for solving the Fokker-Planck PDE.
- Experiment_2: the FNO Transformer for solving the Fokker-Planck PDE.
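As a rough sketch of the Experiment_0 sampling step, the snippet below draws Monte Carlo paths from a Heston-type process with QuantLib-Python and bins them into an empirical transition density. This is an illustration under assumptions: the leverage function is omitted (pure Heston), all parameter values are made up, and the calls follow the standard QuantLib-Python path-generation pattern rather than the lab's actual code.

```python
import QuantLib as ql
import numpy as np

# Illustrative (assumed) market and Heston parameters -- not the lab's actual setup.
today = ql.Date(9, ql.May, 2022)
ql.Settings.instance().evaluationDate = today
dc = ql.Actual365Fixed()
r_ts = ql.YieldTermStructureHandle(ql.FlatForward(today, 0.02, dc))
q_ts = ql.YieldTermStructureHandle(ql.FlatForward(today, 0.00, dc))
spot = ql.QuoteHandle(ql.SimpleQuote(100.0))
v0, kappa, theta, sigma, rho = 0.04, 1.0, 0.04, 0.4, -0.7

# Two-factor process: spot S and variance v (2 space dimensions, 1 time dimension).
process = ql.HestonProcess(r_ts, q_ts, spot, v0, kappa, theta, sigma, rho)

n_steps, horizon, n_paths = 100, 1.0, 5000
times = list(ql.TimeGrid(horizon, n_steps))
rng = ql.GaussianRandomSequenceGenerator(
    ql.UniformRandomSequenceGenerator(process.factors() * n_steps,
                                      ql.UniformRandomGenerator(42)))
generator = ql.GaussianMultiPathGenerator(process, times, rng, False)

paths = np.empty((n_paths, 2, n_steps + 1))
for i in range(n_paths):
    multipath = generator.next().value()
    paths[i, 0, :] = list(multipath[0])  # S_t along the path
    paths[i, 1, :] = list(multipath[1])  # v_t along the path

# A 2-D histogram of (S_T, v_T) gives a sampled estimate of the transition
# density p(S, v, T) -- the kind of data the Fokker-Planck experiments start from.
density, s_edges, v_edges = np.histogram2d(
    paths[:, 0, -1], paths[:, 1, -1], bins=64, density=True)
```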
- Basic Introduction
- There are also some RL-based methods for solving such PDEs:
  - For HJB PDEs: Actor-Critic Method for High Dimensional Static Hamilton-Jacobi-Bellman Partial Differential Equations based on Neural Networks
  - For elliptic PDEs: Neural Q-learning for solving elliptic PDEs
- 2022.May.09, 11:00 AM ~ 12:15 PM, with Dr. Shuhao Cao, Ziheng Wang, and Deqing Jiang at the Mathematical Institute.
- The interface between QuantLib and the Transformer
- Sampling point format
- Pre-process the data format on our own
- The result of the PDE is obtained in a stochastic way, which is totally different from a deterministic solution.
- In order to get *v*, what we want to solve for is *p*, the transition density. It can be described by the Fokker-Planck equation, which has 2 dimensions in space and 1 dimension in time.
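For concreteness, here is one standard form of that Fokker-Planck (forward Kolmogorov) equation, assuming a Heston-type SLV model with leverage function L(S, t). The dynamics below are an assumption for illustration, not taken from these notes, and the instantaneous variance is written as y to avoid clashing with the *v* mentioned above.

```latex
% Assumed Heston-type SLV dynamics (illustrative):
%   dS_t = r S_t \, dt + L(S_t, t) \sqrt{y_t} \, S_t \, dW_t^{1}
%   dy_t = \kappa (\theta - y_t) \, dt + \xi \sqrt{y_t} \, dW_t^{2},
%   d\langle W^{1}, W^{2} \rangle_t = \rho \, dt
% Forward Kolmogorov (Fokker-Planck) equation for the transition density p(S, y, t):
\begin{equation*}
\frac{\partial p}{\partial t}
  = -\frac{\partial}{\partial S}\!\left[ r S \, p \right]
    -\frac{\partial}{\partial y}\!\left[ \kappa(\theta - y) \, p \right]
    +\frac{1}{2}\frac{\partial^{2}}{\partial S^{2}}\!\left[ L^{2}(S,t)\, y \, S^{2} \, p \right]
    +\frac{\partial^{2}}{\partial S \, \partial y}\!\left[ \rho \, \xi \, L(S,t)\, y \, S \, p \right]
    +\frac{1}{2}\frac{\partial^{2}}{\partial y^{2}}\!\left[ \xi^{2} y \, p \right].
\end{equation*}
```

There are two spatial variables (S and y) and one time variable, matching the "2 dimensions in space, 1 dimension in time" statement above.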
#TODO
Problems still remain:
- How to define the input data format?
- How to define the final output of our model?
- Is there any advantage compared with the traditional method, Monte Carlo?
- Reference papers
  - Fourier Neural Operator: "Fourier Neural Operator for Parametric Partial Differential Equations"
  - Galerkin Transformer: "Choose a Transformer: Fourier or Galerkin"
  - Fourier Transformer: "Adaptive Fourier Neural Operators: Efficient Token Mixers for Transformers"
  - DeepXDE: a deep learning library for solving differential equations (Documentation)
- Problems remaining this week
  - Only a certain class of PDEs is suitable for the Fourier transform.
  - Why is the sample resolution 256*256 while the downsampled resolution is 64*64? What does the downsampling do here? (See the sketch after this list.)
  - How can MCMC be used with the FNO?
  - To solve different PDEs, do we need to change the code in the FNO?
    - Or has that information already been included in the dataset through sampling?
  - We need to solve a two-dimensional diffusion equation.
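On the 256*256 vs 64*64 question: in the reference FNO training scripts, downsampling is plain strided subsampling of the high-resolution samples. A minimal sketch of the idea (the array name and shapes are assumptions, not the repo's actual variables):

```python
import torch

# Suppose `a_full` holds solution samples on a 256 x 256 grid: (n_samples, 256, 256).
a_full = torch.randn(1000, 256, 256)

r = 4             # subsampling rate: 256 / 4 = 64
s = 256 // r      # target resolution 64

# Strided subsampling keeps every r-th grid point in each spatial dimension,
# so training runs on a cheaper 64 x 64 grid while the data were generated
# accurately at 256 x 256.
a_coarse = a_full[:, ::r, ::r]
assert a_coarse.shape == (1000, s, s)
```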
- See `Fourier-Transformer/fourier_neural_operator/fourier_3d.py`, lines 1 to 7 (commit 6fd306e).
- In `class SpectralConv3d(nn.Module):`, see `Fourier-Transformer/fourier_neural_operator/fourier_3d.py`, lines 61 to 76 (commit 90686ea).
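For orientation, here is a condensed sketch of what a 3-D spectral convolution layer of this kind does; it is a simplified paraphrase rather than a copy of the referenced lines (the real layer multiplies four low-frequency corner blocks, this sketch keeps only one):

```python
import torch
import torch.nn as nn

class SpectralConv3dSketch(nn.Module):
    """FFT -> keep low Fourier modes -> per-mode channel mixing -> inverse FFT."""
    def __init__(self, in_channels, out_channels, modes1, modes2, modes3):
        super().__init__()
        self.modes1, self.modes2, self.modes3 = modes1, modes2, modes3
        scale = 1.0 / (in_channels * out_channels)
        # Complex weights acting on the retained Fourier modes.
        self.weights = nn.Parameter(
            scale * torch.rand(in_channels, out_channels,
                               modes1, modes2, modes3, dtype=torch.cfloat))

    def forward(self, x):
        # x: (batch, channels, x, y, t)
        x_ft = torch.fft.rfftn(x, dim=[-3, -2, -1])
        out_ft = torch.zeros(x.size(0), self.weights.size(1),
                             x.size(-3), x.size(-2), x.size(-1) // 2 + 1,
                             dtype=torch.cfloat, device=x.device)
        m1, m2, m3 = self.modes1, self.modes2, self.modes3
        # Mix channels mode-by-mode on the retained low frequencies
        # (the original layer also handles the remaining frequency corners).
        out_ft[:, :, :m1, :m2, :m3] = torch.einsum(
            "bixyz,ioxyz->boxyz", x_ft[:, :, :m1, :m2, :m3], self.weights)
        return torch.fft.irfftn(out_ft, s=(x.size(-3), x.size(-2), x.size(-1)))
```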
- How does the advantage of operator learning show up in this lab?
  - Does it mean the model does not need to be retrained when the initial condition or boundary condition changes?
  - Perhaps. Since the operator itself is learnt, one of the three conditions is already handled, with the other two being the initial condition and the boundary condition.
- Try the FNO for the basic setup of the experiment.
- The FNO model can solve an entire family of PDEs.
- FNO project
- FNO GitHub
- FNO introduction
- FNO data
- FNO models
- Here are the pre-trained models; they can be evaluated using eval.py or super_resolution.py.
- For now there is no need to focus on the Galerkin Transformer, as it only offers better precision in the results.
- We will focus more on the FNO in the next stages.
- The FNO is aimed at solving a whole family of PDEs without retraining the neural network from scratch, for example when the initial condition or the boundary condition changes (a usage sketch follows below).
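To make the "no retraining" point concrete, below is a minimal sketch with a stand-in operator network; the class, tensor shapes, and names are illustrative assumptions, not the lab's actual FNO. Once an operator has been trained, a new initial condition only requires a forward pass, not another training loop.

```python
import torch
import torch.nn as nn

# Stand-in operator network; in the lab this would be the trained 3-D FNO,
# loaded from a checkpoint instead of freshly constructed.
class TinyOperator(nn.Module):
    def __init__(self, width=20):
        super().__init__()
        self.lift = nn.Linear(1, width)               # lift input channel
        self.mix = nn.Conv3d(width, width, kernel_size=1)
        self.proj = nn.Linear(width, 1)               # project back to one channel

    def forward(self, a):
        # a: (batch, x, y, t, 1) -- an initial condition broadcast along the time axis.
        h = self.lift(a).permute(0, 4, 1, 2, 3)       # to (batch, width, x, y, t)
        h = torch.relu(self.mix(h)).permute(0, 2, 3, 4, 1)
        return self.proj(h)                           # (batch, x, y, t, 1)

model = TinyOperator()
model.eval()

# Two *different* initial conditions: the same trained operator handles both
# with no extra training, only forward passes.
a1 = torch.randn(1, 64, 64, 40, 1)
a2 = torch.randn(1, 64, 64, 40, 1)
with torch.no_grad():
    u1, u2 = model(a1), model(a2)
```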