
EXPLORING PHYSICS-INFORMED NEURAL NETWORKS FOR SOLVING BOUNDARY LAYER PROBLEMS

Muchamad Harry Yudha Pratama  -  Mathematics Master Study Program, Institut Teknologi Bandung, Indonesia
*Agus Yodi Gunawan  -  Industrial and Financial Mathematics Research Group, Institut Teknologi Bandung, Indonesia

Abstract

In this paper, we explore a cutting-edge technique called Physics-Informed Neural Networks (PINN) to tackle boundary layer problems. We examine four cases of second-order ODEs exhibiting boundary layers: a linear ODE with constant coefficients, a nonlinear ODE with homogeneous boundary conditions, an ODE with non-constant coefficients, and an ODE featuring multiple boundary layers. We adapt the PINN technique to handle these problems, and our results show that the accuracy of the resulting solutions depends on choosing a reliable and robust activation function when designing the PINN architecture. Beyond that, through our explorations we aim to improve our understanding of how the PINN technique can work better for boundary layer problems. In particular, the SiLU (Sigmoid-Weighted Linear Unit) activation function has proven remarkably effective in handling our boundary layer problems.
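To make the setup concrete, below is a minimal PINN sketch in PyTorch for a singularly perturbed linear ODE with constant coefficients, in the spirit of the first test case. The specific equation (eps*u'' + 2u' + u = 0 with u(0) = 0, u(1) = 1), the network width and depth, the optimizer, and the value of eps are illustrative assumptions rather than the exact configuration used in the paper; only the use of SiLU activations reflects the abstract's emphasis.

```python
# Minimal PINN sketch for a singularly perturbed linear ODE
# (hypothetical test case, not necessarily the paper's exact setup):
#     eps * u'' + 2 u' + u = 0,   u(0) = 0,  u(1) = 1,   eps << 1
import torch

torch.manual_seed(0)
eps = 0.01

# Small fully connected network with SiLU activations, as highlighted
# in the abstract; the width and depth here are illustrative choices.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.SiLU(),
    torch.nn.Linear(32, 32), torch.nn.SiLU(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# Collocation points in [0, 1] and the two boundary points.
x = torch.linspace(0.0, 1.0, 200).reshape(-1, 1).requires_grad_(True)
xb = torch.tensor([[0.0], [1.0]])
ub = torch.tensor([[0.0], [1.0]])

for step in range(5000):
    opt.zero_grad()
    u = net(x)
    # First and second derivatives via automatic differentiation.
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = eps * d2u + 2.0 * du + u        # ODE residual at collocation points
    loss = (residual**2).mean() + ((net(xb) - ub)**2).mean()
    loss.backward()
    opt.step()
```

For this particular equation, the trained network can be checked against the matched-asymptotics composite approximation u(x) = exp((1-x)/2) - exp(1/2)*exp(-2x/eps), which captures the boundary layer of width O(eps) at x = 0.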

