The landscape of computing soon became even thornier with the arrival of programmable general-purpose GPUs. In 2007, NVIDIA released the CUDA software development kit and the first graphics processors designed for computing. As the trend then called ``GPGPU'' gathered speed, the first highly influential applications of GPUs to deep learning were published, and a new era was unleashed \cite{Raina_2009}. Many of us went on long expeditions to adapt our best-loved algorithms for solving physical models formulated as differential equations to the new architectures and programming models. Some communities were decidedly successful, for example those working on molecular modeling and quantum chemistry. But for most, it has either been a grind or a trend to be avoided. To cap it all, in the last ten years computational science has been inundated with new approaches like machine learning and data-driven models. Now the fashionable pursuit is to try ``physics-informed'' neural networks in applications.