Sunday, February 18, 2018

Will it run? (or: Important things to ask yourself when programming)

Last fall, PVL MSc Giang spent a productive term with Raymond Pierrehumbert's group at Oxford. In this post, he reflects on his experience from the perspective of a little distance as he looks forward to summing up his MSc work and assesses PhD opportunities. Above (planetary photojournal image PIA01111), a view of one of Io's forced atmospheric components - sodium - which contrasts with the volcanic emission and condensation of sulfur compounds that Giang modelled.

By Tue Giang Nguyen


While interning at the University of Oxford, I was involved in atmospheric modelling projects for exoplanets and was grateful to work with prominent scientists in my field. On my way home from the UK, I had briefly forgotten what a Canadian winter was like and was promptly reminded as I stepped outside the airport. Now that I am back at York University, it is time to reflect on the things I learned during my short three-month stay at Oxford.
The atmospheric model I worked with started by recreating Andrew Ingersoll’s 1985 work on modelling the wind flow on Io. Useful assumptions, some more justified than others, such as treating the Ionian atmosphere as hydrostatic and neglecting Io’s rotation, allowed for a simple one-dimensional model based on the shallow-water equations. The gist of the dynamics is that sulfur dioxide, abundant on Io’s surface, sublimates when illuminated by the Sun. The sublimated sulfur dioxide then flows onto the much colder nightside, where it condenses back onto the surface. This insight into thin, condensable atmospheres is useful for exoplanet research, where tidally locked rocky planets would sublimate volatiles on the dayside that then condense on the colder nightside.
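As a rough illustration of why sunlight matters so much here, a Clausius-Clapeyron-type vapour pressure law shows how steeply the equilibrium SO2 pressure drops between dayside and nightside temperatures. The constants below are placeholders for illustration only, not the values used in Ingersoll (1985) or in my model:

    import numpy as np

    # Clausius-Clapeyron-style saturation vapour pressure over SO2 frost.
    # A and B are placeholder constants, chosen only to illustrate the trend.
    A = 1.5e13    # pre-exponential factor [Pa] (placeholder)
    B = 4500.0    # latent heat of sublimation over the gas constant [K] (placeholder)

    def vapour_pressure(T):
        """Equilibrium SO2 vapour pressure over frost at temperature T [K]."""
        return A * np.exp(-B / T)

    # The dayside (~130 K) supports orders of magnitude more vapour than the
    # nightside (~50 K); that contrast is what drives the day-to-night flow.
    print(vapour_pressure(130.0))   # a small fraction of a pascal
    print(vapour_pressure(50.0))    # vanishingly small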
  
Numerical simulation is interesting, to say the least; it is a hearty mix of physics, math, and computer science. I had previous experience in numerical analysis from a graduate-course project in which I built and ran a barotropic vorticity model, useful for large-scale weather prediction. What I learned then is that even if your knowledge of physics and math is sublime, not knowing the intricacies of programming can be very detrimental.
The model is a system of three differential equations with three dependent variables: temperature, pressure, and flow velocity. The problem was essentially to integrate this system starting from fairly simple boundary conditions. The temperature is greatest (130 K) at the subsolar point and cools until it reaches a constant nightside low of 50 K. The flow boundary condition is even simpler: the velocity is zero at the subsolar point and zero again at the antisolar point. The core of the problem, I realized, is that there is no boundary condition given for the pressure, so part of my job was to guess the initial pressure.
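In practice that guessing amounts to a shooting problem: pick a subsolar pressure, integrate from the subsolar point toward the antisolar point, and adjust the guess until the flow there also comes out near zero. A minimal sketch of that loop, with a made-up right-hand side that only mimics the shape of the real coupled equations:

    import numpy as np
    from scipy.integrate import solve_ivp

    def toy_rhs(x, y):
        """Made-up stand-in for the coupled (T, p, v) system; NOT the Io equations."""
        T, p, v = y
        dT = -80.0 / np.pi                   # cool linearly from 130 K to 50 K
        dv = 0.01 * (130.0 - T) - 0.5 * p    # dummy thermal forcing minus "back-pressure"
        dp = -0.1 * v                        # dummy pressure response to the flow
        return [dT, dp, dv]

    def antisolar_velocity(p0):
        """Integrate from x=0 (subsolar) to x=pi (antisolar) for a guessed pressure p0."""
        sol = solve_ivp(toy_rhs, (0.0, np.pi), [130.0, p0, 0.0], rtol=1e-9)
        return sol.y[2, -1]

    # Bisect on the subsolar-pressure guess until the antisolar velocity is ~0.
    lo, hi = 0.01, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if antisolar_velocity(mid) > 0.0:    # flow overshoots: the guess is too low
            lo = mid
        else:
            hi = mid
    print(f"converged pressure guess: {mid:.4f}")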
The flow on Io, as Ingersoll’s 1985 solutions show, actually surpasses the speed of sound. Runs with simple parameters and assumptions, such as no drag and a steep temperature gradient, produced flow several times faster than sound. This meant the problem carries the inherent instabilities of flow in a rocket nozzle, and that I would have to employ various techniques to get the solution through the transition from subsonic to supersonic flow. This required a lot of numerical precision, since round-off and truncation errors feed the instability as the flow approaches the speed of sound.
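The difficulty is easiest to see in the standard quasi-one-dimensional nozzle relation, where the velocity gradient picks up a factor of 1/(1 − M²). This is not the model's exact equation, but it shows how strongly any small error is amplified near Mach 1:

    # In quasi-1D nozzle-like flow, dv/dx scales like S(x) / (1 - M**2),
    # so round-off or truncation error in M is hugely amplified as M -> 1.
    for M in [0.6, 0.9, 0.99, 0.999, 0.9999]:
        print(f"M = {M:6.4f}   amplification 1/(1 - M^2) = {1.0 / (1.0 - M**2):12.1f}")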
Getting all the equations into Python wasn’t hard; making sure the code ran as expected was a different story. Even using all of the significant figures available in a 32-bit float, my solutions would either blow up or decay exponentially at Mach 0.6 or 0.7. I needed my solutions to get closer to Mach 1, so the next best thing I could do was to increase the resolution of my spatial grid. A shorter spatial step minimizes the error carried from one point to the next, but it also forces the computer to do more calculations, which becomes very time-intensive at high resolution. Because each step of the calculation feeds into the next, there was no pathway for parallelization and the code could only run on a single core. To save time, I needed to change the grid resolution dynamically as the flow approached Mach 1.
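A sketch of the kind of adaptive stepping this implies, with dummy update and Mach-number functions standing in for the real model, and refinement kicking in only near the sonic transition:

    import numpy as np

    def step(x, state, dx):
        """Placeholder single integration step; numbers are dummies for illustration."""
        T, p, v = state
        return (T - 25.0 * dx, p, v + 45.0 * dx)

    def mach(state, gamma=1.29, R=130.0):
        """Rough Mach number from the dummy state (placeholder gas properties)."""
        T, p, v = state
        return v / np.sqrt(gamma * R * T)

    x, state = 0.0, (130.0, 1.0, 0.0)
    dx_coarse, dx_fine = 1e-2, 1e-5
    while x < np.pi and mach(state) < 0.999:
        # refine the grid only where it matters, near the sonic transition
        dx = dx_fine if mach(state) > 0.9 else dx_coarse
        state = step(x, state, dx)
        x += dx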
Another problem I encountered was how to store the data while the code was running. Previous codes I had worked on were simple, and the output dataset did not have to be stored right away. The Io code, with a small enough spatial step, can generate billions of data points, enough to fill RAM and crash the computer; some of my more ambitious runs were forcibly terminated for taking up too much memory on the computer clusters at the Oxford physics department. Adding diagnostics to your code is also very important: most of the time when your code crashes, you won’t know why or how, you just know that it crashed.
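One way to handle both issues (a sketch, not the actual Io code) is to flush results to disk in fixed-size chunks and write a short diagnostic line at each flush, so a crashed run still leaves partial data and a clue about where it died:

    import numpy as np

    CHUNK = 100_000
    buffer = np.empty((CHUNK, 3))
    n_buf, n_chunk = 0, 0

    def record(sample, logfile="run.log"):
        """Buffer one (T, p, v) sample; write a .npy chunk and a log line when full."""
        global n_buf, n_chunk
        buffer[n_buf] = sample
        n_buf += 1
        if n_buf == CHUNK:
            np.save(f"chunk_{n_chunk:05d}.npy", buffer)
            with open(logfile, "a") as log:
                log.write(f"flushed chunk {n_chunk}, last sample {sample}\n")
            n_buf, n_chunk = 0, n_chunk + 1

    # Example: a long run that never holds more than CHUNK rows in memory.
    # (A real run would also flush whatever remains in the buffer at the end.)
    for i in range(250_000):
        record((130.0, 1.0, 0.01 * i))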
Juggling the demands of keeping every variable as precise as it can be, making the grid resolution small enough, and keeping the run within the capabilities of your computer, I can tell you that I didn’t come close to a working code on the first try. The problems I encountered weren’t explicitly mentioned in the paper the model was based on, and I am still very curious how it was done in 1985 with far less computational power than we have now. It took many attempts at changing how and where the resolution increased to reduce the instability while still running on my computers in a reasonable time. When I switched planetary bodies from Io to a hypothesized snowball Earth, with more physics added, I still needed to make many small alterations just to get some good runs in.
You can learn a few things from this. First, reproducing experiments and results from earlier publications may be much harder than you think. Second, programming skills are vital across planetary science. And while building a program from physical models is a science, getting it to run within your machine’s capabilities, through increments of small changes, is very much an art.
