Use solve_ivp without the py-pde wrapper, but retain fields and grids from py-pde, for more readable code #44
Conversation
…eureux. Expressions become even more involved, unfortunately.
… is the opposite of what is expected. Also end up with equally unstable solutions, i.e. the well-known oscillations along the depth axis. Apparently the sparsity matrix does make the solver try to find a better solution, but it ultimately fails. Also fixed a bug in Derive_Jacobian.py which luckily was not present in the actual rhs computations. Implemented first steps in setting up the structure for Jacobian computations, so beyond the sparsity matrix. A lot of Numba, obviously.
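A minimal sketch of how a Jacobian sparsity matrix can be handed to SciPy's BDF solver; the problem size, the tridiagonal pattern and the stand-in right-hand side below are assumptions for illustration, not the project's actual setup:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.sparse import diags

n = 200                                        # assumed number of unknowns
sparsity = diags([1, 1, 1], offsets=[-1, 0, 1], shape=(n, n))

def rhs(t, y):
    # stand-in right-hand side; the project's real rhs lives elsewhere
    return -y

sol = solve_ivp(rhs, (0.0, 1.0), np.ones(n), method="BDF",
                jac_sparsity=sparsity)
print(sol.success)
```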
…ent yet. However, it is possible to integrate over longer times without an overflow error. But also running into memory overflows.
…r all Jacobian indices. I had to bypass a couple of Numba's limitations to get that working. However, it ultimately leads to a memory overflow for long integrations. Removing all njit decorators in Compute_jacobian.py makes it faster! So weird. So something is still wrong at the Numba level there. It is also clear now that, even with a Jacobian, the integrations fail and lead to bogus results such as negative cCO3 concentrations and negative values for Phi, so log Phi leads to a "FloatingPointError: invalid value encountered in log". Big bummer. Perhaps try Radau instead of BDF. Or LSODA?
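For reference, a small illustration of the FloatingPointError mentioned above, assuming invalid operations have been promoted to exceptions via np.seterr; the porosity values are made up:

```python
import numpy as np

np.seterr(invalid="raise")           # promote invalid operations to exceptions
phi = np.array([0.5, -0.01])         # hypothetical porosity with a negative entry
try:
    np.log(phi)
except FloatingPointError as err:
    print(err)                       # "invalid value encountered in log"
```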
…h slower than the noncompiled version. I still do not understand why and I tried some things to figure this out, e.g. by not using the jac00...jac44 functions, but that did not make a difference. The Jacobian_adjusted_to_reuse_common_factors_and_powers_cleared.txt file should have all the correct Jacobian terms since it comes directly from Derive_Jacobian.py after which I substituted some terms in an editor.
… a vectorize decorator to the overloaded np.heaviside function with the help of numba/numba@2ae154f
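A hedged sketch of what such a vectorized Heaviside could look like; the function name and signature are illustrative and follow np.heaviside's convention for the value at zero, not necessarily the exact overload from the linked Numba commit:

```python
import numba
import numpy as np

@numba.vectorize(["float64(float64, float64)"], nopython=True)
def heaviside(x, x2):
    # value at x == 0 is x2, matching np.heaviside's convention
    if x < 0.0:
        return 0.0
    elif x == 0.0:
        return x2
    return 1.0

print(heaviside(np.array([-1.0, 0.0, 2.0]), 0.5))   # [0.  0.5 1. ]
```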
…ange has been replaced by range. This is done to investigate whether there is an effect on memory use and run time. Less memory use is expected single-threaded, and perhaps the run times will not increase, since integrations seem to scale very poorly with the number of available cores.
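A minimal sketch of the prange-to-range change, with illustrative function names rather than the project's own:

```python
from numba import njit, prange
import numpy as np

@njit(parallel=True)
def rhs_parallel(y, out):
    for i in prange(y.size):       # multi-threaded loop
        out[i] = -y[i]

@njit
def rhs_serial(y, out):
    for i in range(y.size):        # single-threaded replacement
        out[i] = -y[i]

y = np.ones(10)
out = np.empty_like(y)
rhs_serial(y, out)
```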
…threaded computations.
…ning that off may lead to more accurate results. Also, 'nogil = True' is no longer needed for single core computations.
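Assuming the flag referred to is Numba's fastmath, a small illustration of the two decorator settings mentioned:

```python
from numba import njit

@njit(fastmath=False)      # keeping fastmath off can give more accurate results
def square(x):
    return x * x

@njit                      # nogil=True omitted: not needed for single-core runs
def increment(x):
    return x + 1.0
```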
… in combination with calling functional Jacobians. This was attributed to repeated Numba compilations, possibly from Numba problems with nested functions. That is why the compute_all_Jacobian_elements nested function has been removed (as a separate function), but slowness persisted.
…obians, since that was a long time ago, but as far as I remember none of these runs were successful. This is confirmed by more recent results when we provided a Jacobian sparsity matrix, see commit 2197188: the diagonals should be banded, i.e. have a width of more than one element, in order for the Jacobian sparsity matrix to enhance integration. The same should apply for analytical Jacobians. However, the off-diagonal elements of an analytical Jacobian matrix will be very hard to compute. This means that at this point we will stop our efforts on deriving analytical Jacobian matrices and only provide Jacobian sparsity matrices. For some background on why Jacobian matrices are banded, please see the literature, e.g. equation 2.58 (page 28) of Finite Difference Methods for Differential Equations by Randall J. LeVeque (https://edisciplinas.usp.br/pluginfile.php/41896/mod_resource/content/1/LeVeque%20Finite%20Diff.pdf). This example is about the Jacobian for solving the PDE describing the motion of a pendulum with a certain mass at the end of a rigid (but massless) bar.
…in'. This means that parameters will be in a separate file and not in ScenarioA.py. Also, we want to include the marlpde folder.
… in the literature.
…als, i.e. the diagonals will have the width of only one depth node. We now know that due to discretisation, there should also be non-zero elements adjacent to the diagonals, but these are very hard to compute.
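A hedged sketch of such a banded (block-tridiagonal) sparsity pattern for five coupled fields discretised on N depth nodes; the sizes and the all-ones coupling block are assumptions:

```python
import numpy as np
from scipy.sparse import csr_matrix, diags, kron

N = 100                                     # assumed number of depth nodes
tridiag = diags([1, 1, 1], offsets=[-1, 0, 1], shape=(N, N))
coupling = np.ones((5, 5))                  # five fields, all mutually coupled
jac_sparsity = csr_matrix(kron(coupling, tridiag))
print(jac_sparsity.shape)                   # (500, 500)
```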
…be derived after 'eq' has been defined. 'number_of_depths' --> 'Number_of_depths'.
…d in this module. However, it is now calculated in the parameters module, so we no longer need these imports here.
…__' it will complain that it does not know 'no_t_eval'.
…ons, times and metadata using py-pde's FileStorage class seems cumbersome when no tracking is applied. Therefore I reverted to the regular way of saving Numpy arrays in an hdf5 file. All the metadata, i.e. 'stored_parms', which is a single dict, can be stored as well, except for the Jacobian sparsity matrix, since that will give rise to 'TypeError: Object dtype dtype('O') has no native HDF5 equivalent'. This csr_matrix has to be converted to an ndarray first, I reckon.
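A minimal sketch of this storage approach with h5py, using made-up parameter values; the dict goes into HDF5 attributes and the csr_matrix is densified with .toarray() first:

```python
import h5py
import numpy as np
from scipy.sparse import csr_matrix

stored_parms = {"Phi0": 0.6, "b": 2.0}               # assumed example values
jac_sparsity = csr_matrix(np.eye(3))

with h5py.File("solution.h5", "w") as f:
    f.create_dataset("solution", data=np.zeros((5, 200)))
    for key, value in stored_parms.items():          # scalar metadata as attributes
        f.attrs[key] = value
    # a csr_matrix has no native HDF5 type; store the dense pattern instead
    f.create_dataset("jac_sparsity", data=jac_sparsity.toarray())
```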
…eeded for this branch.
…che=False may be noticeably slower for short runs, but sometimes causes a run to halt at the start when a compiled object is missing.
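For illustration, the Numba cache flag in question (the function name is hypothetical):

```python
from numba import njit

@njit(cache=False)   # recompile on every run: slower start-up, but no
def rhs_term(x):     # dependence on previously written compilation artefacts
    return 2.0 * x
```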
… slightly different data format that solve_ivp returns, i.e. solution.y contains the solutions for the five fields and the [:, -1] indexing gives the last ones across all depths. integrate_equations now gives six return values instead of five. solve_ivp uses 'first_step' instead of 'dt'.
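A small sketch of that indexing and the first_step argument, with a stand-in right-hand side:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    return -y                               # stand-in right-hand side

sol = solve_ivp(rhs, (0.0, 1.0), np.ones(10), method="BDF",
                first_step=1e-6)            # 'first_step' replaces py-pde's 'dt'
final_state = sol.y[:, -1]                  # last time step for all unknowns
```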
… Numba-based evaluations of the right-hand sides. The Solver dataclass now provides for that. A conditional has been added to check if the 'jac_sparsity' attribute exists. It will not exist for explicit solvers.
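A hedged sketch of such a conditional; the Solver class below is an illustrative stand-in for the project's dataclass, not its actual definition:

```python
class Solver:
    """Illustrative stand-in for the project's Solver dataclass."""
    def __init__(self, method="BDF", jac_sparsity=None):
        self.method = method
        if jac_sparsity is not None:          # explicit solvers omit the attribute
            self.jac_sparsity = jac_sparsity

solver = Solver(method="RK45")
options = {"method": solver.method}
if hasattr(solver, "jac_sparsity"):
    options["jac_sparsity"] = solver.jac_sparsity
```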
…ield, this is more convenient than a single dimension covering all depths for all fields. Also store in this way, as an hdf5. Have 'integrate_equations' only return the final solution, since we only use that for plotting.
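A minimal sketch of the reshaping described, with assumed field and depth counts:

```python
import numpy as np

no_fields, no_depths = 5, 200                          # assumed sizes
flat_solution = np.zeros(no_fields * no_depths)        # as returned by solve_ivp
per_field = flat_solution.reshape(no_fields, no_depths)
```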
…stored results taking excessive disk space.
…of 6. And only the final solutions, which makes comparison with the ground truth somewhat simpler in terms of indexing.
… formatting of floats is often more readable.
…store them in the hdf5 file, one has to iterate over this list and create a separate dataset for each list item, i.e. for each ndarray.
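A small sketch of that workaround, with a hypothetical list of arrays:

```python
import h5py
import numpy as np

arrays = [np.zeros(200), np.ones(200)]                 # hypothetical list of ndarrays
with h5py.File("output.h5", "a") as f:
    for i, arr in enumerate(arrays):
        f.create_dataset(f"item_{i}", data=arr)        # one dataset per list item
```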
I am currently looking at issue #43, which is about documentation. Last bullet: In
and
without the @EmiliaJarochowska Pls let me know if tracking (and saving) U at the bottom has to be re-added. |
After discussion: will not be re-added anytime soon; instead, a similar feature in rhythmite will be deployed. |
…en merged into 'main'. Grammar correction. A constant porosity diffusion coefficient is now in all branches. Functional Jacobians turn out not to be applicable for this project, because of the discretization. The use of py-pde is now limited to its ScalarField and CartesianGrid.
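For context, a minimal example of that remaining py-pde usage; the grid extent, resolution and field value are assumptions, not the project's parameters:

```python
from pde import CartesianGrid, ScalarField

# a 1D depth grid and a constant-valued field defined on it
depths = CartesianGrid(bounds=[[0.0, 500.0]], shape=[200])
porosity = ScalarField(depths, data=0.5)
```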
FYI I managed to crash it:
by setting Phi0 to 0.9, Phi00 to 0.8, and b to 2.66667. Otherwise works and is indeed very fast. But the number of solver options and their implications for the numerical methods strengthens the argument that some additional documentation is needed. I will continue this discussion in the respective issue. |
Pylint highlights some minor issues here and there but the code works so I'll merge now.
I interrupted that run but here are the errors:
|
Yes, I think we discussed those cases as part of issue #36 . |
Btw, you shared the warning, was there also a crash? We do not want marlpde to crash on very small time steps, so that is why we have under="warn" |
I am not sure if it crashed, it got stuck at 26% and didn't move, just displayed the warning. Shall I re-try? |
Yes, please retry, I'd be surprised if it really halted. |
It didn't crash, just stuck at 26%. Been at 26% for > 1 h now without any progress. But again, this is a crash case. |
Okay, this is something we do not want to solve, let's leave it. |
Oh, now I discovered that it actually finished. But with the following:
and
but yes, it's not to be fixed. |
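As a small aside on the under="warn" setting mentioned earlier in this thread: with that setting an underflow only emits a RuntimeWarning instead of raising, e.g.:

```python
import numpy as np

np.seterr(under="warn")            # underflow warns instead of raising
tiny = np.float64(1e-200)
print(tiny * tiny)                 # underflows to 0.0 with a RuntimeWarning
```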
Given the arguments from this closed discussion, I think we should proceed with the methodology from the Use_solve_ivp_without_py-pde_wrapper_branch and merge it into main to include this in our release. In summary, the Use solve ivp without py pde wrapper branch offers these features:
- … main branch.
- … solve_ivp, such as a Jacobian sparsity matrix

while the unique features of the main branch are less important to us.