
Include ghost cells when clipping netcdf and handle large topography #540

Merged (4 commits) on Nov 10, 2023

Conversation

@bolliger32 (Contributor) commented Aug 13, 2022

Addresses #539. It also addresses a second issue: previously, the total number of grid cells in the supplied topography was limited to the range of a 4-byte integer (2**31 - 1). That limit was exceeded, for example, when using a large chunk of a global 15-arcsecond DEM (which has occasionally been necessary when running a global model, to avoid instability caused by boundary fluxes). This PR raises the limit to that of an 8-byte integer (2**63 - 1).
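
For scale: a 15-arcsecond grid has 3600/15 = 240 cells per degree, so a global DEM is 86,400 × 43,200 cells, about 3.73 × 10**9 in total, well past 2**31 - 1 (about 2.15 × 10**9).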

@mandli (Member) commented Aug 14, 2022

I assume this now works for you?

@bolliger32 (Contributor, Author)

@mandli I'm debugging some strange behavior where I get a segfault for certain north-south domain choices. For example, (-180, -20, 180, 80) gives a segfault, but (-180, -10, 180, 80) does not. And the periodic conditions are only on the east-west boundaries. I'll change this to draft status until I sort that out.

@bolliger32 changed the title from "Include ghost cells when clipping netcdf" to "Draft: Include ghost cells when clipping netcdf" on Aug 14, 2022
@mandli (Member) commented Aug 14, 2022

That is odd. I am not sure I have tried anything beyond [-60, 60] in latitude, but I also don't see why that would matter.

@bolliger32 (Contributor, Author)

It could be due to our DEM somehow. It's indexed by grid-cell centers, so the furthest-west coordinate is just slightly above -180 and the furthest-east is just slightly below 180. So perhaps it's having issues integrating the boundary grid cells when the boundaries go right up to the 180/-180 line? I'm not sure why changing the N-S domain boundary would affect this, though.

@bolliger32 (Contributor, Author)

Hmm, that didn't seem to be the issue.

@bolliger32 (Contributor, Author)

Oof, it was an integer overflow in calculating mx*my. I'm going through and changing these to integer(kind=8) and will see if I can get it all sorted.
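
A minimal sketch of the failure mode, with illustrative values rather than the actual GeoClaw variables:

    program overflow_demo
        implicit none
        integer(kind=4) :: mx, my
        integer(kind=8) :: n_cells

        mx = 86400   ! columns in a global 15-arcsecond DEM
        my = 43200   ! rows

        ! mx * my is evaluated in 4-byte arithmetic, and
        ! 86400 * 43200 = 3,732,480,000 > 2**31 - 1, so it
        ! overflows (typically wrapping to a negative value)
        print *, mx * my

        ! the fix: promote to kind=8 before multiplying
        n_cells = int(mx, kind=8) * int(my, kind=8)
        print *, n_cells   ! 3732480000
    end program overflow_demo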

@bolliger32 (Contributor, Author)

OK, I think I've handled it here. But it could definitely use some 👀 to make sure I didn't screw something else up.

@bolliger32 changed the title from "Draft: Include ghost cells when clipping netcdf" to "Include ghost cells when clipping netcdf" on Aug 15, 2022
@bolliger32 changed the title from "Include ghost cells when clipping netcdf" to "Include ghost cells when clipping netcdf and handle large topography" on Aug 15, 2022
@bolliger32 (Contributor, Author)

@mandli this should be ready for review now

@mandli (Member) commented Aug 15, 2022

This looks good to me. @rjleveque and @mjberger, do you have any comments?

@bolliger32 (Contributor, Author)

OK, this seems to be working, but it may have opened up another issue. I'm now seeing patches that contain ONLY ghost cells, which breaks the bc2amr subroutine. I believe all patches should have at least one interior row or column(?). But with a domain [-180, -10, 180, 80], periodic BCs on the E-W boundaries, and extrapolation on the N-S boundaries, I'm seeing a patch with ylo_patch=80 and yhi_patch=82 (i.e., just the northern ghost cells, entirely above the domain's top edge). This causes array indexing errors in bc2amr (caught when compiling with gfortran's -fbounds-check flag)... does this sound familiar by any chance?

@bolliger32 (Contributor, Author)

FWIW, I confirmed that this issue exists after just the first commit in this PR, so it is likely unrelated to the kind=4 to kind=8 integer changes in the second commit. I doubt that the changes to how the netcdf topography is clipped caused this behavior either, so my hunch is that the issue would occur on the current master branch as well. It's a little difficult to test for sure, as I don't have a global non-netcdf DEM.

Also, I confirmed that the issue does NOT occur without periodic boundary conditions in the E-W direction. So something about setting periodic BCs in the E-W direction causes some patches to contain only ghost cells on the north or south side of the domain, which then causes the array indexing problems in the bc2amr step.

@bolliger32 (Contributor, Author)

FWIW, this can be handled by inserting this block above any of the BC checking in bc2amr, but I want to make sure this isn't surprising behavior that might be indicative of a larger issue:

    ! the patch lies entirely outside the domain on some side (it
    ! contains only ghost cells), so there are no boundary cells to set
    if (xlo_patch == xupper .or. xhi_patch == xlower .or. ylo_patch == yupper .or. yhi_patch == ylower) then
        return
    end if

@mandli (Member) commented Aug 16, 2022

There are mechanisms for preventing what you are describing from happening, so I am a bit surprised. Have you tried the global simulation at all, to see if it has the same problem with similar settings?

@bolliger32 (Contributor, Author) commented Aug 16, 2022

I haven't, but I can give that a whirl when I get a sec. Do you have the topography files handy, by any chance?

Relatedly, am I correct that the spherical BC implementation is not complete? In the docs, it looks like this should be possible with the boundary condition set to 4. However, in the clawutil Python code, the ClawInputData object does not accept any values above 3. And in amr2.f90, "5" is recognized as the spherical domain, and it is assumed that this will be specified only for the N-S direction, while E-W will be periodic.

Do you know if this would be expected to work with bc_lower = [2, 5] and bc_upper = [2, 5] and domain [-180, -90, 180, 90] (assuming the ClawInputData check were changed to allow this)? Or is this functionality not yet fully implemented?

@mandli (Member) commented Aug 16, 2022

Not offhand. I think we just tried to take strips off of the NGDC site. I imagine you could get a faster set of data from GEBCO, though.

In terms of the bounds, I would not expect the grid to work well that close to the poles, given the distortion in the grids.

@bolliger32 (Contributor, Author)

Cool, will give it a try, thanks! So it sounds like even if we were to modify ClawInputData to allow BCs of type 5, the spherical-domain functionality probably doesn't work anyway, given the distortion?

@bolliger32 (Contributor, Author)

The docs do point to this paper by @rjleveque and others about using a logically rectangular grid to represent a spherical domain, so I assume there was some attempt to make this work in Clawpack. But perhaps it's not fully implemented?

@mandli (Member) commented Aug 16, 2022

That actually works pretty well, but it requires a significant mapping of the grid. It is also, of course, not lat-long, which is the real problem. If you really want to do the entire globe, this would be a good way to do it. We would have to go in and change a bunch of stuff to handle the mapping, though. Maybe coordinate system 3?

@bolliger32 (Contributor, Author)

Ah, interesting! At the moment this is a question borne more out of curiosity than something we're ready to dive into. I don't know that I follow how the mapping would occur, but I understand the issues with lat/lon grid distortion. Maybe it's worth a conversation at a later date if it's something worth exploring. The periodic E-W boundary conditions are working great, though (outside of this unexpected ghost-only-patch issue), and they are solving our issues with storms that get close to the dateline and cause significant boundary fluxes when we place a domain boundary there to coincide with our [-180, 180] DEM.

In the meantime, I'll see if that example script gives the same error.

@rjleveque (Member)

@bolliger32, thanks for your work on this!

Regarding your question #540 (comment), I am not sure offhand why setting periodic BCs in the east-west direction would cause some grid patches to contain only ghost cells. Maybe @mjberger has some thoughts on this?

@mjberger (Contributor) commented Aug 24, 2022 via email

@mandli (Member) commented Aug 25, 2022

@mjberger I thought we had addressed this already but maybe we missed something?

@mandli (Member) commented Sep 7, 2022

Just wanted to check whether you got this to work, @mjberger.

@mjberger (Contributor) commented Oct 11, 2022 via email

@rjleveque (Member)

Does this PR now fix the periodic BC issue? I was looking again at this comment from Aug 16, which suggested adding

    if (xlo_patch == xupper .or. xhi_patch == xlower .or. ylo_patch == yupper .or. yhi_patch == ylower) then
        return
    end if

to bc2amr.f90, but I don't see that it has been added in this PR.

Is this really the right thing to do? It seems to just check whether the edge of the ghost cells on at least one side of the patch happens to align exactly with the domain boundary on that side. In that case no ghost-cell BCs are needed on that side, but they may still be needed on one of the other three sides?

But I may be missing the point here...

@bolliger32 (Contributor, Author)

@rjleveque good point. The title of this PR was updated to reflect that it does not address what I think is the root issue, namely: why are we getting patches consisting only of ghost cells that lie entirely outside the domain? Instead, it just makes sure that when loading netcdf topo files, we clip the netcdf to include the ghost cells rather than just the computational domain. The missing ghost cells were what originally caused the segfault and led me to discover these "fully ghost" patches, but the fix also addresses issues that could arise more broadly (e.g., when users have large domains with large topo files, or if a user-defined boundary condition uses topography from ghost cells just outside the domain). So I think this PR should only change the netcdf loading and the handling of large topo files.
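
A rough sketch of the clipping idea (variable names here are illustrative, not the actual topo-reading code):

    ! widen the requested window by the ghost-cell layer before
    ! clipping the netcdf DEM, so cells just outside the
    ! computational domain still have topography under them
    program clip_with_ghosts
        implicit none
        integer, parameter :: nghost = 2      ! AMR ghost-cell width
        real(kind=8) :: xlower, xupper, dx, clip_lo, clip_hi

        xlower = -180.d0                      ! E-W domain edges
        xupper = 180.d0
        dx = 15.d0 / 3600.d0                  ! 15 arcseconds in degrees

        clip_lo = xlower - nghost * dx        ! extend past the domain
        clip_hi = xupper + nghost * dx
        print *, clip_lo, clip_hi
    end program clip_with_ghosts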

However, after implementing this fix, a separate segfault issue arose when using periodic boundary conditions: the array indexing in bc2amr goes out of bounds when trying to set values within these "ghost patches". This PR does NOT address that. I could address it via a hacky approach using the snippet you posted, but my understanding was that this situation should not occur in the first place, so it represents something that should be investigated upstream rather than caught and handled within bc2amr. Does that sound right? I can always add that snippet to this PR to handle these cases if desired.

@bolliger32 (Contributor, Author)

@mandli flagging for review
