Opened 2 months ago
Last modified 10 hours ago
#140 assigned defect
grid convergence should be computed for all georeferenced data
Reported by: | Eric C. Landgraf | Owned by: | Olly Betts
---|---|---|---
Priority: | minor | Milestone: |
Component: | cavern | Version: |
Keywords: | | Cc: |
Description
Currently, grid convergence is calculated only when *declination auto is set (with a representative coordinate). For normal people dealing with normal data, this is fine.
However, if any survey has *declination set to a fixed value, this is treated as the difference between magnetic north and north in the output coordinate system (i.e. grid north).
If input and output coordinate systems are set, we should *always* compute grid convergence, as that is an expected (and potentially impactful) difference.
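As a minimal illustration of the behaviour described above (a hypothetical fragment; coordinate systems and values are made up):

```
*cs UTM30N
*cs out EPSG:23030
*fix entrance 410600 4234750 1560
; fixed declination: currently treated as the full magnetic-to-grid
; correction, so grid convergence is not separately computed
*declination 2.5 degrees
entrance 2 10.00 050 -05
```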
Attachments (4)
Change History (15)
comment:1 Changed 2 months ago by
comment:2 Changed 2 months ago by
I think "inheriting" the `*declination auto` location by default is a reasonable and "expected" behaviour. Allowing nil (different from 0°) declinations seems worrisome, but may be moot in practice. My guess is most survex users who *care* about grid convergence and declination shifts over an area do set local `*declination auto` statements with centerpoints.
One thing missing in your example: `*declination 3.1 degrees` is the format (and we do error when there are no units). There is also a potential for `*calibrate declination -3.1 degrees`: grid convergence should still be applied in that case as well (although it need not allow a location).
Since grid convergence depends both on location and output projection, I wonder if it would be semantically cleaner to set the grid convergence centerpoint using e.g. `*cs out epsg:1234 <x> <y> <z>`. But handling it as part of declination is not inappropriate, and the coordinate system needs to be set for `*declination auto` to work regardless.
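For comparison, a sketch of the existing `*declination auto` form the comment refers to, where a representative coordinate is given and both declination and convergence are computed there (values are purely illustrative):

```
*cs UTM30N
*cs out EPSG:23030
*fix entrance 410600 4234750 1560
*date 2005.07.01
; auto declination at a representative location: the IGRF model value
; and the grid convergence are both computed at this coordinate
*declination auto 410600 4234750 1560
```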
comment:3 Changed 2 months ago by
allowing nil (different from 0°) declinations seems worrisome
I'm unclear why this is a worry - did you miss the "Then we could warn about data processed without a declination value" part?
One thing missing in your example: `*declination 3.1 degrees` is the format (and we do error when there are no units).
Oh yes, should have the units of course.
`*calibrate declination -3.1 degrees`
That's handled by the same mechanism as `*declination` so should just work. It's not been the recommended way to specify declination since 2015, so there's definitely no need to try to extend it to support a location.
`*cs out epsg:1234 <x> <y> <z>`
There's some logic to that, but it seems a feature few users will actually care enough to use and nobody has actually even asked for. We'd also need to come up with rules for precedence (given most existing data will be relying on the documented use of the declination location for calculating convergence, we need to still support that), which is then more complexity to have to explain to users (and to implement and write tests for). Overall I think it makes more sense to stick with the scheme of calculating them at the same location. (I'll also note that this is forced by Therion's approach, and you don't even get direct control of the location used with Therion.)
comment:4 Changed 2 months ago by
allowing nil (different from 0°) declinations seems worrisome
I'm unclear why this is a worry - did you miss the "Then we could warn about data processed without a declination value" part?
Rather than thinking about this as "nil" (which seems to be a term you've invented rather than one I've used to refer to this anywhere) perhaps it's more helpful to think of it as "implicit 0" vs "explicit 0". In fact this is really already the case - if you don't set a declination then 0 is used, but you can also set the declination to zero explicitly - these have the same effect but are really different cases. We don't currently distinguish between them, but it would probably be useful to warn if that "implicit 0" is actually used somewhere because that's probably a bug in the data.
comment:5 Changed 4 weeks ago by
Component: | Other → cavern
---|---
Status: | new → assigned
comment:6 Changed 3 weeks ago by
it would probably be useful to warn if that "implicit 0" is actually used somewhere because that's probably a bug in the data.
Perhaps we should only warn about this if a coordinate system is set. If you just have a simple cave surveyed on one trip (or trips over a small time interval), which isn't geolocated and has one or no fixed point, then declination is not so relevant. That may seem like a lot of conditions, but it's actually a pretty common case. If we warn about it, people will probably just add `*declination 0` to shut the warning up, and then there's a risk that doesn't get removed when the data is later geolocated, so the warning doesn't fire when it would be useful.
comment:7 Changed 3 weeks ago by
Rather than thinking about this as "nil" (which seems to be a term you've invented rather than one I've used to refer to this anywhere) perhaps it's more helpful to think of it as "implicit 0" vs "explicit 0".
We're thinking the same thing. `nil` as in "unspecified", c.f.:
From The Collaborative International Dictionary of English v.0.48 [gcide]:
Nil \Nil\, n. (computers)
A special value for a variable used in certain computer languages to mean no assigned value, to be distinguished from the value zero. [PJC]
I used the word nil because `-` is an explicit unspecified value, rather than an implicit unspecified value.
We should warn or even error for georeferenced data with an unspecified declination (neither auto nor explicit-valued). This would include data where `*date` is not set or is invalid and `*declination` is not set. I would define georeferenced data as that with an explicit output coordinate system and an explicit fixed point. In this case, we cannot give the user "reasonable" data.
We might also warn when an explicit declination differs from the computed auto declination: this indicates one of 1) an error in the data, 2) using declination for compass correction, or 3) using compass correction for declination. The last of these is a "valid" use case, but not a common one, and should be done with `*declination 0 degrees` so as not to also trigger the warning case above.
Those warnings apply to compass-based survey. `*data cartesian` is a special case where someone may want grid convergence applied without applying declination: many cave maps are in local coordinate systems with grid north = true north, although the case does exist where grid north = magnetic north and `*declination 5 degrees` is meaningful. We could advise setting `*declination 0 degrees` if cartesian grid north = true north, and again warn when there is an output coordinate system and no declination at all. I'm not sure what the least surprising behaviour here would be, as it requires knowledge of the input data's grid: in fact, grid north = grid north is also a valid case here!
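A sketch of the cartesian case discussed above (hypothetical fragment; whether an explicit zero is appropriate depends on what the input grid north actually is):

```
*data cartesian from to dx dy dz
; grid north of this local survey is true north, so declination is
; explicitly zero and only grid convergence would need applying
*declination 0 degrees
1 2 10.2 3.5 -1.0
```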
To think about the flow for what I want:
- user sets input and output coordinate systems, and has at least one fixed point. If the grid convergence coordinate is not set, we warn on this ("treating true north as grid north") and don't compute grid convergence.
- if there is a point at which to compute grid convergence, we do so and carry this through using normal `*begin`/`*end` scoping to apply it to user-set declination values.
- if auto-declination is also set, we apply computed declination+grid convergence to all other compass data, erroring (?) on missing dates (we already do this).
- if there is an implicit 0 declination with no auto-declination, we error: the user asked for an output coordinate system and we can't give them "reasonable" data (exception? cartesian data, where we may only warn).

There is a complex case where a user sets declination on some data but does not have auto declination configured. If the user sets no declinations, we could merely say "grid north is magnetic north" in our statistics block or when we set our fixed point. If the user sets some declinations, we can't warn/error in useful places---only at the end! How should we handle this data? But I think this case is somewhat beyond the scope of this ticket.
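The scoping in the flow above might look like this in practice (a hypothetical layout under the proposed scheme; coordinate systems and values are made up):

```
*cs UTM30N
*cs out EPSG:23030
*fix entrance 410600 4234750 1560
*begin cave
*declination auto 410600 4234750 1560
*begin oldseries
; under the proposal, a user-set declination here would also get grid
; convergence applied, scoped to this *begin/*end block
*declination 2.1 degrees
*end oldseries
*end cave
```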
comment:8 Changed 3 weeks ago by
We should warn or even error for georeferenced data with an unspecified declination (neither auto nor explicit-valued). This would include data where `*date` is not set or is invalid, and `*declination` is not set.
We already do, and there's test coverage for it:
tests/cmd_declination_auto.out:./cmd_declination_auto.svx:11: warning: No survey date specified - using 0 for magnetic declination
tests/cmd_declination_auto.out:./cmd_declination_auto.svx:27: warning: No survey date specified - using 0 for magnetic declination
tests/utm.out:./backread.dat:22: warning: No survey date specified - using 0 for magnetic declination
If you have a situation where this warning isn't firing but should, please can you provide a testcase.
It would also be very useful to have testcases for the other situations you think should be warnings or errors, as that would avoid me misunderstanding the situation you're trying to describe.
I would define georeferenced data as that with an explicit output coordinate system and an explicit fixed point.
Note that cavern won't invent a fixed point if there's a specified coordinate system so the "and an explicit fixed point" part is redundant.
We might warn also when explicit declination is different from computed auto declination
I'm dubious about this - it would need to be with a tolerance since they're never going to be exactly the same. A new IGRF version will also compute different values to the previous one for recent years, since each version of the model is based on observations for the past but predictive for the future. There also doesn't seem to be a good way to fix or suppress this warning, so maintaining a warning-free dataset becomes hard - you'd pretty much have to copy the declination from the IGRF model to avoid the warning, but you're presumably specifying the declination by hand instead of using the model for a reason.
Re warning vs error, we generally aim not to break processing of existing datasets, so new diagnostics for existing features are a warning when they are likely to exist in real-world datasets and the way they were already (quietly) being handled seems reasonable. I think using 0 for declination or grid convergence is reasonable - certainly more reasonable than Survex 1.4.x+1 refusing to process a dataset which Survex 1.4.x processed.
comment:9 Changed 31 hours ago by
It would also be very useful to have testcases for the other situations you think should be warnings or errors, as that would avoid me misunderstanding the situation you're trying to describe.
I think you said in chat you were working on some testcases?
Changed 18 hours ago by
Attachment: gc-manual-declination.svx added
Changed 18 hours ago by
Attachment: gc-mixed-declination.svx added
Changed 18 hours ago by
Attachment: gc-not-computed.svx added
Changed 18 hours ago by
Attachment: gc-auto-declination.svx added
comment:10 Changed 18 hours ago by
These are my preliminary test cases. The general idea is that all of these except gc-not-computed should output identical svx files; gc-not-computed should warn that there's likely to be grid convergence and declination that is unaccounted for.
comment:11 Changed 10 hours ago by
A few thoughts:
output identical svx files
Presumably you mean `.3d` files?
; yes we pick an out-of-zone coordinate intentionally!
OK, but why? It's not obvious to me.
Also, it's better to use an older date in testcases involving auto declination. The IGRF 14 model is valid until 2030, but it's (inevitably) predictive for dates after it was released and only definitive up to 2020, so the answers IGRF 15 will give for 2020 and later may well change, and they're especially likely to change for 2025. That's particularly fiddly to update for when we're hard-coding the answer in an equivalent manual declination test, so it's simpler to use a date which shouldn't require anything to be updated for a new IGRF model.
Most of this we've now covered in chat, but summarising so it's all together somewhere less ephemeral:
We need a location to compute convergence at, so it's not possible to always compute it for current datasets because `*declination 3.1` doesn't specify a location (nor does the total absence of a `*declination` command being in force for some/all survey data).

Therion calculates convergence (and declination) at the average of all fixed point locations, but I really don't like that approach as it means the convergence (and declination) values all change each time you add a fixed point (albeit probably not by very much), and also if you have a lot of fixed points off in one direction that will skew the location used towards that side of the survey. The other benefit of requiring a location to be specified is that multiple locations can be specified for different areas of a large cave system. Really the only thing going for Therion's approach is that the user doesn't need to concern themselves with where the location is (but that also means they can't even if they want to).
Doing something automatic based on fixed points in Survex would also mean we'd then have two different ways of determining the location to compute convergence at, which seems an unnecessary complication for users to have to understand.
It would also be much harder than you might expect to implement in cavern because we compute an x,y,z vector for each leg as we read it which requires the convergence value which requires knowing where all the fixed points are but we may not have seen all (or indeed any) of them yet - to use the average of fixed points or even to reliably use a/some fixed points, we'd need to read the data twice, or buffer up all the read data.
I think we need to extend `*declination` to allow a location with an explicit value, not just with `auto`. Then we issue a warning for `*declination 3.1` saying we need a location to compute convergence (in the case where we have coordinate systems specified).

It's possible that some people are currently specifying explicit `*declination` values which include convergence, though probably unlikely as we do explicitly document `*declination` as the magnetic/true difference, not magnetic/grid. I tend to think we probably shouldn't support that and should just recommend that anyone currently doing that removes the convergence from the explicit value (or just moves to using `auto`, which is already recommended by the manual).

Perhaps we can also allow inheritance of location, so an explicit `*declination` in a nested survey picks up the location from an enclosing one.
If we do, perhaps the location should be omittable for `auto` too, so you can switch back to `auto` explicitly (and for consistency between `auto` and explicit variants).

We could perhaps also allow just setting the location, without a declination value.
This would allow setting a location in one place for explicit declinations in each survey to inherit, but without that one place needing to use either `auto` or an explicit declination value which would end up being a default. Then we could warn about data processed without a declination value.
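Pulling the proposals in this comment together, the extended syntax might look something like this (purely illustrative: none of these forms exists yet, the exact grammar is not settled in this ticket, and the spelling of the location-only form in particular is a guess):

```
*cs UTM30N
*cs out EPSG:23030
; proposed: explicit value with a location, so convergence can be computed
*declination 3.1 degrees 410600 4234750 1560
*begin sub
; proposed: inherit the location from the enclosing scope
*declination 2.9 degrees
*end sub
; proposed (spelling is a guess): set only the location, with no
; default declination value, for nested surveys to inherit
*declination location 410600 4234750 1560
```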