diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 7a741fdb5..2b705618f 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -34,11 +34,11 @@ from an `Observable`.
 - `src/hatanaka/mod.rs`: the Hatanaka module contains the RINEX Compressor and Decompressor
 - `src/antex/antenna.rs`: defines the index structure of ANTEX format
 
-NAV RINEX
-=========
+Navigation Data
+===============
 
-Orbit instantaneous parameters, broadcasted by GNSS vehicles, are presented in different
-forms depending on the RINEX revision and the GNSS constellation.
+Orbit broadcast parameters are presented in different forms depending on the RINEX revision,
+and their nature may also differ depending on the constellation.
 
 To solve that problem, we use a dictionary, in the form of `src/db/NAV/orbits.json`,
 which describes all fields per RINEX revision and GNSS constellation.
@@ -56,3 +56,60 @@ Introducing a new RINEX type
 `src/meteo/mod.rs` is the easiest format and can serve as a guideline to follow.
 
 When introducing a new Navigation Data, the dictionary will most likely have to be updated (see previous paragraph).
+
+GNSS Constellations
+===================
+
+Supported constellations are defined in the Constellation Module.
+This structure defines both orbiting and geostationary vehicles.
+
+Adding new SBAS vehicles
+========================
+
+To add a newly launched SBAS vehicle, simply add it to the
+rinex/db/SBAS/sbas.json database.
+
+The only mandatory fields are:
+- the "constellation" field
+- the SBAS "prn" field (which is 100 + the PRN number)
+- "id": the name of that vehicle, for example "ASTRA-5B"
+- "launched\_year": the year this vehicle was launched
+
+Other optional fields are:
+- "launched\_month": the month this vehicle was launched
+- "launched\_day": the day of month this vehicle was launched
+
+We don't list vehicles that have not been deployed yet.
+
+Build scripts
+=============
+
+The build script is rinex/build.rs.
+
+It is responsible for building several important but hidden structures.
+
+1. Navigation RINEX specs, described by rinex/db/NAV
+2. Geostationary vehicle identification in rinex/db/sbas/sbas.json,
+which follows the L1-CA-PRN Code assignment specifications (see online specs).
+3. rinex/db/SBAS/*.wkt contains geographic definitions for most
+standard SBAS systems. We parse them as Geo::LineStrings to
+define a contour area for a given SBAS system. This gives one method
+to select an SBAS system from a given location on Earth.
+
+Crate dependencies
+==================
+
+- `qc-traits` and `sinex` are core libraries.
+- `rinex` is the central dependency to most other libraries or applications.
+- tiny applications like `rnx2crx`, `crx2rnx` and `ublox-rnx` only depend on the `rinex` crate
+- `sp3` is a library that only depends on `rinex`
+- `gnss-rtk` is a library that depends on `rinex`, `sp3` and `rinex-qc`
+- `cli` is an application that exposes `rinex-qc`, `gnss-rtk`, `sp3` and `rinex`
+
+External key dependencies:
+
+- `Hifitime` (timing lib) is used by all libraries
+- `Nyx-space` (navigation lib) is used by `gnss-rtk`
+- `Ublox-rs` (UBX protocol) is used by `ublox-rnx`
+
+
diff --git a/README.md b/README.md
index 7c12243c6..9d7d64c9a 100644
--- a/README.md
+++ b/README.md
@@ -22,10 +22,12 @@ and we aim towards advanced geodesic and ionospheric analysis. 
- Seamless .gzip decompression with `flate2` compilation feature - RINEX V4 full support, that includes modern Navigation messages - Meteo RINEX full support -- IONEX and Clock RINEX partial support, will be concluded soon +- IONEX (2D) support, partial 3D support +- Clock RINEX partial support: to be concluded soon - File merging, splitting and pre processing - Modern constellations like BeiDou, Galileo and IRNSS - Supported time scales are GPST, BDT, GST, UTC +- Supports many SBAS, refer to online documentation - Full support of Military codes : if you're working with such signals you can at least run a -qc analysis, and possibly the position solver once it is merged - Supports high precision RINEX (scaled phase data with micro cycle precision) @@ -41,7 +43,6 @@ summon from the "cli" application directly. - QZNSST is represented as GPST at the moment - GLONASST and IRNSST are not supported : calculations (mostly orbits) will not be accurate -- Partial SBAS support : some features are not yet available - The command line tool does not accept BINEX or other proprietary formats - File production is not fully concluded to this day, some formats are still not correctly supported (mostly NAV). @@ -73,7 +74,7 @@ RINEX formats & applications |----------------------------|-------------------|---------------------|----------------------|----------------------|--------------------------| ---------------------| | Navigation (NAV) | :heavy_check_mark:| Ephemeris :construction: V4 :construction: | :heavy_check_mark: :chart_with_upwards_trend: | :construction: | Orbit parameters, Ionospheric models.. | Epoch iteration | | Observation (OBS) | :heavy_check_mark:| :heavy_check_mark: | :heavy_check_mark: :chart_with_upwards_trend: | :construction: | Phase, Pseudo Range, Doppler, SSI | Epoch iteration | -| CRINEX (Compressed OBS) | :heavy_check_mark:| RNX2CRX1 :heavy_check_mark: RNX2CRX3 :construction: | :heavy_check_mark: :chart_with_upwards_trend: | :construction: | see OBS Data | Epoch iteration | +| CRINEX (Compressed OBS) | :heavy_check_mark:| RNX2CRX1 :heavy_check_mark: RNX2CRX3 :construction: | :heavy_check_mark: :chart_with_upwards_trend: | :construction: | Phase, Pseudo Range, Doppler, SSI | Epoch iteration | | Meteorological data (MET) | :heavy_check_mark:| :heavy_check_mark: | :heavy_check_mark: :chart_with_upwards_trend: | :construction: | Meteo sensors data (Temperature, Moisture..) 
| Epoch iteration | | Clocks (CLK) | :heavy_check_mark:| :construction: | :construction: |:construction: | Clock comparison | Epoch iteration | | Antenna (ATX) | :heavy_check_mark:| :construction: | :construction: |:construction: | Antenna calibration data | Sorted by `antex::Antenna` | diff --git a/crx2rnx/Cargo.toml b/crx2rnx/Cargo.toml index 7a9881443..f3f811aea 100644 --- a/crx2rnx/Cargo.toml +++ b/crx2rnx/Cargo.toml @@ -12,4 +12,4 @@ readme = "README.md" [dependencies] clap = { version = "4", features = ["derive", "color"] } -rinex = { path = "../rinex", version = "=0.14.0", features = ["serde"] } +rinex = { path = "../rinex", version = "=0.14.1", features = ["serde"] } diff --git a/crx2rnx/src/cli.rs b/crx2rnx/src/cli.rs index f9f4f2957..b778e482d 100644 --- a/crx2rnx/src/cli.rs +++ b/crx2rnx/src/cli.rs @@ -33,7 +33,7 @@ impl Cli { } } pub fn input_path(&self) -> &str { - &self.matches.get_one::("filepath").unwrap() + self.matches.get_one::("filepath").unwrap() } pub fn output_path(&self) -> Option<&String> { self.matches.get_one::("output") diff --git a/crx2rnx/src/main.rs b/crx2rnx/src/main.rs index 8bb721222..46c8308bf 100644 --- a/crx2rnx/src/main.rs +++ b/crx2rnx/src/main.rs @@ -13,9 +13,9 @@ fn main() -> Result<(), rinex::Error> { Some(path) => path.clone(), _ => { // deduce from input path - match input_path.strip_suffix("d") { + match input_path.strip_suffix('d') { Some(prefix) => prefix.to_owned() + "o", - _ => match input_path.strip_suffix("D") { + _ => match input_path.strip_suffix('D') { Some(prefix) => prefix.to_owned() + "O", _ => match input_path.strip_suffix("crx") { Some(prefix) => prefix.to_owned() + "rnx", diff --git a/doc/dependencies.png b/doc/dependencies.png new file mode 100644 index 000000000..4a320af4b Binary files /dev/null and b/doc/dependencies.png differ diff --git a/doc/plots/sp3_residual.png b/doc/plots/sp3_residual.png old mode 100755 new mode 100644 diff --git a/doc/plots/tec.png b/doc/plots/tec.png new file mode 100644 index 000000000..bed8c6706 Binary files /dev/null and b/doc/plots/tec.png differ diff --git a/gnss-rtk/Cargo.toml b/gnss-rtk/Cargo.toml index e0f8bfba8..e51dc6e69 100644 --- a/gnss-rtk/Cargo.toml +++ b/gnss-rtk/Cargo.toml @@ -14,6 +14,9 @@ readme = "README.md" [dependencies] log = "0.4" thiserror = "1" +nalgebra = "=0.32" nyx-space = "2.0.0-alpha.2" -rinex-qc = { path = "../rinex-qc", features = ["serde"] } -rinex = { path = "../rinex", features = ["serde", "flate2", "sbas", "obs", "nav", "qc", "processing"] } +hifitime = { version = "3.8.4", features = ["serde", "std"] } +rinex-qc = { path = "../rinex-qc", version = "=0.1.4", features = ["serde"] } +rinex = { path = "../rinex", version = "=0.14.1", features = ["serde", "flate2", "sbas", "obs", "nav", "qc", "processing"] } +serde = { version = "1.0", optional = true, default-features = false, features = ["derive"] } diff --git a/gnss-rtk/README.md b/gnss-rtk/README.md index 8e6979f68..959d7d1ef 100644 --- a/gnss-rtk/README.md +++ b/gnss-rtk/README.md @@ -1,8 +1,106 @@ GNSS-RTK ======== -Precise position solver. +[![crates.io](https://img.shields.io/crates/v/gnss-rtk.svg)](https://crates.io/crates/gnss-rtk) +[![rustc](https://img.shields.io/badge/rustc-1.64%2B-blue.svg)](https://img.shields.io/badge/rustc-1.64%2B-blue.svg) +[![crates.io](https://docs.rs/gnss-rtk/badge.svg)](https://docs.rs/gnss-rtk/badge.svg) + +RTK precise position and timing solver, in rust. The Solver can work from a RINEX context blob, defined in this crate toolsuite, but is not exclusively tied to RINEX. 
The solver implements precise positioning algorithms, which are based on raw GNSS signals.
+
+Performances
+============
+
+I'm able to resolve every single Epoch of a modern 24h data context in about 1 second, on my 8 core CPU.
+
+Solving method
+==============
+
+Only a straightforward matrix based resolution method is implemented.
+Other solutions, like a Kalman filter, exist and could potentially improve performances,
+at the expense of more complexity and possibly an initialization phase.
+
+The matrix resolution technique gives the best result it can for every single epoch:
+
+- there are no initialization iterations
+- there is no recursive behavior
+
+Behavior and Output
+===================
+
+The solver will try to resolve a position for every single existing Epoch.
+
+When working with RINEX, preprocessing operations may apply.
+If you're working with the attached "cli" application, this is done with `-P`.
+For example, if the input context is huge, a smoothing or decimation filter will reduce the computation load.
+
+The solver will output a SolverEstimate object on each resolved Epoch.
+Refer to this structure's documentation for more information.
+
+Timing DOP and Position DOP are estimated and attached to every single result.
+
+SPP
+===
+
+The solver supports the SPP strategy. It is the only strategy we can deploy
+on a single carrier context, and most likely the only one you can deploy if you're working
+with old RINEX (like GPS only V2) or single frequency RINEX data.
+
+When using SPP:
+
+- you can only hope for residual errors of a few meters
+- an interpolation order above 9 makes no sense
+- Ionospheric delay must be considered and modeled. Refer to the related section.
+
+If you're operating this library from the "cli" application integrated to this toolsuite,
+a forced `--spp` mode exists. It is a convenient way to restrict this library to SPP solving
+and compare it to PPP.
+
+PPP
+===
+
+The solver will adapt to the PPP strategy if the context is sufficient (more than one carrier).
+PPP simplifies the solving process greatly: the ionospheric delay is cancelled and does not have to be taken into account.
+
+PPP is typically deployed when you're working with modern RINEX data.
+
+We allow the possibility to deploy a PPP strategy without SP3 data. This is not a typical use case.
+Other tools like glab or rtklib probably do not allow this.
+You need to understand that in this case, you need good navigation data quality in order to reduce
+the error that their interpolation will introduce.
+
+When working with PPP, we recommend setting the interpolation order to 11 (or above).
+
+Ionospheric Delay
+=================
+
+TODO
+
+SP3 and Broadcast Ephemeris
+===========================
+
+The solver will always prefer SP3 over Broadcast ephemeris.
+That holds whatever the solving method and strategy might be.
+
+RTK from RINEX
+==============
+
+The solver can be initialized from a RINEX context, defined as `QcContext` in the RINEX library suite.
+This structure is adaptable and quite efficient. For example, it allows the combination of both
+SP3 and Broadcast Ephemeris.
+
+When initialized from RINEX, we can determine whether PPP is feasible or not.
+
+RTK Configuration
+=================
+
+The `RTKConfig` structure describes all the configuration and customization
+the solver supports.
+
+It is important to understand how, when and what to customize depending on your goals.
+
+When working with the "cli" application, you can provide an `RTKConfig`
+in the form of JSON, with `--rtk-cfg`. 
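+
+Library usage sketch
+====================
+
+If you operate this library directly from Rust rather than through the "cli" application,
+the following sketch shows the intended calling sequence. It is a minimal, hypothetical
+example (not part of the crate's test suite): it assumes you already built a `QcContext`
+from your RINEX (and possibly SP3) files, and the `resolve_all` helper is just a local
+function, not an API item.
+
+```rust
+use gnss_rtk::prelude::{Solver, SolverError, SolverEstimate};
+use rinex::prelude::Epoch;
+use rinex_qc::QcContext;
+
+// Hypothetical helper: run the solver over an entire context and collect
+// every successfully resolved (Epoch, SolverEstimate) pair.
+fn resolve_all(ctx: &mut QcContext) -> Result<Vec<(Epoch, SolverEstimate)>, SolverError> {
+    // SPP or PPP is inferred from the provided context
+    let mut solver = Solver::from(ctx)?;
+    // optionally customize the configuration before initialization
+    solver.cfg.max_sv = 8;
+    // initialization requires an a priori receiver position,
+    // either from the configuration or from the context itself
+    solver.init(ctx)?;
+    let mut results = Vec::new();
+    loop {
+        match solver.run(ctx) {
+            Ok((t, estimate)) => results.push((t, estimate)),
+            // no more epochs to work on: we are done
+            Err(SolverError::EpochDetermination(_)) => break,
+            // this epoch could not be resolved: move on to the next one
+            Err(_) => continue,
+        }
+    }
+    Ok(results)
+}
+```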
diff --git a/gnss-rtk/src/cfg.rs b/gnss-rtk/src/cfg.rs new file mode 100644 index 000000000..090552732 --- /dev/null +++ b/gnss-rtk/src/cfg.rs @@ -0,0 +1,131 @@ +use crate::model::Modeling; +use crate::SolverType; +use hifitime::prelude::TimeScale; + +use std::str::FromStr; + +#[cfg(feature = "serde")] +use serde::Deserialize; + +use rinex::prelude::GroundPosition; + +use rinex::observation::Snr; + +fn default_timescale() -> TimeScale { + TimeScale::GPST +} + +fn default_interp() -> usize { + 7 +} + +fn default_max_sv() -> usize { + 10 +} + +fn default_smoothing() -> bool { + false +} + +fn default_iono() -> bool { + false +} + +fn default_tropo() -> bool { + false +} + +#[derive(Default, Debug, Clone, PartialEq)] +#[cfg_attr(feature = "serde", derive(Deserialize))] +pub struct RTKConfig { + /// Time scale + #[cfg_attr(feature = "serde", serde(default = "default_timescale"))] + pub timescale: TimeScale, + /// positioning mode + #[cfg_attr(feature = "serde", serde(default))] + pub mode: SolverMode, + /// (Position) interpolation filter order. + /// A minimal order must be respected for correct results. + /// - 7 when working with broadcast ephemeris + /// - 11 when working with SP3 + #[cfg_attr(feature = "serde", serde(default = "default_interp"))] + pub interp_order: usize, + /// Whether the solver is working in fixed altitude mode or not + #[cfg_attr(feature = "serde", serde(default))] + pub fixed_altitude: Option, + /// Position receveir position, if known before hand + pub rcvr_position: Option, + /// PR code smoothing filter before moving forward + #[cfg_attr(feature = "serde", serde(default = "default_smoothing"))] + pub code_smoothing: bool, + /// true if we're using troposphere modeling + #[cfg_attr(feature = "serde", serde(default = "default_tropo"))] + pub tropo: bool, + /// true if we're using ionosphere modeling + #[cfg_attr(feature = "serde", serde(default = "default_iono"))] + pub iono: bool, + /// Minimal percentage ]0; 1[ of Sun light to be received by an SV + /// for not to be considered in Eclipse. + /// A value closer to 0 means we tolerate fast Eclipse exit. + /// A value closer to 1 is a stringent criteria: eclipse must be totally exited. + #[cfg_attr(feature = "serde", serde(default))] + pub min_sv_sunlight_rate: Option, + /// Minimal elevation angle. SV below that angle will not be considered. + pub min_sv_elev: Option, + /// Minimal SNR for an SV to be considered. + pub min_sv_snr: Option, + /// modeling + #[cfg_attr(feature = "serde", serde(default))] + pub modeling: Modeling, + /// Max. number of vehicules to consider. 
+ /// The more the merrier, but it also means heavier computations + #[cfg_attr(feature = "serde", serde(default = "default_max_sv"))] + pub max_sv: usize, +} + +impl RTKConfig { + pub fn default(solver: SolverType) -> Self { + match solver { + SolverType::SPP => Self { + timescale: default_timescale(), + mode: SolverMode::default(), + fixed_altitude: None, + rcvr_position: None, + interp_order: default_interp(), + code_smoothing: default_smoothing(), + tropo: default_tropo(), + iono: default_iono(), + min_sv_sunlight_rate: None, + min_sv_elev: Some(10.0), + min_sv_snr: Some(Snr::from_str("weak").unwrap()), + modeling: Modeling::default(), + max_sv: default_max_sv(), + }, + SolverType::PPP => Self { + timescale: default_timescale(), + mode: SolverMode::default(), + fixed_altitude: None, + rcvr_position: None, + interp_order: 11, + code_smoothing: default_smoothing(), + tropo: default_tropo(), + iono: default_iono(), + min_sv_sunlight_rate: Some(0.75), + min_sv_elev: Some(25.0), + min_sv_snr: Some(Snr::from_str("strong").unwrap()), + modeling: Modeling::default(), + max_sv: default_max_sv(), + }, + } + } +} + +#[derive(Default, Debug, Clone, Copy, PartialEq)] +#[cfg_attr(feature = "serde", derive(Deserialize))] +pub enum SolverMode { + /// Receiver is kept at fixed location + #[default] + Static, + /// Receiver is not static + Kinematic, +} diff --git a/gnss-rtk/src/estimate.rs b/gnss-rtk/src/estimate.rs new file mode 100644 index 000000000..4e98a7002 --- /dev/null +++ b/gnss-rtk/src/estimate.rs @@ -0,0 +1,63 @@ +use nyx_space::cosmic::SPEED_OF_LIGHT; +// use nalgebra::linalg::svd::SVD; +use nalgebra::base::{DVector, MatrixXx4}; + +#[cfg(feature = "serde")] +use serde::{Deserialize, Serialize}; + +/* + * Solver solution estimate + * is always expressed as a correction of an 'a priori' position +*/ +#[derive(Debug, Copy, Clone, Default)] +#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] +pub struct SolverEstimate { + /// X coordinates correction + pub dx: f64, + /// Y coordinates correction + pub dy: f64, + /// Z coordinates correction + pub dz: f64, + /// Time correction + pub dt: f64, + /// Dilution of Position Precision, horizontal component + pub hdop: f64, + /// Dilution of Position Precision, vertical component + pub vdop: f64, + /// Time Dilution of Precision + pub tdop: f64, +} + +impl SolverEstimate { + /* + * Builds a new SolverEstimate from `g` Nav Matrix, + * and `y` Nav Vector + */ + pub fn new(g: MatrixXx4, y: DVector) -> Option { + //let svd = g.clone().svd(true, true); + //let u = svd.u?; + //let v = svd.v_t?; + //let s = svd.singular_values; + //let s_inv = s.pseudo_inverse(1.0E-8).unwrap(); + //let x = v * u.transpose() * y * s_inv; + + let g_prime = g.clone().transpose(); + let q = (g_prime.clone() * g.clone()).try_inverse()?; + let x = q * g_prime.clone(); + let x = x * y; + + let hdop = (q[(0, 0)] + q[(1, 1)]).sqrt(); + let vdop = q[(2, 2)].sqrt(); + let tdop = q[(3, 3)].sqrt(); + + Some(Self { + dx: x[0], + dy: x[1], + dz: x[2], + dt: x[3] / SPEED_OF_LIGHT, + hdop, + vdop, + tdop, + }) + } +} diff --git a/gnss-rtk/src/lib.rs b/gnss-rtk/src/lib.rs index 1474df897..9195b3e30 100644 --- a/gnss-rtk/src/lib.rs +++ b/gnss-rtk/src/lib.rs @@ -1,35 +1,68 @@ use nyx_space::cosmic::eclipse::{eclipse_state, EclipseState}; use nyx_space::cosmic::{Orbit, SPEED_OF_LIGHT}; use nyx_space::md::prelude::{Bodies, LightTimeCalc}; -use rinex::prelude::{Duration, Epoch, Sv}; +use rinex::navigation::Ephemeris; +use rinex::prelude::{ + //Duration, + Epoch, + Sv, +}; use 
rinex_qc::QcContext; use std::collections::HashMap; +use hifitime::{Duration, TimeScale, Unit}; + extern crate nyx_space as nyx; +use nalgebra::base::{ + DVector, + MatrixXx4, + //Vector1, + //Vector3, + //Vector4, +}; use nyx::md::prelude::{Arc, Cosm}; -mod models; -mod opts; +mod cfg; +mod estimate; +mod model; pub mod prelude { - pub use crate::opts::PositioningMode; - pub use crate::opts::SolverOpts; + pub use crate::cfg::RTKConfig; + pub use crate::cfg::SolverMode; + pub use crate::estimate::SolverEstimate; + pub use crate::model::Modeling; pub use crate::Solver; - pub use crate::SolverEstimate; + pub use crate::SolverError; pub use crate::SolverType; } -use opts::SolverOpts; +use cfg::RTKConfig; +use estimate::SolverEstimate; +use model::Modeling; -use log::{debug, trace, warn}; +use log::{debug, error, trace, warn}; use thiserror::Error; -#[derive(Debug, Clone, Copy, Error)] -pub enum Error { - #[error("provided context is either unsufficient or invalid for any position solving")] +#[derive(Debug, Clone, Error)] +pub enum SolverError { + #[error("provided context is either not sufficient or invalid")] Unfeasible, + #[error("apriori position is not defined")] + UndefinedAprioriPosition, + #[error("failed to initialize solver - \"{0}\"")] + InitializationError(String), + #[error("no vehicles elected @{0}")] + NoSv(Epoch), + #[error("not enough vehicles elected @{0}")] + LessThan4Sv(Epoch), + #[error("failed to retrieve work epoch (index: {0})")] + EpochDetermination(usize), + #[error("badop: solver not initialized")] + NotInitialized, + #[error("failed to invert navigation matrix @{0}")] + SolvingError(Epoch), } #[derive(Default, Debug, Clone, Copy, PartialEq)] @@ -53,60 +86,42 @@ impl std::fmt::Display for SolverType { } impl SolverType { - fn from(ctx: &QcContext) -> Result { + fn from(ctx: &QcContext) -> Result { if ctx.primary_data().is_observation_rinex() { - if ctx.has_sp3() { - Ok(Self::PPP) - } else { - if ctx.has_navigation_data() { - Ok(Self::SPP) - } else { - Err(Error::Unfeasible) - } - } + //TODO : multi carrier for selected constellations + Ok(Self::SPP) } else { - Err(Error::Unfeasible) + Err(SolverError::Unfeasible) } } } #[derive(Debug)] pub struct Solver { - /// Cosmic model - cosmic: Arc, /// Solver parametrization - pub opts: SolverOpts, - /// Whether this solver is initiated (ready to iterate) or not - initiated: bool, + pub cfg: RTKConfig, /// Type of solver implemented pub solver: SolverType, - /// Current epoch + /// cosmic model + cosmic: Arc, + /// true if self has been initiated and is ready to compute + initiated: bool, + /// current epoch nth_epoch: usize, - /// current estimate - pub estimate: SolverEstimate, -} - -#[derive(Debug, Copy, Clone, Default)] -pub struct SolverEstimate { - /// Position estimate - pub pos: (f64, f64, f64), - /// Time offset estimate - pub clock_offset: Duration, } impl Solver { - pub fn from(context: &QcContext) -> Result { + pub fn from(context: &QcContext) -> Result { let solver = SolverType::from(context)?; Ok(Self { cosmic: Cosm::de438(), solver, initiated: false, - opts: SolverOpts::default(solver), + cfg: RTKConfig::default(solver), nth_epoch: 0, - estimate: SolverEstimate::default(), }) } - pub fn init(&mut self, ctx: &mut QcContext) { + pub fn init(&mut self, ctx: &mut QcContext) -> Result<(), SolverError> { trace!("{} solver initialization..", self.solver); //TODO: Preprocessing: // only for ppp solver @@ -123,140 +138,332 @@ impl Solver { // total_dropped, // total // ); + /* + * Solving needs a ref. 
position + */ + if self.cfg.rcvr_position.is_none() { + // defined in context ? + let position = ctx.ground_position(); + if let Some(position) = position { + self.cfg.rcvr_position = Some(position); + } else { + return Err(SolverError::UndefinedAprioriPosition); + } + } - // 2: interpolate: if need be - //if !ctx.interpolated { - // trace!("orbit interpolation.."); - // let order = self.opts.interp_order; - // ctx.orbit_interpolation(order, None); - // //TODO could be nice to have some kind of timing/perf evaluation here - // // and also total number of required interpolations - //} + /* + * print some infos on latched config + */ + if self.cfg.modeling.earth_rotation { + warn!("can't compensate for earth rotation at the moment"); + } + if self.cfg.modeling.relativistic_clock_corr { + warn!("relativistic clock corr. is not feasible at the moment"); + } + if self.solver == SolverType::PPP && self.cfg.min_sv_sunlight_rate.is_some() { + warn!("eclipse filter is not meaningful when using spp strategy"); + } - //initialization self.nth_epoch = 0; - self.estimate.pos = self.opts.rcvr_position.into(); self.initiated = true; + Ok(()) } - pub fn run(&mut self, ctx: &mut QcContext) -> Option<(Epoch, SolverEstimate)> { + pub fn run(&mut self, ctx: &mut QcContext) -> Result<(Epoch, SolverEstimate), SolverError> { if !self.initiated { - self.init(ctx); - trace!("solver initiated"); - } else { - // move on to next epoch + return Err(SolverError::NotInitialized); + } + + let pos0 = self + .cfg + .rcvr_position + .ok_or(SolverError::UndefinedAprioriPosition)?; + + let (x0, y0, z0): (f64, f64, f64) = pos0.into(); + + let modeling = self.cfg.modeling; + let interp_order = self.cfg.interp_order; + + // 0: grab work instant + let t = ctx.primary_data().epoch().nth(self.nth_epoch); + + if t.is_none() { + self.nth_epoch += 1; + return Err(SolverError::EpochDetermination(self.nth_epoch)); + } + let t = t.unwrap(); + + // 1: elect sv + let sv = Self::sv_at_epoch(ctx, t); + if sv.is_none() { + warn!("no vehicles found @ {}", t); self.nth_epoch += 1; + return Err(SolverError::NoSv(t)); } - // grab work instant - let t = ctx.primary_data().epoch().nth(self.nth_epoch)?; + let mut elected_sv: Vec = sv.unwrap().into_iter().take(self.cfg.max_sv).collect(); + + trace!("{:?}: {} candidates", t, elected_sv.len()); + + // retrieve associated PR + let pr: Vec<_> = ctx + .primary_data() + .pseudo_range_ok() + .filter_map(|(epoch, svnn, _, pr)| { + if epoch == t && elected_sv.contains(&svnn) { + Some((svnn, pr)) + } else { + None + } + }) + .collect(); - let interp_order = self.opts.interp_order; + // apply first set of filters : on OBSERVATION + // - no pseudo range: nothing is feasible + // - if we're in ppp mode: must be compliant + // - if an SNR mask is defined: SNR must be good enough + elected_sv.retain(|sv| { + let has_pr = pr + .iter() + .filter_map(|(svnn, pr)| if svnn == sv { Some(pr) } else { None }) + .reduce(|pr, _| pr) + .is_some(); - /* elect vehicles */ - let elected_sv = Self::sv_election(ctx, t); - if elected_sv.is_none() { - warn!("no vehicles elected @ {}", t); - return Some((t, self.estimate)); + let mut ppp_ok = !(self.solver == SolverType::PPP); + if self.solver == SolverType::PPP { + //TODO: verify PPP compliancy + } + + let mut snr_ok = self.cfg.min_sv_snr.is_none(); + if let Some(min_snr) = self.cfg.min_sv_snr { + let snr = ctx + .primary_data() + .snr() + .filter_map(|((epoch, _), svnn, _, snr)| { + if epoch == t && svnn == *sv { + Some(snr) + } else { + None + } + }) + .reduce(|snr, _| snr); + if 
let Some(snr) = snr { + snr_ok = snr >= min_snr; + } + } + + if !has_pr { + trace!("{:?}: {} no pseudo range", t, sv); + } + if !ppp_ok { + trace!("{:?}: {} not ppp compliant", t, sv); + } + if !snr_ok { + trace!("{:?}: {} snr below criteria", t, sv); + } + + has_pr && snr_ok & ppp_ok + }); + + // make sure we still have enough SV + if elected_sv.len() < 4 { + debug!("{:?}: not enough vehicles elected", t); + self.nth_epoch += 1; + return Err(SolverError::LessThan4Sv(t)); } - let mut elected_sv = elected_sv.unwrap(); - debug!("elected sv : {:?}", elected_sv); + debug!("{:?}: {} elected sv", t, elected_sv.len()); - /* determine sv positions */ - /* TODO: SP3 APC corrections: Self::eval_sun_vector3d */ + let mut sv_data: HashMap = HashMap::new(); - let mut sv_pos: HashMap = HashMap::new(); + // 3: sv position evaluation for sv in &elected_sv { - if let Some(sp3) = ctx.sp3_data() { - if let Some((x_km, y_km, z_km)) = sp3.sv_position_interpolate(*sv, t, interp_order) - { - sv_pos.insert(*sv, (x_km, y_km, z_km)); - } else if let Some(nav) = ctx.navigation_data() { - if let Some((x_km, y_km, z_km)) = - nav.sv_position_interpolate(*sv, t, interp_order) - { - sv_pos.insert(*sv, (x_km, y_km, z_km)); - } - } - } else { - if let Some(nav) = ctx.navigation_data() { - if let Some((x_km, y_km, z_km)) = - nav.sv_position_interpolate(*sv, t, interp_order) - { - sv_pos.insert(*sv, (x_km, y_km, z_km)); + // retrieve pr for this SV @ t + let pr = pr + .iter() + .filter_map(|(svnn, pr)| if svnn == sv { Some(*pr) } else { None }) + .reduce(|pr, _| pr) + .unwrap(); // can't fail at this point + + let ts = sv.timescale().unwrap(); // can't fail at this point ? + + let nav = ctx.navigation_data().unwrap(); // can't fail at this point ? + + let ephemeris = nav.sv_ephemeris(*sv, t); + if ephemeris.is_none() { + error!("{:?} : {} no valid ephemeris", t, sv); + continue; + } + + let (toe, eph) = ephemeris.unwrap(); + let clock_bias = eph.sv_clock(); + let (t_tx, dt_sat) = + Self::sv_transmission_time(t, *sv, toe, pr, eph, modeling, clock_bias, ts); + + if modeling.earth_rotation { + //TODO + // dt = || rsat - rcvr0 || /c + // rsat = R3 * we * dt * rsat + // we = 7.2921151467 E-5 + } + + if modeling.relativistic_clock_corr { + //TODO + let e = 1.204112719279E-2; + let sqrt_a = 5.153704689026E3; + let sqrt_mu = (3986004.418E8_f64).sqrt(); + //let dt = -2.0_f64 * sqrt_a * sqrt_mu / SPEED_OF_LIGHT / SPEED_OF_LIGHT * e * elev.sin(); + } + + // interpolate + let pos: Option<(f64, f64, f64)> = match ctx.sp3_data() { + Some(sp3) => { + /* + * SP3 always prefered + */ + let pos = sp3.sv_position_interpolate(*sv, t_tx, interp_order); + if let Some(pos) = pos { + Some(pos) + } else { + /* try to fall back to ephemeris nav */ + nav.sv_position_interpolate(*sv, t_tx, interp_order) } + }, + _ => nav.sv_position_interpolate(*sv, t_tx, interp_order), + }; + + if pos.is_none() { + trace!("{:?} : {} interpolation failed", t, sv); + continue; + } + + let (x_km, y_km, z_km) = pos.unwrap(); + + // Elevation filter + if let Some(min_elev) = self.cfg.min_sv_elev { + let (e, _) = Ephemeris::elevation_azimuth( + (x_km * 1.0E3, y_km * 1.0E3, z_km * 1.0E3), + pos0.into(), + ); + if e < min_elev { + trace!("{:?} : {} elev below mask", t, sv); + continue; } } - } - /* remove sv in eclipse */ - if let Some(min_rate) = self.opts.min_sv_sunlight_rate { - sv_pos.retain(|sv, (x_km, y_km, z_km)| { - let state = self.eclipse_state(*x_km, *y_km, *z_km, t); + // Eclipse filter + if let Some(min_rate) = self.cfg.min_sv_sunlight_rate { + let state = 
self.eclipse_state(x_km, y_km, z_km, t_tx); let eclipsed = match state { EclipseState::Umbra => true, EclipseState::Visibilis => false, - EclipseState::Penumbra(r) => { - debug!("{} state: {}", sv, state); - r < min_rate - }, + EclipseState::Penumbra(r) => r < min_rate, }; - if eclipsed { - debug!("dropping eclipsed {}", sv); + debug!("{:?} : dropping eclipsed {}", t, sv); + } else { + sv_data.insert(*sv, (x_km * 1.0E3, y_km * 1.0E3, z_km * 1.0E3, pr, dt_sat)); } - !eclipsed - }); + } else { + sv_data.insert(*sv, (x_km * 1.0E3, y_km * 1.0E3, z_km * 1.0E3, pr, dt_sat)); + } } - // 3: t_tx - let mut t_tx: HashMap = HashMap::new(); - for sv in &elected_sv { - if let Some(sv_t_tx) = Self::sv_transmission_time(ctx, *sv, t) { - t_tx.insert(*sv, sv_t_tx); - } + // 6: form matrix + let mut y = DVector::::zeros(elected_sv.len()); + let mut g = MatrixXx4::::zeros(elected_sv.len()); + + if sv_data.iter().count() < 4 { + error!("{:?} : not enough sv to resolve", t); + self.nth_epoch += 1; + return Err(SolverError::LessThan4Sv(t)); } - //TODO - // add other models + for (index, (sv, data)) in sv_data.iter().enumerate() { + let pr = data.3; + let dt_sat = data.4.to_seconds(); + let (sv_x, sv_y, sv_z) = (data.0, data.1, data.2); + + let rho = ((sv_x - x0).powi(2) + (sv_y - y0).powi(2) + (sv_z - z0).powi(2)).sqrt(); + + //TODO + let mut models = -SPEED_OF_LIGHT * dt_sat; + //let models = models + // .iter() + // .filter_map(|sv, model| { + // if sv == svnn { + // Some(model) + // } else { + + // } + // }) + // .reduce(|m, _| m) + // .unwrap(); + + y[index] = pr - rho - models; - // form matrix - // resolve - Some((t, self.estimate)) + g[(index, 0)] = (x0 - sv_x) / rho; + g[(index, 1)] = (y0 - sv_y) / rho; + g[(index, 2)] = (z0 - sv_z) / rho; + g[(index, 3)] = 1.0_f64; + } + + // 7: resolve + //trace!("y: {} | g: {}", y, g); + let estimate = SolverEstimate::new(g, y); + self.nth_epoch += 1; + + if estimate.is_none() { + return Err(SolverError::SolvingError(t)); + } else { + Ok((t, estimate.unwrap())) + } } /* - * Evalutes T_tx transmission time, for given Sv at desired 't' + * Evalutes Sv position */ - fn sv_transmission_time(ctx: &QcContext, sv: Sv, t: Epoch) -> Option { - let nav = ctx.navigation_data()?; - // need one pseudo range observation for this SV @ 't' - let mut pr = ctx - .primary_data() - .pseudo_range() - .filter_map(|((e, flag), svnn, _, p)| { - if e == t && flag.is_ok() && svnn == sv { - Some(p) - } else { - None - } - }) - .take(1); - if let Some(pr) = pr.next() { - let t_tx = Duration::from_seconds(t.to_duration().to_seconds() - pr / SPEED_OF_LIGHT); - debug!("t_tx(pr): {}@{} : {}", sv, t, t_tx); + fn sv_transmission_time( + t: Epoch, + sv: Sv, + toe: Epoch, + pr: f64, + eph: &Ephemeris, + m: Modeling, + clock_bias: (f64, f64, f64), + ts: TimeScale, + ) -> (Epoch, Duration) { + let seconds_ts = t.to_duration().to_seconds(); - let mut e_tx = Epoch::from_duration(t_tx, sv.constellation.timescale()?); - let dt_sat = nav.sv_clock_bias(sv, e_tx)?; - debug!("clock bias: {}@{} : {}", sv, t, dt_sat); + let dt_tx = seconds_ts - pr / SPEED_OF_LIGHT; + let mut e_tx = Epoch::from_duration(dt_tx * Unit::Second, t.time_scale); + let mut dt_sat = Duration::default(); + if m.sv_clock_bias { + dt_sat = Ephemeris::sv_clock_corr(sv, clock_bias, t, toe); + debug!("{:?}: {} dt_sat {}", t, sv, dt_sat); e_tx -= dt_sat; - debug!("{} : t(obs): {} | t(tx) {}", sv, t, e_tx); + } - Some(e_tx) - } else { - debug!("missing PR measurement"); - None + if m.sv_total_group_delay { + if let Some(tgd) = eph.tgd() { + let 
tgd = tgd * Unit::Second; + debug!("{:?}: {} tgd {}", t, sv, tgd); + e_tx -= tgd; + } } + + debug!("{:?}: {} t_tx {:?}", t, sv, e_tx); + + /* + * physical verification on result + */ + let dt = (t - e_tx).to_seconds(); + assert!(dt > 0.0, "t_tx can't physically be after t_rx..!"); + assert!( + dt < 1.0, + "|t - t_tx| < 1s is physically impossible (signal propagation..)" + ); + + (e_tx, dt_sat) } /* * Evaluates Sun/Earth vector, expressed in Km @@ -291,9 +498,9 @@ impl Solver { eclipse_state(&sv_orbit, sun_frame, earth_frame, &self.cosmic) } /* - * Elects sv for this epoch + * Returns all Sv at "t" */ - fn sv_election(ctx: &QcContext, t: Epoch) -> Option> { + fn sv_at_epoch(ctx: &QcContext, t: Epoch) -> Option> { ctx.primary_data() .sv_epoch() .filter_map(|(epoch, svs)| if epoch == t { Some(svs) } else { None }) diff --git a/gnss-rtk/src/model.rs b/gnss-rtk/src/model.rs new file mode 100644 index 000000000..f77344b8d --- /dev/null +++ b/gnss-rtk/src/model.rs @@ -0,0 +1,58 @@ +use crate::SolverType; + +#[cfg(feature = "serde")] +use serde::{Deserialize, Serialize}; + +fn default_sv_clock() -> bool { + true +} + +fn default_sv_tgd() -> bool { + true +} + +fn default_earth_rot() -> bool { + false +} + +fn default_rel_clock_corr() -> bool { + false +} + +#[derive(Copy, Clone, Debug, PartialEq)] +#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] +pub struct Modeling { + #[cfg_attr(feature = "serde", serde(default = "default_sv_clock"))] + pub sv_clock_bias: bool, + #[cfg_attr(feature = "serde", serde(default = "default_sv_tgd"))] + pub sv_total_group_delay: bool, + #[cfg_attr(feature = "serde", serde(default = "default_earth_rot"))] + pub earth_rotation: bool, + #[cfg_attr(feature = "serde", serde(default = "default_rel_clock_corr"))] + pub relativistic_clock_corr: bool, +} + +impl Default for Modeling { + fn default() -> Self { + Self { + sv_clock_bias: default_sv_clock(), + sv_total_group_delay: default_sv_tgd(), + earth_rotation: default_earth_rot(), + relativistic_clock_corr: default_rel_clock_corr(), + } + } +} + +impl From for Modeling { + fn from(solver: SolverType) -> Self { + let mut s = Self::default(); + match solver { + SolverType::PPP => { + s.earth_rotation = true; + s.relativistic_clock_corr = false; + }, + _ => {}, + } + s + } +} diff --git a/gnss-rtk/src/opts/mod.rs b/gnss-rtk/src/opts/mod.rs deleted file mode 100644 index 35a355065..000000000 --- a/gnss-rtk/src/opts/mod.rs +++ /dev/null @@ -1,107 +0,0 @@ -use crate::SolverType; -use rinex::prelude::{Constellation, GroundPosition}; - -#[derive(Default, Debug, Clone, PartialEq)] -pub struct SolverOpts { - /// Criteria (for convergence) - pub epsilon: f64, - /// (Position) interpolation filter order. - /// A minimal order must be respected for correct results. 
- /// - 7 when working with broadcast ephemeris - /// - 11 when working with SP3 - pub interp_order: usize, - /// positioning mode - pub positioning: PositioningMode, - /// Whether the solver is working in fixed altitude mode or not - pub fixed_altitude: Option, - /// Position receveir position, if known before hand - pub rcvr_position: GroundPosition, - /// constellation to consider, - pub gnss: Vec, - /// PR code smoothing filter before moving forward - pub code_smoothing: bool, - /// true if we're using troposphere modeling - pub tropo: bool, - /// true if we're using ionosphere modeling - pub iono: bool, - /// true if we're using total group delay modeling - pub tgd: bool, - /// Minimal percentage ]0; 1[ of Sun light to be received by an SV - /// for not to be considered in Eclipse. - /// A value closer to 0 means we tolerate fast Eclipse exit. - /// A value closer to 1 is a stringent criteria: eclipse must be totally exited. - pub min_sv_sunlight_rate: Option, -} - -impl SolverOpts { - pub fn default(solver: SolverType) -> Self { - match solver { - SolverType::SPP => Self { - epsilon: 5.0_f64, - gnss: vec![Constellation::GPS, Constellation::Galileo], - fixed_altitude: None, - rcvr_position: GroundPosition::default(), - interp_order: 7, - positioning: PositioningMode::default(), - code_smoothing: false, - tropo: false, - iono: false, - tgd: false, - min_sv_sunlight_rate: None, - }, - SolverType::PPP => Self { - epsilon: 0.1_f64, - gnss: vec![Constellation::GPS, Constellation::Galileo], - fixed_altitude: None, - rcvr_position: GroundPosition::default(), - interp_order: 11, - positioning: PositioningMode::default(), - code_smoothing: false, - tropo: false, - iono: false, - tgd: false, - min_sv_sunlight_rate: Some(0.3), - }, - } - } -} - -#[derive(Default, Debug, Clone, Copy, PartialEq)] -pub enum PositioningMode { - /// Receiver is kept at fixed location - #[default] - Static, - /// Receiver is not static - Kinematic, -} - -#[derive(Debug, Clone, Copy, PartialEq)] -#[allow(dead_code)] -pub enum SpecificOpts { - /// SPP solver specific parameters - SPPSpecificOpts(SppOpts), - /// PPP solver specific parameters - PPPSpecificOpts(PppOpts), -} - -#[allow(dead_code)] -impl SpecificOpts { - fn spp(&self) -> Option { - match self { - Self::SPPSpecificOpts(opts) => Some(*opts), - _ => None, - } - } - fn ppp(&self) -> Option { - match self { - Self::PPPSpecificOpts(opts) => Some(*opts), - _ => None, - } - } -} - -#[derive(Default, Debug, Clone, Copy, PartialEq)] -pub struct SppOpts {} - -#[derive(Default, Debug, Clone, Copy, PartialEq)] -pub struct PppOpts {} diff --git a/gnss-rtk/src/solver.rs b/gnss-rtk/src/solver.rs deleted file mode 100644 index e69de29bb..000000000 diff --git a/rinex-cli/Cargo.toml b/rinex-cli/Cargo.toml index c5bfc0d68..ee6d703d4 100644 --- a/rinex-cli/Cargo.toml +++ b/rinex-cli/Cargo.toml @@ -1,6 +1,6 @@ [package] name = "rinex-cli" -version = "0.9.3" +version = "0.9.4" license = "MIT OR Apache-2.0" authors = ["Guillaume W. 
Bres "] description = "Command line tool parse and analyze RINEX data" @@ -14,19 +14,23 @@ rust-version = "1.64" [dependencies] log = "0.4" -pretty_env_logger = "0.5" +env_logger = "0.10" clap = { version = "4.4.3", features = ["derive", "color"] } rand = "0.8" serde_json = "1" -sp3 = { path = "../sp3", version = "=1.0.4", features = ["serde", "flate2"] } +sp3 = { path = "../sp3", version = "=1.0.5", features = ["serde", "flate2"] } rinex-qc = { path = "../rinex-qc", version = "=0.1.4", features = ["serde"] } -rinex = { path = "../rinex", version = "=0.14.0", features = ["full"] } -gnss-rtk = { path = "../gnss-rtk", version = "=0.0.1" } +rinex = { path = "../rinex", version = "=0.14.1", features = ["full"] } +gnss-rtk = { path = "../gnss-rtk", version = "=0.0.1", features = ["serde"] } thiserror = "1" itertools = "0.11" -plotly = "0.8.4" +# plotly = "0.8.4" +plotly = { git = "https://github.com/gwbres/plotly", branch = "density-mapbox" } +# plotly = { path = "../../plotly-rs/plotly" } map_3d = "0.1.5" ndarray = "0.15" colorous = "1.0" horrorshow = "0.8" nyx-space = "2.0.0-alpha.2" +hifitime = { version = "3.8.4", features = ["serde", "std"] } +serde = { version = "1.0", default-features = false, features = ["derive"] } diff --git a/rinex-cli/README.md b/rinex-cli/README.md index 3879d12c8..1c27ac975 100644 --- a/rinex-cli/README.md +++ b/rinex-cli/README.md @@ -108,12 +108,13 @@ if you know how to operate the preprocessing toolkit - [quality check](doc/qc.md): RINEX data quality analysis (mainly statistics and only on OBS RINEX at the moment) - other advanced operations are documented in the [processing](doc/processing.md) suite -## Positioning +## Positioning (RTK) -`rinex-cli` integrates a position solver that will resolve the user location -the best it can, from the provided RINEX context. This mode in requested with `-p`. +`rinex-cli` integrates a position solver that will resolve the radio receiver location +the best it can, by post processing the provided RINEX context. +This mode in requested with `-r` or `--rtk` and is turned off by default. -To learn how to operate the solver, refer to [the dedicated page](doc/positioning.md). +To learn how to operate the solver, refer to [the dedicated page](doc/rtk.md). 
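+
+Under the hood, the `--rtk-cfg` configuration files shipped in rinex-cli/config/rtk
+deserialize into the `RTKConfig` structure of the [gnss-rtk](../gnss-rtk/) library.
+The snippet below is a rough sketch of what the application does with that file; the
+`load_rtk_config` helper is hypothetical and assumes the gnss-rtk `serde` feature is
+enabled (as this application does):
+
+```rust
+use gnss_rtk::prelude::RTKConfig;
+
+// Hypothetical helper mirroring what `--rtk-cfg` does internally:
+// read a JSON file and deserialize it into the solver configuration.
+fn load_rtk_config(path: &str) -> Option<RTKConfig> {
+    let content = std::fs::read_to_string(path).ok()?;
+    serde_json::from_str(&content).ok()
+}
+
+fn main() {
+    if let Some(cfg) = load_rtk_config("rinex-cli/config/rtk/gpst_4sv_basic.json") {
+        println!("{:#?}", cfg);
+    }
+}
+```
+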
 ## Getting started
 
@@ -236,10 +237,10 @@ rinex-cli -f OBS/V2/KOSG0010.95O --epochs
 rinex-cli -f test_resources/OBS/V2/KOSG0010.95O --epochs --sv
 ```
 
-The `--pretty` option is there to make the datasets more readable (json format):
+The `--pretty` (`-p`) option is there to make the datasets more readable (json format):
 
 ```bash
-rinex-cli -f test_resources/OBS/V2/KOSG0010.95O --epochs --sv --pretty
+rinex-cli -f test_resources/OBS/V2/KOSG0010.95O -g --epochs --sv -p
 ```
 
 ## Data analysis
diff --git a/rinex-cli/config/gnss_snr30db.json b/rinex-cli/config/qc/gnss_snr30db.json
similarity index 100%
rename from rinex-cli/config/gnss_snr30db.json
rename to rinex-cli/config/qc/gnss_snr30db.json
diff --git a/rinex-cli/config/sv_manual_gap.json b/rinex-cli/config/qc/sv_manual_gap.json
similarity index 100%
rename from rinex-cli/config/sv_manual_gap.json
rename to rinex-cli/config/qc/sv_manual_gap.json
diff --git a/rinex-cli/config/rtk/gpst_10sv_basic.json b/rinex-cli/config/rtk/gpst_10sv_basic.json
new file mode 100644
index 000000000..0a219e729
--- /dev/null
+++ b/rinex-cli/config/rtk/gpst_10sv_basic.json
@@ -0,0 +1,9 @@
+{
+  "timescale": "GPST",
+  "interp_order": 11,
+  "max_sv": 10,
+  "modeling": {
+    "sv_clock_bias": true,
+    "sv_total_group_delay": true
+  }
+}
diff --git a/rinex-cli/config/rtk/gpst_4sv_basic.json b/rinex-cli/config/rtk/gpst_4sv_basic.json
new file mode 100644
index 000000000..15f2165b0
--- /dev/null
+++ b/rinex-cli/config/rtk/gpst_4sv_basic.json
@@ -0,0 +1,9 @@
+{
+  "timescale": "GPST",
+  "interp_order": 11,
+  "max_sv": 4,
+  "modeling": {
+    "sv_clock_bias": true,
+    "sv_total_group_delay": true
+  }
+}
diff --git a/rinex-cli/doc/file-combination.md b/rinex-cli/doc/file-combination.md
index 3e9cf5eaa..de3325969 100644
--- a/rinex-cli/doc/file-combination.md
+++ b/rinex-cli/doc/file-combination.md
@@ -100,10 +100,25 @@ joint `--nav` and `--sp3` context yourself.
 
 ## IONEX analysis
 
-To analyze a IONEX file, a primary file of this type should be passed
-to `--fp` (or `-f`). In this case, you get a world map visualization
-of the provided TEC map. Unfortunately we can only visualize the TEC map
-at a single epoch, because we cannot animate the world map at the moment.
-Therefore, it makes sense to zoom in on the Epoch you're interested in,
-with the proper `-P` preprocessor command. Refer to related section.
+IONEX is one of the formats that can only serve as primary files.
+Therefore all IONEX files should be passed with `--fp` (`-f`).
+We can then plot the TEC map. Unfortunately we have no means to animate the plot
+at the moment, so we create a TEC visualization for every single Epoch.
+IONEX files usually comprise 12 to 24 Epochs, so that is not that many, but the HTML
+graphs might get heavy.
+
+We recommend zooming in on the time frame you're interested in, for example with something like this:
+
+```bash
+./target/release/rinex-cli \
+    -f CKMG0090.21I.gz --epochs
+
+["2021-01-09T00:00:00 UTC","2021-01-09T01:00:00 UTC", ..., "2021-01-10T00:00:00 UTC"]
+
+./target/release/rinex-cli \
+    -f CKMG0090.21I.gz \
+    -P ">=2021-01-09T19:00:00 UTC"
+```
+
+
diff --git a/rinex-cli/doc/positioning.md b/rinex-cli/doc/positioning.md
deleted file mode 100644
index 94f241f22..000000000
--- a/rinex-cli/doc/positioning.md
+++ /dev/null
@@ -1,68 +0,0 @@
-Position solver
-===============
-
-The position solver is currently an "advanced" SPP solver. 
-SPP stands for Single Frequency Precice Point solver which means -you get a precise point location (ideally with metric accuracy) for a minimal -- down to single frequency data context. - -When we say "advanced" SPP it means it supports more than the minimal prerequisites -for a pure SPP solver. For example it is possible to still require SPP solving -but use other criteria that makes it a little closer to PPP. - -Command line interface -====================== - -* use `-p` to request position solving. -From the provided data context, we will try to evaluate the user position -the best we can - -* use `--spp` to force to SPP solving. - -* `--ppp` to force to PPP solving. It exists but not entirely supported to this day. - -Minimal data context -==================== - -A minimum of one primary RINEX Observation file with broadcast Ephemeris -valid for that particular time frame is required. - -SP3 can be stacked, broadcast Ephemeris are still required, we will prefer SP3 -for certain steps in the solving process. - -Example of minimum requirement : - -```bash -./target/release/rinex-cli -P GPS,GLO --spp \ - --fp DATA/2023/OBS/256/ANK200TUR_S_20232560000_01D_30S_MO.crx \ - --nav DATA/2023/NAV/255 \ - --nav DATA/2023/NAV/256 -``` - -Example of SP3 extension : - -```bash -./target/release/rinex-cli -P GPS,GLO --spp \ - --fp DATA/2023/OBS/256/ANK200TUR_S_20232560000_01D_30S_MO.crx \ - --nav DATA/2023/NAV/255 \ - --nav DATA/2023/NAV/256 \ - --sp3 DATA/2023/SP3/255 \ - --sp3 DATA/2023/SP3/256 -``` - -Position solver and results -=========================== - -The solver will try to resolve the navigation equations for every single Epoch -for which : - -* enough raw GNSS signals were observed in the Observation RINEX -* enough SV fit the Navigation requirements -* all minimal or requested models were correctly modelized - -The solver can totally work with its default configuration, as long as the previous points stand. -But you need to understand that in this configuration, you can't hope for an optimal result accuracy. - -Mastering and operating a position solver is a complex task. -To fully understand what can be achieved and how to achieve such results, -refer to the [gnss-rtk](../gnss-rtk/README.md) library documentation. diff --git a/rinex-cli/doc/preprocessing.md b/rinex-cli/doc/preprocessing.md index 03986f4c8..cecc46fe0 100644 --- a/rinex-cli/doc/preprocessing.md +++ b/rinex-cli/doc/preprocessing.md @@ -72,18 +72,20 @@ advanced mask filters. ## Stacked preprocessing ops -A whitespace separates two preprocessing operations. +A whitespace separates two preprocessing operations (ie., two sets of CSV). +Therefore it is considered as two separate filters. For example here, we're only left with +G08 and R03 data. ```bash rinex-cli \ --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz \ - -P GPS,GLO G08,R03 + -P GPS,GLO,BDS G08,R03 ``` -Therefore, if a filter operation involves a whitespace, it requires to be wrapped +If one opeation requies a whitespace, it needs to be wrapped in between inverted commas. Most common example is the [Epoch](epoch-target) description. -## Epoch target +## Epoch filter Any valid Hifitime::Epoch string description is supported. @@ -105,11 +107,7 @@ rinex-cli \ -P ">2020-06-12T08:00:00 UTC" "<=2020-06-25T16:00:00 UTC" GPS >G08 ``` -## Duration target - -TODO - -## Sv target +## SV filter A comma separated list of Sv (of any length) is supported. 
For example, retain _R03_ and _E10_ with the following: @@ -121,8 +119,8 @@ rinex-cli \ ``` `Sv` target is the only one amongst CSV arrays that supports more than "=" or "!=" operands. -For example we can select PRN above 08 for GPS and below 10 for Galileo constellations (only, others are untouched) -with this command: +This is used to filter on SV PRN. +For example here we can select PRN above 08 for GPS and below (included) 10 for Galileo: ```bash rinex-cli \ @@ -130,25 +128,66 @@ rinex-cli \ -P >G08 "<=E10" ``` -## Constellations +## Constellations filter + +Retain specific constellations. For example we only retain GPS with this: + +```bash +rinex-cli \ + --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz \ + -P GPS +``` -A comma separated list of Constellations is supported. -For example, with the following, we are left with data from Glonass and GPS +You can stack as many filters as you want, using csv. For example, retain +BeiDou also: ```bash rinex-cli \ --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz \ - -P !=BDS GPS,GLO # ineq(BDS) AND eq(GPS,GLO) + -P GPS,BDS ``` -`teqc` like quick GNSS filters also exist: +Inequality is also supported. For example: retain everything but Glonass -- `-G` to remove GPS -- `-C` to remove BDS -- `-E` to remove Galileo -- `-R` to remove Glonnass -- `-J` to remove QZSS -- `-S` to remove SBAS vehicles +```bash +rinex-cli \ + --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz \ + -P !=GLO +``` + +SBAS is a special case. If you use "SBAS", you can retain or discard +SBAS systems, whatever their actual constellation. For example we +retain all GPS and any SBAS with this: + +```bash +rinex-cli \ + --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz -P GPS,SBAS +``` + +If you want to retain specific SBAS, you have to name them precisely, we support all of them +(see Constellation module API). For example, retain GPS, EGNOS and SDCM with this: + +```bash +rinex-cli \ + --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz -P GPS,EGNOS,SDCM +``` + +Note that the following `teqc` equivalent filters are also supported. + +- `-G` removes GPS (equivalent to `-P !=GPS`) +- `-C` removes BDS +- `-E` removes Galileo +- `-R` removes Glonnass +- `-J` removes QZSS +- `-S` removes all SBAS vehicles + +If you want to remove specific SBAS constellations, for example EGNOS, you have to use +`-P`: + +```bash +rinex-cli \ + --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz -P !=EGNOS +``` ## Observables diff --git a/rinex-cli/doc/qc.md b/rinex-cli/doc/qc.md index dd560ca9f..52da524ec 100644 --- a/rinex-cli/doc/qc.md +++ b/rinex-cli/doc/qc.md @@ -1,7 +1,7 @@ Quality Check (QC) ================== -RINEX quality check is a special mode, activated with `--qc`. +RINEX quality check is a special mode. It is activated with `--qc` and is turned off by default. QC is first developed for Observation files analysis, but this tool will accept other RINEX files, for which it will compute basic statistical analysis. 
@@ -85,7 +85,7 @@ Run this configuration for the most basic QC:
 rinex-cli \
   -P GPS,GLO \
   --qc-only \
-  --qc-cfg rinex-cli/config/gnss_snr30db.json \
+  --qc-cfg rinex-cli/config/qc/gnss_snr30db.json \
   --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz
 ```
 
@@ -134,7 +134,7 @@ To this one :
 ./target/release/rinex-cli \
   --fp test_resources/CRNX/V3/MOJN00DNK_R_20201770000_01D_30S_MO.crx.gz \
   -P G08,G15,G16,R23,C19,C09 \
-  --qc --qc-cfg rinex-cli/config/sv_manual_gap.json
+  --qc --qc-cfg rinex-cli/config/qc/sv_manual_gap.json
 ```
 
 ### SNR parametrization
@@ -157,7 +157,7 @@ the go out of sight more rapidly, due to the stringent elevation criteria :
 rinex-cli \
   -P gps,glo \
   -P G08,G15,G16,R23,C19,C09 \
-  --qc --qc-cfg rinex-cli/config/sv_manual_gap_ev35.json
+  --qc --qc-cfg rinex-cli/config/qc/sv_manual_gap_ev35.json
   --fp test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz \
   --nav test_resources/OBS/V3/ESBC00DNK_R_20201770000_01D_MN.rnx.gz
 ```
diff --git a/rinex-cli/doc/rtk.md b/rinex-cli/doc/rtk.md
new file mode 100644
index 000000000..b48033ec1
--- /dev/null
+++ b/rinex-cli/doc/rtk.md
@@ -0,0 +1,128 @@
+RTK solver
+==========
+
+RTK mode is requested with `-r` or `--rtk`.
+
+RTK (position solving) is feasible if you provide at least RINEX Observations
+(`-f`) and overlapping RINEX Navigation data (`--nav`).
+
+Currently it is also mandatory to provide overlapping SP3 with `--sp3`, but that should be fixed
+in the near future.
+
+As an example (this dataset is not provided), the most basic command line would look like this,
+where observations are imported for day 256 and we combine several NAV/SP3 files by lazily importing entire folders:
+
+```bash
+./target/release/rinex-cli -P GPS,GLO -r \
+    --fp DATA/2023/OBS/256/ANK200TUR_S_20232560000_01D_30S_MO.crx \
+    --nav DATA/2023/NAV/256 \
+    --sp3 DATA/2023/SP3/256
+```
+
+Current limitations
+===================
+
+Several limitations exist to this day and must be kept in mind.
+
+- Glonass and SBAS vehicles cannot be pushed into the pool of eligible vehicles.
+Until further notice, one must combine `-R` and `-S` with the rtk mode.
+
+- We've only tested the solver against mixed GPS, Galileo and BeiDou vehicles.
+
+- We only support GPST, GST and BDT. QZSST is expressed as GPST and I'm not 100% sure this
+is correct.
+
+- The estimated clock offset is expressed against the timescale the Observation file is referenced to.
+We don't have the flexibility to change that at the moment.
+So far the solver has only been tested against Observations referenced to GPST.
+
+RTK (only)
+==========
+
+Use `-r` (or `--rtk-only`) to disable other opmodes. This gives you the quickest results.
+
+```bash
+./target/release/rinex-cli -R -S -r \
+    --fp DATA/2023/OBS/256/ANK200TUR_S_20232560000_01D_30S_MO.crx \
+    --nav DATA/2023/NAV/256 \
+    --sp3 DATA/2023/SP3/256
+```
+
+RTK configuration
+=================
+
+The solver can be customized, either to improve performances
+or to improve the final resolution. Refer to the library section
+that defines the [RTK configuration](https://github.com/georust/rinex/gnss-rtk/doc/cfg.md)
+to understand the physics and what they imply for the end result.
+
+A few configuration files are provided in the rinex-cli/config/rtk directory.
+
+You can use them with `--rtk-cfg`.
+
+Forced SPP mode
+===============
+
+By default the solver will adapt to the provided context and will deploy the best strategy. 
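+
+For reference, the built-in defaults already differ between the two strategies.
+The short Rust snippet below is purely illustrative (values taken from the library's
+`RTKConfig::default(SolverType)` constructor) and shows how the library adapts its defaults:
+
+```rust
+use gnss_rtk::prelude::{RTKConfig, SolverType};
+
+fn main() {
+    let spp = RTKConfig::default(SolverType::SPP);
+    let ppp = RTKConfig::default(SolverType::PPP);
+    // SPP defaults to a lighter setup: interpolation order 7, 10 deg elevation mask.
+    assert_eq!(spp.interp_order, 7);
+    // PPP raises the requirements: order 11, 25 deg mask, stronger SNR criteria,
+    // and the eclipse filter (min_sv_sunlight_rate) is activated.
+    assert_eq!(ppp.interp_order, 11);
+    println!("SPP defaults: {:#?}", spp);
+    println!("PPP defaults: {:#?}", ppp);
+}
+```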
+ +You can force the strategy to SPP with `--spp` + +It is possible to use the configuration file, even in forced SPP mode, to improve the end results: + +In this scenario, one wants to define Ionospheric delay model + +Provide SP3 +=========== + +When SP3 is provided, they are prefered over NAV RINEX. +Refer to the library documentation [TODO](TODO) + +Example of SP3 extension : + +```bash +./target/release/rinex-cli -R -S --spp \ + --fp DATA/2023/OBS/256/ANK200TUR_S_20232560000_01D_30S_MO.crx \ + --nav DATA/2023/NAV/255 \ + --nav DATA/2023/NAV/256 \ + --sp3 DATA/2023/SP3/255 \ + --sp3 DATA/2023/SP3/256 +``` + +It is totally possible to combine SP3 to a single frequency context, +or a forced --spp strategy. + +Results +======= + +The solver will try to resolve the navigation equations for every single Epoch +for which : + +* enough raw GNSS signals were observed in the Observation RINEX +* enough SV fit the Navigation requirements +* all minimal or requested models were correctly modelized + +The solver can totally work with its default configuration, as long as the previous points stand. +But you need to understand that in this configuration, you can't hope for an optimal result accuracy. + +Mastering and operating a position solver is a complex task. +To fully understand what can be achieved and how to achieve such results, +refer to the [gnss-rtk](../gnss-rtk/README.md) library documentation. + +RTK and logger +============== + +The RTK solver and its dependencies, make extensive use of the Rust logger. +Turn it on so you have meaningful information on what is happening: + +- Epochs for which we perform the calculations +- Navigation context evolution +- Results and meaningful information +- More information on the configuration and what can be achieved + +The Rust logger sensitivity is controlled by the RUST\_LOG environment variable, +which you can either export or adjust for a single run. `trace` is the most sensitive, +`info` is the standard value. + +The output is directed towards Stdout, therefore it can be streamed into a text file for example, +to easily compare runs between them. + diff --git a/rinex-cli/src/cli.rs b/rinex-cli/src/cli.rs index 5b8d79c19..b287da5ed 100644 --- a/rinex-cli/src/cli.rs +++ b/rinex-cli/src/cli.rs @@ -1,4 +1,5 @@ use clap::{Arg, ArgAction, ArgMatches, ColorChoice, Command}; +use gnss_rtk::prelude::RTKConfig; use log::{error, info}; use rinex::prelude::*; use rinex_qc::QcOpts; @@ -27,8 +28,8 @@ impl Cli { .long("fp") .value_name("FILE") .help("Input RINEX file. Serves as primary data. -In advanced usage, this must be Observation Data. -Observation, Meteo and IONEX, can only serve as primary data.") +Must be Observation Data for --rtk. +Observation, Meteo and IONEX can only serve as primary data.") .action(ArgAction::Append) .required(true)) .next_help_heading("General") @@ -37,8 +38,9 @@ Observation, Meteo and IONEX, can only serve as primary data.") .long("quiet") .action(ArgAction::SetTrue) .help("Disable all terminal output. Also disables auto HTML reports opener.")) - .arg(Arg::new("readable") - .short('r') + .arg(Arg::new("pretty") + .short('p') + .long("pretty") .action(ArgAction::SetTrue) .help("Make terminal output more readable.")) .arg(Arg::new("workspace") @@ -48,15 +50,19 @@ Observation, Meteo and IONEX, can only serve as primary data.") .help("Customize workspace location (folder does not have to exist). 
The default workspace is rinex-cli/workspace")) .next_help_heading("Data identification") + .arg(Arg::new("full-id") + .short('i') + .action(ArgAction::SetTrue) + .help("Turn all identifications ON")) .arg(Arg::new("epochs") .long("epochs") .action(ArgAction::SetTrue) .help("Enumerate all epochs")) - .arg(Arg::new("constellations") - .long("constellations") - .short('c') + .arg(Arg::new("gnss") + .long("gnss") + .short('g') .action(ArgAction::SetTrue) - .help("Enumerate GNSS constellations")) + .help("Enumerate GNSS constellations present in entire context.")) .arg(Arg::new("sv") .long("sv") .action(ArgAction::SetTrue) @@ -109,6 +115,7 @@ Useful to determine common Epochs or compare sample rates in between .num_args(1..) .help("Design preprocessing operations, like data filtering or resampling, prior further analysis. You can stack as many ops as you need. +Preprocessing ops apply prior entering both -q and --rtk modes. Refer to rinex-cli/doc/preprocessing.md to learn how to operate this interface.")) .next_help_heading("Observation RINEX") .arg(Arg::new("observables") @@ -250,27 +257,38 @@ The summary report by default is integrated to the global HTML report.")) .long("qc-only") .action(ArgAction::SetTrue) .help("Activates QC mode and disables all other features: quickest qc rendition.")) - .next_help_heading("Position Solver") - .arg(Arg::new("positioning") - .short('p') - .long("positioning") + .next_help_heading("RTK (Positioning)") + .arg(Arg::new("rtk") + .long("rtk") .action(ArgAction::SetTrue) .help("Activate GNSS receiver position solver. This is only possible if provided context is sufficient. Depending on provided context, either SPP (high accuracy) or PPP (ultra high accuracy) -method is deployed. -This is turned of by default, because it involves quite heavy computations. +solver is deployed. +This mode is turned off by default because it involves quite heavy computations. +Use the RUST_LOG env. variable for verbosity. See [spp] for more information. ")) .arg(Arg::new("spp") .long("spp") .action(ArgAction::SetTrue) .help("Enables Positioning forced to Single Frequency SPP solver mode. Disregards whether the provided context is PPP compatible. -NB: we do not account for Relativistic effects in clock bias estimates.")) - .arg(Arg::new("positioning-only") - .long("pos-only") +NB: we do not account for Relativistic effects by default and raw pseudo range are used. +For indepth customization, refer to the configuration file and online documentation.")) + .arg(Arg::new("rtk-only") + .long("rtk-only") + .short('r') .action(ArgAction::SetTrue) - .help("Activates GNSS position solver, disables all other modes: most performant solver.")) + .help("Activates GNSS position solver, disables all other modes. +This is the most performant mode to solve a position.")) + .arg(Arg::new("rtk-config") + .long("rtk-cfg") + .value_name("FILE") + .help("Pass RTK custom configuration.")) + .arg(Arg::new("kml") + .long("kml") + .help("Form a KML track with resolved positions. 
+This turns off the default visualization.")) .next_help_heading("File operations") .arg(Arg::new("merge") .short('m') @@ -394,6 +412,7 @@ Refer to README")) | self.matches.get_flag("orbits") | self.matches.get_flag("nav-msg") | self.matches.get_flag("anomalies") + | self.matches.get_flag("full-id") } /// Returns true if Sv accross epoch display is requested pub fn sv_epoch(&self) -> bool { @@ -413,38 +432,52 @@ Refer to README")) } /// Returns list of requested data to extract pub fn identification_ops(&self) -> Vec<&str> { - let flags = vec![ - "sv", - "epochs", - "header", - "constellations", - "observables", - "ssi-range", - "ssi-sv-range", - "orbits", - "nav-msg", - "anomalies", - ]; - flags - .iter() - .filter(|x| self.matches.get_flag(x)) - .map(|x| *x) - .collect() + if self.matches.get_flag("full-id") { + vec![ + "sv", + "epochs", + "gnss", + "observables", + "ssi-range", + "ssi-sv-range", + "orbits", + "nav-msg", + "anomalies", + ] + } else { + let flags = vec![ + "sv", + "header", + "epochs", + "gnss", + "observables", + "ssi-range", + "ssi-sv-range", + "orbits", + "nav-msg", + "anomalies", + ]; + flags + .iter() + .filter(|x| self.matches.get_flag(x)) + .map(|x| *x) + .collect() + } } fn get_flag(&self, flag: &str) -> bool { self.matches.get_flag(flag) } /// returns true if pretty JSON is requested - pub fn readable_json(&self) -> bool { - self.get_flag("readable") + pub fn pretty(&self) -> bool { + self.get_flag("pretty") } /// Returns true if quiet mode is activated pub fn quiet(&self) -> bool { self.matches.get_flag("quiet") } /// Returns true if position solver is enabled - pub fn positioning(&self) -> bool { - self.matches.get_flag("positioning") || self.forced_spp() || self.forced_ppp() + pub fn rtk(&self) -> bool { + self.matches.get_flag("rtk") || self.forced_spp() || self.forced_ppp() } /// Returns true if position solver forced to SPP pub fn forced_spp(&self) -> bool { @@ -454,8 +487,26 @@ Refer to README")) pub fn forced_ppp(&self) -> bool { self.matches.get_flag("spp") } - pub fn positioning_only(&self) -> bool { - self.matches.get_flag("positioning-only") + pub fn rtk_only(&self) -> bool { + self.matches.get_flag("rtk-only") + } + pub fn rtk_config(&self) -> Option { + if let Some(path) = self.matches.get_one::("rtk-config") { + if let Ok(content) = std::fs::read_to_string(path) { + let opts = serde_json::from_str(&content); + if let Ok(opts) = opts { + info!("loaded rtk config: \"{}\"", path); + return Some(opts); + } else { + error!("failed to parse config file \"{}\"", path); + info!("using default parameters"); + } + } else { + error!("failed to read config file \"{}\"", path); + info!("using default parameters"); + } + } + None } pub fn cs_graph(&self) -> bool { self.matches.get_flag("cs") diff --git a/rinex-cli/src/identification.rs b/rinex-cli/src/identification.rs index 53e2e50ae..491a5f311 100644 --- a/rinex-cli/src/identification.rs +++ b/rinex-cli/src/identification.rs @@ -1,4 +1,6 @@ use crate::Cli; +use hifitime::Epoch; +use rinex::observation::Snr; use rinex::*; use rinex_qc::QcContext; @@ -6,70 +8,109 @@ use rinex_qc::QcContext; * Basic identification operations */ pub fn rinex_identification(ctx: &QcContext, cli: &Cli) { - let pretty = cli.readable_json(); + let pretty = cli.pretty(); let ops = cli.identification_ops(); - identification(&ctx.primary_data(), pretty, ops.clone()); + identification( + &ctx.primary_data(), + &ctx.primary_path().to_string_lossy().to_string(), + pretty, + ops.clone(), + ); + if let Some(nav) = &ctx.navigation_data() { - 
identification(&nav, pretty, ops.clone()); + identification(&nav, "Navigation Context blob", pretty, ops.clone()); } } -fn identification(rnx: &Rinex, pretty: bool, ops: Vec<&str>) { +use serde::Serialize; + +#[derive(Clone, Debug, Serialize)] +struct EpochReport { + pub first: String, + pub last: String, +} + +#[derive(Clone, Debug, Serialize)] +struct SSIReport { + pub min: Option, + pub max: Option, +} + +fn identification(rnx: &Rinex, path: &str, pretty: bool, ops: Vec<&str>) { for op in ops { + debug!("identification: {}", op); if op.eq("header") { let content = match pretty { true => serde_json::to_string_pretty(&rnx.header).unwrap(), false => serde_json::to_string(&rnx.header).unwrap(), }; - println!("{}", content); + println!("[{}]: {}", path, content); } else if op.eq("epochs") { - let data: Vec = rnx.epoch().map(|e| e.to_string()).collect(); - let content = match pretty { - true => serde_json::to_string_pretty(&data).unwrap(), - false => serde_json::to_string(&data).unwrap(), - }; - println!("{}", content); - } else if op.eq("sv") { - let data: Vec<_> = rnx.sv().collect(); - let content = match pretty { - true => serde_json::to_string_pretty(&data).unwrap(), - false => serde_json::to_string(&data).unwrap(), + let report = EpochReport { + first: format!("{:?}", rnx.first_epoch()), + last: format!("{:?}", rnx.last_epoch()), }; - println!("{}", content); - } else if op.eq("observables") { - let data: Vec<_> = rnx.observable().collect(); let content = match pretty { - true => serde_json::to_string_pretty(&data).unwrap(), - false => serde_json::to_string(&data).unwrap(), + true => serde_json::to_string_pretty(&report).unwrap(), + false => serde_json::to_string(&report).unwrap(), }; - println!("{}", content); + println!("[{}]: {}", path, content); + } else if op.eq("sv") { + let mut csv = String::new(); + for (i, sv) in rnx.sv().enumerate() { + if i == rnx.sv().count() - 1 { + csv.push_str(&format!("{}\n", sv.to_string())); + } else { + csv.push_str(&format!("{}, ", sv.to_string())); + } + } + println!("[{}]: {}", path, csv); + } else if op.eq("observables") && rnx.is_observation_rinex() { + let mut data: Vec<_> = rnx.observable().collect(); + data.sort(); + //let content = match pretty { + // true => serde_json::to_string_pretty(&data).unwrap(), + // false => serde_json::to_string(&data).unwrap(), + //}; + println!("[{}]: {:?}", path, data); } else if op.eq("gnss") { let data: Vec<_> = rnx.constellation().collect(); let content = match pretty { true => serde_json::to_string_pretty(&data).unwrap(), false => serde_json::to_string(&data).unwrap(), }; - println!("{}", content); - } else if op.eq("ssi-range") { - let data = &rnx.observation_ssi_minmax(); + println!("[{}]: {}", path, content); + } else if op.eq("ssi-range") && rnx.is_observation_rinex() { + let ssi = SSIReport { + min: { + rnx.snr() + .min_by(|(_, _, _, snr_a), (_, _, _, snr_b)| snr_a.cmp(snr_b)) + .map(|(_, _, _, snr)| snr) + }, + max: { + rnx.snr() + .max_by(|(_, _, _, snr_a), (_, _, _, snr_b)| snr_a.cmp(snr_b)) + .map(|(_, _, _, snr)| snr) + }, + }; let content = match pretty { - true => serde_json::to_string_pretty(data).unwrap(), - false => serde_json::to_string(data).unwrap(), + true => serde_json::to_string_pretty(&ssi).unwrap(), + false => serde_json::to_string(&ssi).unwrap(), }; - println!("{}", content); - } else if op.eq("orbits") { - unimplemented!("nav::orbits"); + println!("[{}]: {}", path, content); + } else if op.eq("orbits") && rnx.is_navigation_rinex() { + error!("nav::orbits not available yet"); //let 
data: Vec<_> = rnx.orbit_fields(); //let content = match pretty { // true => serde_json::to_string_pretty(&data).unwrap(), // false => serde_json::to_string(&data).unwrap(), //}; //println!("{}", content); - } else if op.eq("nav-msg") { + } else if op.eq("nav-msg") && rnx.is_navigation_rinex() { let data: Vec<_> = rnx.nav_msg_type().collect(); println!("{:?}", data); - } else if op.eq("anomalies") { + } else if op.eq("anomalies") && rnx.is_observation_rinex() { let data: Vec<_> = rnx.epoch_anomalies().collect(); println!("{:#?}", data); } diff --git a/rinex-cli/src/main.rs b/rinex-cli/src/main.rs index c63fd5890..e0cadfcef 100644 --- a/rinex-cli/src/main.rs +++ b/rinex-cli/src/main.rs @@ -14,14 +14,14 @@ use preprocessing::preprocess; //use horrorshow::Template; use rinex::{ merge::Merge, - observation::{Combine, Dcb, IonoDelay, Mp}, + observation::{Combine, Dcb, IonoDelay}, //Mp}, prelude::RinexType, prelude::*, split::Split, }; extern crate gnss_rtk as rtk; -use rtk::prelude::{Solver, SolverOpts, SolverType}; +use rtk::prelude::{Solver, SolverError, SolverEstimate, SolverType}; use rinex_qc::*; @@ -29,8 +29,8 @@ use cli::Cli; use identification::rinex_identification; use plot::PlotContext; -extern crate pretty_env_logger; -use pretty_env_logger::env_logger::Builder; +//extern crate pretty_env_logger; +use env_logger::{Builder, Target}; #[macro_use] extern crate log; @@ -299,7 +299,7 @@ fn create_context(cli: &Cli) -> QcContext { * Returns true if Skyplot view if feasible */ fn skyplot_allowed(ctx: &QcContext, cli: &Cli) -> bool { - if cli.quality_check_only() || cli.positioning_only() { + if cli.quality_check_only() || cli.rtk_only() { /* * Special modes: no plots allowed */ @@ -318,6 +318,7 @@ fn skyplot_allowed(ctx: &QcContext, cli: &Cli) -> bool { pub fn main() -> Result<(), rinex::Error> { let mut builder = Builder::from_default_env(); builder + .target(Target::Stdout) .format_timestamp_secs() .format_module_path(false) .init(); @@ -329,8 +330,8 @@ pub fn main() -> Result<(), rinex::Error> { let qc_only = cli.quality_check_only(); let qc = cli.quality_check() || qc_only; - let positioning_only = cli.positioning_only(); - let positioning = cli.positioning() || positioning_only; + let rtk_only = cli.rtk_only(); + let rtk = cli.rtk() || rtk_only; // Initiate plot context let mut plot_ctx = PlotContext::new(); @@ -343,12 +344,43 @@ pub fn main() -> Result<(), rinex::Error> { // Position solver let mut solver = Solver::from(&ctx); + if let Ok(ref mut solver) = solver { + info!( + "provided context is compatible with {} position solver", + solver.solver + ); + // custom config ? apply it + if let Some(cfg) = cli.rtk_config() { + solver.cfg = cfg.clone(); + } + if !rtk { + warn!("position solver currently turned off"); + } else { + if cli.forced_spp() { + warn!("forced method to spp"); + solver.solver = SolverType::SPP; + } + // print config to be used + info!("{:#?}", solver.cfg); + } + } else { + warn!("context is not sufficient or not compatible with --rtk"); + } // Workspace let workspace = workspace_path(&ctx); info!("workspace is \"{}\"", workspace.to_string_lossy()); create_workspace(workspace.clone()); + /* + * Print more info on special primary data cases + */ + if ctx.primary_data().is_meteo_rinex() { + info!("meteo special primary data"); + } else if ctx.primary_data().is_ionex() { + info!("ionex special primary data"); + } + /* * Emphasize which reference position is to be used. * This will help user make sure everything is correct. 
@@ -364,30 +396,6 @@ pub fn main() -> Result<(), rinex::Error> { } else { info!("no reference position given or identified"); } - /* - * print more info on possible solver to deploy - */ - if let Ok(ref mut solver) = solver { - info!( - "provided context is compatible with {} position solver", - solver.solver - ); - if !positioning { - warn!("position solver currently turned off"); - } else { - if cli.forced_spp() { - solver.solver = SolverType::SPP; - solver.opts = SolverOpts::default(SolverType::SPP); - warn!("position solver restricted to SPP mode"); - } else if cli.forced_ppp() { - solver.solver = SolverType::PPP; - solver.opts = SolverOpts::default(SolverType::PPP); - warn!("position solver forced to PPP mode"); - } - } - } else { - info!("context is not sufficient for any position solving method"); - } /* * Preprocessing */ @@ -609,14 +617,14 @@ pub fn main() -> Result<(), rinex::Error> { * Record analysis / visualization * analysis depends on the provided record type */ - if !qc_only && !positioning_only { + if !qc_only && !rtk_only { info!("entering record analysis"); plot::plot_record(&ctx, &mut plot_ctx); } /* * Render Graphs (HTML) */ - if !qc_only && !positioning_only { + if !qc_only && !rtk_only { let html_path = workspace_path(&ctx).join("graphs.html"); let html_path = html_path.to_str().unwrap(); @@ -673,13 +681,43 @@ pub fn main() -> Result<(), rinex::Error> { } if let Ok(ref mut solver) = solver { // position solver is feasible, with provided context - if positioning { - info!("entering positioning mode\n"); - while let Some((t, estimate)) = solver.run(&mut ctx) { - trace!("epoch: {}", t); - // info!("%%%%%%%%% Iteration : {} %%%%%%%%%%%", iteration +1); - //info!("%%%%%%%%% Position : {:?}, Time: {:?}", position, time); - // iteration += 1; + let mut solving = true; + let mut results: HashMap = HashMap::new(); + + if rtk { + match solver.init(&mut ctx) { + Err(e) => panic!("failed to initialize rtk solver"), + Ok(_) => info!("entering rtk mode"), + } + while solving { + match solver.run(&mut ctx) { + Ok((t, estimate)) => { + trace!( + "epoch: {} +position error: {:.6E}, {:.6E}, {:.6E} +HDOP {:.5E} | VDOP {:.5E} +clock offset: {:.6E} | TDOP {:.5E}", + t, + estimate.dx, + estimate.dy, + estimate.dz, + estimate.hdop, + estimate.vdop, + estimate.dt, + estimate.tdop + ); + results.insert(t, estimate); + }, + Err(SolverError::NoSv(t)) => info!("no SV elected @{}", t), + Err(SolverError::LessThan4Sv(t)) => info!("less than 4 SV @{}", t), + Err(SolverError::SolvingError(t)) => { + error!("failed to invert navigation matrix @ {}", t) + }, + Err(SolverError::EpochDetermination(_)) => { + solving = false; // abort + }, + Err(e) => panic!("fatal error {:?}", e), + } } info!("done"); } diff --git a/rinex-cli/src/plot/context.rs b/rinex-cli/src/plot/context.rs index 01e597d0f..6ebaa4502 100644 --- a/rinex-cli/src/plot/context.rs +++ b/rinex-cli/src/plot/context.rs @@ -30,8 +30,16 @@ impl PlotContext { pub fn add_polar2d_plot(&mut self, title: &str) { self.plots.push(build_default_polar_plot(title)); } - pub fn add_world_map(&mut self, style: MapboxStyle, center: (f64, f64), zoom: u8) { - self.plots.push(build_world_map(style, center, zoom)); + pub fn add_world_map( + &mut self, + title: &str, + show_legend: bool, + map_style: MapboxStyle, + center: (f64, f64), + zoom: u8, + ) { + self.plots + .push(build_world_map(title, show_legend, map_style, center, zoom)); } pub fn add_trace(&mut self, trace: Box) { let len = self.plots.len() - 1; diff --git a/rinex-cli/src/plot/mod.rs 
b/rinex-cli/src/plot/mod.rs index 3c3f3ee57..bbc3bcae1 100644 --- a/rinex-cli/src/plot/mod.rs +++ b/rinex-cli/src/plot/mod.rs @@ -279,14 +279,22 @@ pub fn build_default_polar_plot(title: &str) -> Plot { * centered on given locations, in decimal degrees, * zoom factor */ -pub fn build_world_map(style: MapboxStyle, center: (f64, f64), zoom: u8) -> Plot { +pub fn build_world_map( + title: &str, + show_legend: bool, + map_style: MapboxStyle, + center: (f64, f64), + zoom: u8, +) -> Plot { let mut p = Plot::new(); let layout = Layout::new() + .title(Title::new(title).font(Font::default())) .drag_mode(DragMode::Zoom) .margin(Margin::new().top(0).left(0).bottom(0).right(0)) + .show_legend(show_legend) .mapbox( Mapbox::new() - .style(style) + .style(map_style) .center(Center::new(center.0, center.1)) .zoom(zoom), ); @@ -377,7 +385,7 @@ pub fn build_chart_epoch_axis( let txt: Vec = epochs.iter().map(|e| e.to_string()).collect(); Scatter::new(epochs.iter().map(|e| e.to_utc_seconds()).collect(), data_y) .mode(mode) - .web_gl_mode(true) + //.web_gl_mode(true) .name(name) .hover_text_array(txt) .hover_info(HoverInfo::All) @@ -392,7 +400,7 @@ pub fn plot_record(ctx: &QcContext, plot_ctx: &mut PlotContext) { } else if ctx.primary_data().is_meteo_rinex() { record::plot_meteo(ctx, plot_ctx); } else if ctx.primary_data().is_ionex() { - if let Some(borders) = ctx.primary_data().ionex_map_borders() { + if let Some(borders) = ctx.primary_data().tec_map_borders() { record::plot_tec_map(ctx, borders, plot_ctx); } } diff --git a/rinex-cli/src/plot/record/ionex.rs b/rinex-cli/src/plot/record/ionex.rs index 456145846..8160e8da6 100644 --- a/rinex-cli/src/plot/record/ionex.rs +++ b/rinex-cli/src/plot/record/ionex.rs @@ -1,13 +1,9 @@ //use itertools::Itertools; use crate::plot::PlotContext; -use plotly::{ - color::NamedColor, - common::{Marker, MarkerSymbol}, //color::Rgba}, - layout::MapboxStyle, - //scatter_mapbox::Fill, - ScatterMapbox, -}; +use plotly::layout::MapboxStyle; +use rinex::prelude::Epoch; use rinex_qc::QcContext; +use std::collections::HashMap; pub fn plot_tec_map( ctx: &QcContext, @@ -15,73 +11,62 @@ pub fn plot_tec_map( plot_ctx: &mut PlotContext, ) { let _cmap = colorous::TURBO; - plot_ctx.add_world_map(MapboxStyle::OpenStreetMap, (32.5, -40.0), 1); + //let hover_text: Vec = ctx.primary_data().epoch().map(|e| e.to_string()).collect(); + /* + * TEC map visualization + * plotly-rs has no means to animate plots at the moment + * therefore.. 
we create one plot for all existing epochs + */ + for (_index, epoch) in ctx.primary_data().epoch().enumerate() { + let content: Vec<_> = ctx + .primary_data() + .tec() + .filter_map(|(t, lat, lon, h, tec)| { + if t == epoch { + Some((lat, lon, h, tec)) + } else { + None + } + }) + .collect(); - let record = ctx.primary_data().record.as_ionex().unwrap(); // cannot fail + plot_ctx.add_world_map( + &epoch.to_string(), + true, + MapboxStyle::StamenTerrain, + (32.5, -40.0), + 1, + ); - let mut grid_lat: Vec = Vec::new(); - let mut grid_lon: Vec = Vec::new(); - let mut tec_max = -f64::INFINITY; - for (e_index, (_e, (tec, _, _))) in record.iter().enumerate() { - for point in tec { - if e_index == 0 { - // grab grid definition - grid_lat.push(point.latitude.into()); - grid_lon.push(point.longitude.into()); - } - if point.value > tec_max { - tec_max = point.value; - } + let mut lat: HashMap = HashMap::new(); + let mut lon: HashMap = HashMap::new(); + let mut z: HashMap = HashMap::new(); + for (tec_lat, tec_lon, _, tec) in content { + lat.insert(Epoch::default().to_string(), tec_lat); + lon.insert(Epoch::default().to_string(), tec_lon); + z.insert(Epoch::default().to_string(), tec); } - } - let grid = ScatterMapbox::new(grid_lat, grid_lon) - .marker( - Marker::new() - .size(5) - .symbol(MarkerSymbol::Circle) - .color(NamedColor::Black) - .opacity(0.5), - ) - .name("TEC Grid"); - plot_ctx.add_trace(grid); + /* plot the map grid */ + //let grid = ScatterMapbox::new(lat.clone(), lon.clone()) + // .marker( + // Marker::new() + // .size(3) + // .symbol(MarkerSymbol::Circle) + // .color(NamedColor::Black) + // .opacity(0.5), + // ) + // .name("grid"); + //plot_ctx.add_trace(grid); - /* - * Build heat map, - * we have no means to plot several heat map (day course) - * at the moment, we just plot the 1st epoch - */ - /* - for (e_index, (e, (tec, _, _))) in record.iter().enumerate() { - if e_index == 0 { - // form the smallest area from that grid - for rows in - for i in 0..tec.len() / 4 { - println!("MAPS {}", i); - //for i in 0..maps.len() / 2 { - /* - let mut row1 = maps[0] - let mut row2 = maps[1] - if let Some(points12) = row1.next() { - if let Some(points34) = row2.next() { - // average TEC in that area - let tec = (points12[0].value + points12[1].value - +points34[0].value + points34[1].value ) / 4.0; - let color = cmap.eval_continuous(tec / tec_max); - let area = ScatterMapbox::new( - vec![points12[0].latitude, points12[1].latitude, points34[0].latitude, points34[1].latitude], - vec![points12[0].longitude, points12[1].longitude, points34[0].longitude, points34[1].longitude], - ) - .opacity(0.5) - .fill(Fill::ToSelf) - .fill_color(Rgba::new(color.r, color.g, color.b, 0.9)); - ctx.add_trace(area); - println!("YES"); - } - }*/ - } - } else { - break ; // we need animations for that - } - }*/ + //let map = AnimatedDensityMapbox::new(lat.clone(), lon.clone(), z) + // .title("TEST") + // .name(epoch.to_string()) + // .opacity(0.66) + // .hover_text_array(hover_text.clone()) + // .zauto(true) + // //.animation_frame("test") + // .zoom(3); + //plot_ctx.add_trace(map); + } } diff --git a/rinex-cli/src/plot/record/navigation.rs b/rinex-cli/src/plot/record/navigation.rs index 2fc67d31e..68efd4e7b 100644 --- a/rinex-cli/src/plot/record/navigation.rs +++ b/rinex-cli/src/plot/record/navigation.rs @@ -67,7 +67,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { .collect(); let trace = build_chart_epoch_axis( - &format!("{}(clk)", sv), + &format!("{:X}(clk)", sv), 
Mode::LinesMarkers, sv_epochs.clone(), sv_clock, @@ -86,7 +86,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { plot_ctx.add_trace(trace); let trace = build_chart_epoch_axis( - &format!("{}(drift)", sv), + &format!("{:X}(drift)", sv), Mode::LinesMarkers, sv_epochs.clone(), sv_drift, @@ -134,7 +134,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { ) .collect(); let trace = - build_chart_epoch_axis(&format!("{}(sp3_clk)", sv), Mode::Markers, epochs, data) + build_chart_epoch_axis(&format!("{:X}(sp3_clk)", sv), Mode::Markers, epochs, data) .visible({ if sv_index == 0 { // Clock data differs too much: plot only one to begin with @@ -182,7 +182,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { ) .collect(); let trace = - build_chart_epoch_axis(&format!("{}(x)", sv), Mode::Markers, epochs.clone(), x_km) + build_chart_epoch_axis(&format!("{:X}(x)", sv), Mode::Markers, epochs.clone(), x_km) .visible({ if sv_index == 0 { Visible::True @@ -206,7 +206,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { .collect(); let trace = - build_chart_epoch_axis(&format!("{}(y)", sv), Mode::Markers, epochs.clone(), y_km) + build_chart_epoch_axis(&format!("{:X}(y)", sv), Mode::Markers, epochs.clone(), y_km) .y_axis("y2") .visible({ if sv_index == 0 { @@ -245,7 +245,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { ) .collect(); let trace = build_chart_epoch_axis( - &format!("{}(sp3_x)", sv), + &format!("{:X}(sp3_x)", sv), Mode::LinesMarkers, epochs.clone(), x, @@ -271,7 +271,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { ) .collect(); let trace = build_chart_epoch_axis( - &format!("{}(sp3_y)", sv), + &format!("{:X}(sp3_y)", sv), Mode::LinesMarkers, epochs.clone(), y, @@ -317,7 +317,7 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { }, ) .collect(); - let trace = build_chart_epoch_axis(&format!("{}(z)", sv), Mode::Markers, epochs, z_km) + let trace = build_chart_epoch_axis(&format!("{:X}(z)", sv), Mode::Markers, epochs, z_km) .visible({ if sv_index == 0 { Visible::True @@ -354,15 +354,19 @@ fn plot_nav_data(rinex: &Rinex, sp3: Option<&SP3>, plot_ctx: &mut PlotContext) { }, ) .collect(); - let trace = - build_chart_epoch_axis(&format!("{}(sp3_z)", sv), Mode::LinesMarkers, epochs, z_km) - .visible({ - if sv_index == 0 { - Visible::True - } else { - Visible::LegendOnly - } - }); + let trace = build_chart_epoch_axis( + &format!("{:X}(sp3_z)", sv), + Mode::LinesMarkers, + epochs, + z_km, + ) + .visible({ + if sv_index == 0 { + Visible::True + } else { + Visible::LegendOnly + } + }); plot_ctx.add_trace(trace); } } diff --git a/rinex-cli/src/plot/record/observation.rs b/rinex-cli/src/plot/record/observation.rs index 1b89eb254..c333dda98 100644 --- a/rinex-cli/src/plot/record/observation.rs +++ b/rinex-cli/src/plot/record/observation.rs @@ -113,7 +113,7 @@ pub fn plot_observation(ctx: &QcContext, plot_context: &mut PlotContext) { let data_y: Vec = data.iter().map(|(_cs, _e, y)| *y).collect(); let trace = build_chart_epoch_axis( - &format!("{}({})", sv, observable), + &format!("{:X}({})", sv, observable), Mode::Markers, data_x, data_y, @@ -142,7 +142,7 @@ pub fn plot_observation(ctx: &QcContext, plot_context: &mut PlotContext) { let epochs: Vec = data.iter().map(|(e, _)| *e).collect(); let elev: Vec = data.iter().map(|(_, f)| *f).collect(); let trace = build_chart_epoch_axis( 
- &format!("Elev({})", sv), + &format!("Elev({:X})", sv), Mode::LinesMarkers, epochs, elev, diff --git a/rinex-cli/src/plot/record/sp3.rs b/rinex-cli/src/plot/record/sp3.rs index d1962a3ad..784784a5a 100644 --- a/rinex-cli/src/plot/record/sp3.rs +++ b/rinex-cli/src/plot/record/sp3.rs @@ -81,7 +81,7 @@ pub fn plot_residual_ephemeris(ctx: &QcContext, plot_ctx: &mut PlotContext) { } } let trace = - build_chart_epoch_axis(&format!("|{}_err|", sv), Mode::Markers, epochs, residuals) + build_chart_epoch_axis(&format!("|{:X}_err|", sv), Mode::Markers, epochs, residuals) .visible({ if sv_index < 4 { Visible::True diff --git a/rinex-cli/src/plot/skyplot.rs b/rinex-cli/src/plot/skyplot.rs index 281c4a470..6129a7b43 100644 --- a/rinex-cli/src/plot/skyplot.rs +++ b/rinex-cli/src/plot/skyplot.rs @@ -55,7 +55,7 @@ pub fn skyplot(ctx: &QcContext, plot_context: &mut PlotContext) { } }) .connect_gaps(false) - .name(svnn.to_string()); + .name(format!("{:X}", svnn)); plot_context.add_trace(trace); } } diff --git a/rinex-qc/Cargo.toml b/rinex-qc/Cargo.toml index b37fe6639..99659467f 100644 --- a/rinex-qc/Cargo.toml +++ b/rinex-qc/Cargo.toml @@ -20,7 +20,7 @@ rustdoc-args = ["--cfg", "docrs", "--generate-link-to-definition"] [dependencies] serde = { version = "1.0", optional = true, default-features = false, features = ["derive"] } -hifitime = "3.8" +hifitime = "3.8.4" strum = "0.25" strum_macros = "0.25" horrorshow = "0.8" @@ -28,7 +28,7 @@ itertools = "0.11.0" statrs = "0.16" sp3 = { path = "../sp3", features = ["serde"] } rinex-qc-traits = { path = "../qc-traits", version = "=0.1.1" } -rinex = { path = "../rinex", features = ["obs", "nav", "qc", "processing", "serde", "flate2"] } +rinex = { path = "../rinex", version = "=0.14.1", features = ["full"] } [dev-dependencies] serde_json = "1" diff --git a/rinex-qc/src/analysis/mod.rs b/rinex-qc/src/analysis/mod.rs index dcc6f2bb7..404295f2e 100644 --- a/rinex-qc/src/analysis/mod.rs +++ b/rinex-qc/src/analysis/mod.rs @@ -33,7 +33,7 @@ pub struct QcAnalysis { impl QcAnalysis { /// Creates a new Analysis Report from given RINEX context. 
/// primary : primary file - pub fn new(primary: &Rinex, nav: &Option, opts: &QcOpts) -> Self { + pub fn new(primary: &Rinex, _nav: &Option, opts: &QcOpts) -> Self { Self { sv: QcSvAnalysis::new(primary, opts), sampling: QcSamplingAnalysis::new(primary, opts), diff --git a/rinex-qc/src/analysis/obs.rs b/rinex-qc/src/analysis/obs.rs index ce5d1d952..b9f6bab85 100644 --- a/rinex-qc/src/analysis/obs.rs +++ b/rinex-qc/src/analysis/obs.rs @@ -5,7 +5,7 @@ use std::str::FromStr; use crate::{pretty_array, QcOpts}; -use rinex::carrier; +//use rinex::carrier; use rinex::carrier::Carrier; use rinex::observation::Snr; use rinex::prelude::{Epoch, EpochFlag, Observable, Rinex, Sv}; @@ -115,7 +115,7 @@ fn report_anomalies<'a>( } tr { th { - : "Cycle slip(s)" + : "Cycle slips" } @ if cs.is_empty() { td { @@ -219,7 +219,7 @@ fn report_epoch_completion( td { @ for ((sv, carrier), count) in complete { b { - : format!("{} {}/L1", sv, carrier) + : format!("{:X} {}/L1", sv, carrier) } p { : format!("{} ({}%)", count, count * 100 / total) @@ -422,10 +422,8 @@ impl QcObsAnalysis { total_epochs = r.len(); for ((epoch, _flag), (_clk, svs)) in r { for (_sv, observables) in svs { - if !observables.is_empty() { - if !epoch_with_obs.contains(&epoch) { - epoch_with_obs.push(*epoch); - } + if !observables.is_empty() && !epoch_with_obs.contains(epoch) { + epoch_with_obs.push(*epoch); } } } @@ -433,7 +431,7 @@ impl QcObsAnalysis { // append ssi: drop vehicle differentiation let mut ssi: HashMap> = HashMap::new(); for (_, _, obs, value) in rnx.ssi() { - if let Some(values) = ssi.get_mut(&obs) { + if let Some(values) = ssi.get_mut(obs) { values.push(value); } else { ssi.insert(obs.clone(), vec![value]); @@ -451,7 +449,7 @@ impl QcObsAnalysis { let mut snr: HashMap> = HashMap::new(); for ((e, _), _, obs, snr_value) in rnx.snr() { let snr_f64: f64 = (snr_value as u8).into(); - if let Some(values) = snr.get_mut(&obs) { + if let Some(values) = snr.get_mut(obs) { values.push((e, snr_f64)); } else { snr.insert(obs.clone(), vec![(e, snr_f64)]); diff --git a/rinex-qc/src/analysis/sv.rs b/rinex-qc/src/analysis/sv.rs index bc3ba457e..ae1ea652b 100644 --- a/rinex-qc/src/analysis/sv.rs +++ b/rinex-qc/src/analysis/sv.rs @@ -13,7 +13,7 @@ impl QcSvAnalysis { pub fn new(primary: &Rinex, _opts: &QcOpts) -> Self { let sv = primary.sv(); Self { - sv: { sv.map(|sv| sv.to_string()).collect() }, + sv: { sv.map(|sv| format!("{:X}", sv)).collect() }, } } } diff --git a/rinex-qc/src/context.rs b/rinex-qc/src/context.rs index 803236e30..003e7984c 100644 --- a/rinex-qc/src/context.rs +++ b/rinex-qc/src/context.rs @@ -3,7 +3,7 @@ use rinex_qc_traits::HtmlReport; use std::collections::HashMap; use std::path::{Path, PathBuf}; -use rinex::carrier::Carrier; +//use rinex::carrier::Carrier; use rinex::observation::Snr; use rinex::prelude::{Epoch, GroundPosition, Rinex, Sv}; use rinex::Error; @@ -116,7 +116,7 @@ impl QcContext { /// Returns reference to SP3 data specifically pub fn sp3_data(&self) -> Option<&SP3> { if let Some(ref sp3) = self.sp3 { - Some(&sp3.data()) + Some(sp3.data()) } else { None } @@ -204,7 +204,7 @@ impl QcContext { /* NB: interpolate Complete Epochs only */ let complete_epoch: Vec<_> = self.primary_data().complete_epoch(min_snr).collect(); for (e, sv_signals) in complete_epoch { - for (sv, carrier) in sv_signals { + for (sv, _carrier) in sv_signals { // if orbit already exists: do not interpolate // this will make things much quicker for high quality data products let found = self @@ -214,17 +214,14 @@ impl QcContext { if let Some((_, 
_, (x, y, z))) = found { // store as is self.orbits.insert((e, sv), (x, y, z)); - } else { - if let Some(sp3) = self.sp3_data() { - if let Some((x_km, y_km, z_km)) = sp3.sv_position_interpolate(sv, e, order) - { - self.orbits.insert((e, sv), (x_km, y_km, z_km)); - } - } else if let Some(nav) = self.navigation_data() { - if let Some((x_m, y_m, z_m)) = nav.sv_position_interpolate(sv, e, order) { - self.orbits - .insert((e, sv), (x_m * 1.0E-3, y_m * 1.0E-3, z_m * 1.0E-3)); - } + } else if let Some(sp3) = self.sp3_data() { + if let Some((x_km, y_km, z_km)) = sp3.sv_position_interpolate(sv, e, order) { + self.orbits.insert((e, sv), (x_km, y_km, z_km)); + } + } else if let Some(nav) = self.navigation_data() { + if let Some((x_m, y_m, z_m)) = nav.sv_position_interpolate(sv, e, order) { + self.orbits + .insert((e, sv), (x_m * 1.0E-3, y_m * 1.0E-3, z_m * 1.0E-3)); } } } diff --git a/rinex-qc/src/lib.rs b/rinex-qc/src/lib.rs index ac5782d72..3ee92a710 100644 --- a/rinex-qc/src/lib.rs +++ b/rinex-qc/src/lib.rs @@ -85,8 +85,7 @@ impl QcReport { } }, QcClassification::Physics => { - let mut observables: Vec<_> = - ctx.primary_data().observable().map(|o| o.clone()).collect(); + let mut observables: Vec<_> = ctx.primary_data().observable().cloned().collect(); observables.sort(); // improves report rendering for obsv in observables { filter_targets.push(TargetItem::from(obsv)); @@ -103,14 +102,13 @@ impl QcReport { let subset = ctx.primary_data().filter(mask.clone().into()); // also apply to possible NAV augmentation - let nav_subset = if let Some(nav) = &ctx.navigation_data() { - Some(nav.filter(mask.clone().into())) - } else { - None - }; + let nav_subset = ctx + .navigation_data() + .as_ref() + .map(|nav| nav.filter(mask.clone().into())); // perform analysis on these subsets - analysis.push(QcAnalysis::new(&subset, &nav_subset, &opts)); + analysis.push(QcAnalysis::new(&subset, &nav_subset, opts)); } analysis } diff --git a/rinex/Cargo.toml b/rinex/Cargo.toml index 17b8fdcac..6b98e61cc 100644 --- a/rinex/Cargo.toml +++ b/rinex/Cargo.toml @@ -1,6 +1,6 @@ [package] name = "rinex" -version = "0.14.0" +version = "0.14.1" license = "MIT OR Apache-2.0" authors = ["Guillaume W. 
Bres "] description = "Package to parse and analyze RINEX data" @@ -18,6 +18,7 @@ sbas = ["geo", "wkt"] obs = [] meteo = [] nav = [] +ionex = [] processing = [] qc = ["rinex-qc-traits", "horrorshow"] # rinex Quality Check (mainly OBS RINEX) @@ -25,6 +26,7 @@ qc = ["rinex-qc-traits", "horrorshow"] # rinex Quality Check (mainly OBS RINEX) full = [ "flate2", "horrorshow", + "ionex", "meteo", "nav", "obs", @@ -40,6 +42,7 @@ rustdoc-args = ["--cfg", "docrs", "--generate-link-to-definition"] [build-dependencies] serde_json = { version = "1.0", features = ["preserve_order"] } +serde = { version = "1.0", default-features = false, features = ["derive"] } [dependencies] num = "0.4" @@ -58,7 +61,7 @@ geo = { version = "0.26", optional = true } wkt = { version = "0.10.0", default-features = false, optional = true } serde = { version = "1.0", optional = true, default-features = false, features = ["derive"] } flate2 = { version = "1.0.24", optional = true, default-features = false, features = ["zlib"] } -hifitime = { version = "3.8", features = ["serde", "std"] } +hifitime = { version = "3.8.4", features = ["serde", "std"] } horrorshow = { version = "0.8", optional = true } rinex-qc-traits = { path = "../qc-traits", version = "=0.1.1", optional = true } diff --git a/rinex/benches/benchmark.rs b/rinex/benches/benchmark.rs index 8592b920f..cac03acb4 100644 --- a/rinex/benches/benchmark.rs +++ b/rinex/benches/benchmark.rs @@ -3,7 +3,7 @@ use rinex::{ hatanaka::{numdiff::NumDiff, textdiff::TextDiff}, observation::*, prelude::*, - processing::*, + //processing::*, reader::BufferedReader, record::parse_record, }; @@ -36,7 +36,7 @@ fn profiled() -> Criterion { }*/ fn parse_file(fp: &str) { - let _ = Rinex::from_file(fp); + let _ = Rinex::from_file(fp).unwrap(); } fn text_decompression(textdiff: &mut TextDiff, data: &[&str]) { @@ -192,223 +192,96 @@ fn decompression_benchmark(c: &mut Criterion) { } /* - * Puts record section parsing to the test - */ + * Evaluates parsing performance of plain RINEX parsing fn record_parsing_benchmark(c: &mut Criterion) { let mut group = c.benchmark_group("parsing"); - // prepare for OBS/zegv0010.21o - let mut header = Header::basic_obs().with_observation_fields(HeaderFields { - crinex: None, - codes: { - let mut map: HashMap> = HashMap::new(); - map.insert( - Constellation::GPS, - vec![ - Observable::from_str("C1").unwrap(), - Observable::from_str("C2").unwrap(), - Observable::from_str("C5").unwrap(), - Observable::from_str("L1").unwrap(), - Observable::from_str("L2").unwrap(), - Observable::from_str("L5").unwrap(), - Observable::from_str("P1").unwrap(), - Observable::from_str("P2").unwrap(), - Observable::from_str("S1").unwrap(), - Observable::from_str("S2").unwrap(), - Observable::from_str("S5").unwrap(), - ], - ); - map.insert( - Constellation::Glonass, - vec![ - Observable::from_str("C1").unwrap(), - Observable::from_str("C2").unwrap(), - Observable::from_str("C5").unwrap(), - Observable::from_str("L1").unwrap(), - Observable::from_str("L2").unwrap(), - Observable::from_str("L5").unwrap(), - Observable::from_str("P1").unwrap(), - Observable::from_str("P2").unwrap(), - Observable::from_str("S1").unwrap(), - Observable::from_str("S2").unwrap(), - Observable::from_str("S5").unwrap(), - ], - ); - map - }, - clock_offset_applied: false, - dcb_compensations: Vec::new(), - scalings: HashMap::new(), - }); - group.bench_function("OBSv2/zegv0010.21o", |b| { - b.iter(|| { - record_parsing("../test_resources/OBS/V2/zegv0010.21o", &mut header); - }) - }); - - // prepare for 
OBS/V3/ACOR00ESP - let mut header = Header::basic_obs().with_observation_fields(HeaderFields { - crinex: None, - codes: { - let mut map: HashMap> = HashMap::new(); - map.insert( - Constellation::GPS, - vec![ - Observable::from_str("C1C").unwrap(), - Observable::from_str("L1C").unwrap(), - Observable::from_str("S1C").unwrap(), - Observable::from_str("C2S").unwrap(), - Observable::from_str("L2S").unwrap(), - Observable::from_str("S2S").unwrap(), - Observable::from_str("C2W").unwrap(), - Observable::from_str("L2W").unwrap(), - Observable::from_str("S2W").unwrap(), - Observable::from_str("C5Q").unwrap(), - Observable::from_str("L5Q").unwrap(), - Observable::from_str("S5Q").unwrap(), - ], - ); - map.insert( - Constellation::Glonass, - vec![ - Observable::from_str("C1C").unwrap(), - Observable::from_str("L1C").unwrap(), - Observable::from_str("S1C").unwrap(), - Observable::from_str("C2P").unwrap(), - Observable::from_str("L2P").unwrap(), - Observable::from_str("S2P").unwrap(), - Observable::from_str("C2C").unwrap(), - Observable::from_str("L2C").unwrap(), - Observable::from_str("S2C").unwrap(), - Observable::from_str("C3Q").unwrap(), - Observable::from_str("L3Q").unwrap(), - Observable::from_str("S3Q").unwrap(), - ], - ); - map.insert( - Constellation::Galileo, - vec![ - Observable::from_str("C1C").unwrap(), - Observable::from_str("L1C").unwrap(), - Observable::from_str("S1C").unwrap(), - Observable::from_str("C5Q").unwrap(), - Observable::from_str("L5Q").unwrap(), - Observable::from_str("S5Q").unwrap(), - Observable::from_str("C6C").unwrap(), - Observable::from_str("L6C").unwrap(), - Observable::from_str("S6C").unwrap(), - Observable::from_str("C7Q").unwrap(), - Observable::from_str("L7Q").unwrap(), - Observable::from_str("S7Q").unwrap(), - Observable::from_str("C8Q").unwrap(), - Observable::from_str("L8Q").unwrap(), - Observable::from_str("S8Q").unwrap(), - ], - ); - map.insert( - Constellation::BeiDou, - vec![ - Observable::from_str("C2I").unwrap(), - Observable::from_str("L2I").unwrap(), - Observable::from_str("S2I").unwrap(), - Observable::from_str("C6I").unwrap(), - Observable::from_str("L6I").unwrap(), - Observable::from_str("S6I").unwrap(), - Observable::from_str("C7I").unwrap(), - Observable::from_str("L7I").unwrap(), - Observable::from_str("S7I").unwrap(), - ], - ); - map - }, - clock_offset_applied: false, - dcb_compensations: Vec::new(), - scalings: HashMap::new(), - }); - group.bench_function("OBSv3/ACOR00ESP", |b| { - b.iter(|| { - record_parsing( - "../test_resources/OBS/V3/ACOR00ESP_R_20213550000_01D_30S_MO.rnx", - &mut header, - ); - }) - }); - - //prepare for CRNX/V1/delf0010.21d - //prepare for CRNX/V3/ESBC00DNK - //prepare for NAV/V2/ijmu3650.21n.gz - //prepare for NAV/V3/MOJN00DNK_R_20201770000_01D_MN.rnx.gz - - group.finish(); /* concludes record section */ -} - -fn processing_benchmark(c: &mut Criterion) { - let mut group = c.benchmark_group("processing"); - let rinex = - Rinex::from_file("../test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz") - .unwrap(); - let record = rinex.record.as_obs().unwrap(); - - for filter in vec![ - (Filter::from_str("mask:GPS,GLO,BDS").unwrap(), "mask:gnss"), - //(Filter::from_str("mask:gt:10 minutes").unwrap(), "mask:dt"), - ( - Filter::from_str("mask:L1C,C1C,L2P,L2W").unwrap(), - "mask:obs", - ), - ( - Filter::from_str("mask:g08,g15,g19,r03,r09").unwrap(), - "mask:sv", - ), - //(Filter::from_str("mask:2020-06-25 08:00:00UTC").unwrap(), "mask:epoch"), - (Filter::from_str("smooth:hatch").unwrap(), "smoothing:hatch"), - ( - 
Filter::from_str("smooth:hatch:l1c,l2c").unwrap(), - "smoothing:hatch:l1c,l2c", - ), - //(Filter::from_str("smooth:mov:10 minutes").unwrap(), "smoothing:mov:10 mins"), - ] { - let (filter, name) = filter; - group.bench_function(&format!("esbc00dnk_r_2021/{}", name), |b| { - b.iter(|| record.filter(filter.clone())) - }); - } - - for combination in vec![ - (Combination::GeometryFree, "gf"), - (Combination::NarrowLane, "nl"), - (Combination::WideLane, "wl"), - (Combination::MelbourneWubbena, "mw"), + let base_dir = Path::new(env!("CARGO_MANIFEST_DIR")) + .join("..") + .join("test_resources"); + /* + * small, medium, large compressed: OBS + */ + for (rev, filename) in vec![ + ("V2", "del0010.21o"), ] { - let (combination, name) = combination; - group.bench_function(&format!("esbc00dnk_r_2021/{}", name), |b| { + group.bench_function("OBSv2/zegv0010.21o", |b| { b.iter(|| { - record.combine(combination); + record_parsing("../test_resources/OBS/V2/zegv0010.21o", &mut header); }) }); } - group.bench_function("esbc00dnk_r_2021/dcb", |b| { - b.iter(|| { - record.dcb(); - }) - }); - group.bench_function("esbc00dnk_r_2021/ionod", |b| { - b.iter(|| { - record.iono_delay_detector(Duration::from_seconds(30.0)); - }) - }); - group.bench_function("esbc00dnk_r_2021/derivative", |b| { - b.iter(|| { - let der = record.derivative(); - let mov = der.moving_average(Duration::from_seconds(600.0), None); - }) - }); + group.finish(); /* concludes record section */ } + */ + +//fn processing_benchmark(c: &mut Criterion) { +// let mut group = c.benchmark_group("processing"); +// let rinex = +// Rinex::from_file("../test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz") +// .unwrap(); +// let record = rinex.record.as_obs().unwrap(); +// +// for filter in vec![ +// (Filter::from_str("mask:GPS,GLO,BDS").unwrap(), "mask:gnss"), +// //(Filter::from_str("mask:gt:10 minutes").unwrap(), "mask:dt"), +// ( +// Filter::from_str("mask:L1C,C1C,L2P,L2W").unwrap(), +// "mask:obs", +// ), +// ( +// Filter::from_str("mask:g08,g15,g19,r03,r09").unwrap(), +// "mask:sv", +// ), +// //(Filter::from_str("mask:2020-06-25 08:00:00UTC").unwrap(), "mask:epoch"), +// (Filter::from_str("smooth:hatch").unwrap(), "smoothing:hatch"), +// ( +// Filter::from_str("smooth:hatch:l1c,l2c").unwrap(), +// "smoothing:hatch:l1c,l2c", +// ), +// //(Filter::from_str("smooth:mov:10 minutes").unwrap(), "smoothing:mov:10 mins"), +// ] { +// let (filter, name) = filter; +// group.bench_function(&format!("esbc00dnk_r_2021/{}", name), |b| { +// b.iter(|| record.filter(filter.clone())) +// }); +// } +// +// for combination in vec![ +// (Combination::GeometryFree, "gf"), +// (Combination::NarrowLane, "nl"), +// (Combination::WideLane, "wl"), +// (Combination::MelbourneWubbena, "mw"), +// ] { +// let (combination, name) = combination; +// group.bench_function(&format!("esbc00dnk_r_2021/{}", name), |b| { +// b.iter(|| { +// record.combine(combination); +// }) +// }); +// } +// group.bench_function("esbc00dnk_r_2021/dcb", |b| { +// b.iter(|| { +// record.dcb(); +// }) +// }); +// group.bench_function("esbc00dnk_r_2021/ionod", |b| { +// b.iter(|| { +// record.iono_delay_detector(Duration::from_seconds(30.0)); +// }) +// }); +// group.bench_function("esbc00dnk_r_2021/derivative", |b| { +// b.iter(|| { +// let der = record.derivative(); +// let mov = der.moving_average(Duration::from_seconds(600.0), None); +// }) +// }); +//} fn benchmark(c: &mut Criterion) { decompression_benchmark(c); - record_parsing_benchmark(c); - processing_benchmark(c); + 
//record_parsing_benchmark(c); + //processing_benchmark(c); } criterion_group!(benches, benchmark); diff --git a/rinex/build.rs b/rinex/build.rs index b9294403f..4ff3d4d62 100644 --- a/rinex/build.rs +++ b/rinex/build.rs @@ -3,8 +3,8 @@ use std::io::Write; use std::path::Path; fn build_nav_database() { - let out_dir = env::var("OUT_DIR").unwrap(); - let nav_path = Path::new(&out_dir).join("nav_orbits.rs"); + let outdir = env::var("OUT_DIR").unwrap(); + let nav_path = Path::new(&outdir).join("nav_orbits.rs"); let mut nav_file = std::fs::File::create(&nav_path).unwrap(); // read helper descriptor @@ -100,6 +100,84 @@ pub struct NavHelper<'a> { .unwrap(); } +use serde::Deserialize; + +fn default_launch_month() -> u8 { + 1 // Jan +} + +fn default_launch_day() -> u8 { + 1 // 1st day of month +} + +/* + * We use an intermediate struct + * and "serde" to allow not to describe the launched + * day or month for example + */ +#[derive(Deserialize)] +struct SBASDBEntry<'a> { + pub constellation: &'a str, + pub prn: u16, + pub id: &'a str, + #[serde(default = "default_launch_month")] + pub launched_month: u8, + #[serde(default = "default_launch_day")] + pub launched_day: u8, + pub launched_year: i32, +} + +fn build_sbas_helper() { + let outdir = env::var("OUT_DIR").unwrap(); + let path = Path::new(&outdir).join("sbas.rs"); + let mut fd = std::fs::File::create(path).unwrap(); + + // read descriptor: parse and dump into a static array + let db_content = std::fs::read_to_string("db/SBAS/sbas.json").unwrap(); + + let sbas_db: Vec = serde_json::from_str(&db_content).unwrap(); + + let content = "use lazy_static::lazy_static; + +#[derive(Debug)] +pub struct SBASHelper<'a> { + constellation: &'a str, + prn: u16, + id: &'a str, + launched_day: u8, + launched_month: u8, + launched_year: i32, +} + +lazy_static! 
{ + static ref SBAS_VEHICLES: Vec> = vec![ +\n"; + + fd.write_all(content.as_bytes()).unwrap(); + + for e in sbas_db { + fd.write_all( + format!( + "SBASHelper {{ + constellation: \"{}\", + prn: {}, + id: \"{}\", + launched_year: {}, + launched_month: {}, + launched_day: {} + }},", + e.constellation, e.prn, e.id, e.launched_year, e.launched_month, e.launched_day, + ) + .as_bytes(), + ) + .unwrap() + } + + fd.write_all(" ];".as_bytes()).unwrap(); + fd.write_all("}\n".as_bytes()).unwrap(); +} + fn main() { build_nav_database(); + build_sbas_helper(); } diff --git a/rinex/db/SBAS/sbas.json b/rinex/db/SBAS/sbas.json new file mode 100644 index 000000000..ffb621810 --- /dev/null +++ b/rinex/db/SBAS/sbas.json @@ -0,0 +1,121 @@ +[ + { + "constellation": "AusNZ", + "prn": 122, + "id": "INMARSAT-4F1", + "launched_year": 2020, + "launched_month": 1 + }, + { + "constellation": "EGNOS", + "prn": 123, + "id": "ASTRA-5B", + "launched_year": 2021, + "launched_month": 11 + }, + { + "constellation": "SDCM", + "prn": 125, + "id": "Luch-5A", + "launched_year": 2021, + "launched_month": 12 + }, + { + "constellation": "EGNOS", + "prn": 126, + "id": "INMARSAT-4F2", + "launched_year": 2023, + "launched_month": 4 + }, + { + "constellation": "GAGAN", + "prn": 127, + "id": "GSAT-8", + "launched_year": 2020, + "launched_month": 9 + }, + { + "constellation": "GAGAN", + "prn": 128, + "id": "GSAT-10", + "launched_year": 2020, + "launched_month": 9 + }, + { + "constellation": "BDSBAS", + "prn": 130, + "id": "G6", + "launched_year": 2020, + "launched_month": 10 + }, + { + "constellation": "BDSBAS", + "prn": 130, + "id": "G6", + "launched_year": 2020, + "launched_month": 10 + }, + { + "constellation": "KASS", + "prn": 134, + "id": "MEASAT-3D", + "launched_year": 2021, + "launched_month": 6 + }, + { + "constellation": "EGNOS", + "prn": 136, + "id": "SES-5", + "launched_year": 2021, + "launched_month": 11 + }, + { + "constellation": "WAAS", + "prn": 138, + "id": "ANIK-F1R", + "launched_year": 2022, + "launched_month": 7 + }, + { + "constellation": "SDCM", + "prn": 140, + "id": "Luch-5B", + "launched_year": 2021, + "launched_month": 12 + }, + { + "constellation": "SDCM", + "prn": 141, + "id": "Luch-4", + "launched_year": 2021, + "launched_month": 12 + }, + { + "constellation": "BDSBAS", + "prn": 143, + "id": "G3", + "launched_year": 2020, + "launched_month": 10 + }, + { + "constellation": "BDSBAS", + "prn": 144, + "id": "G1", + "launched_year": 2020, + "launched_month": 10 + }, + { + "constellation": "NSAS", + "prn": 147, + "id": "NIGCOMSAT-1R", + "launched_year": 2021, + "launched_month": 1 + }, + { + "constellation": "ASAL", + "prn": 148, + "id": "ALCOMSAT-1", + "launched_year": 2020, + "launched_month": 1 + } +] diff --git a/rinex/src/algorithm/filters/decim.rs b/rinex/src/algorithm/filters/decim.rs index 0b97d4bba..998647c1b 100644 --- a/rinex/src/algorithm/filters/decim.rs +++ b/rinex/src/algorithm/filters/decim.rs @@ -81,7 +81,7 @@ pub trait Decimate { impl std::str::FromStr for DecimationFilter { type Err = Error; fn from_str(content: &str) -> Result { - let items: Vec<&str> = content.trim().split(":").collect(); + let items: Vec<&str> = content.trim().split(':').collect(); if let Ok(dt) = Duration::from_str(items[0].trim()) { Ok(Self { target: { @@ -94,7 +94,7 @@ impl std::str::FromStr for DecimationFilter { }, dtype: DecimationType::DecimByInterval(dt), }) - } else if let Ok(r) = u32::from_str_radix(items[0].trim(), 10) { + } else if let Ok(r) = items[0].trim().parse::() { Ok(Self { target: { if items.len() > 1 { 
diff --git a/rinex/src/algorithm/filters/mask.rs b/rinex/src/algorithm/filters/mask.rs index 6f7195b06..72d0395c0 100644 --- a/rinex/src/algorithm/filters/mask.rs +++ b/rinex/src/algorithm/filters/mask.rs @@ -4,13 +4,13 @@ use thiserror::Error; #[derive(Error, Debug)] pub enum Error { #[error("invalid mask target")] - TargetError(#[from] crate::algorithm::target::Error), + InvalidTarget(#[from] crate::algorithm::target::Error), #[error("missing a mask operand")] MissingOperand, #[error("invalid mask operand")] InvalidOperand, #[error("invalid mask target \"{0}\"")] - InvalidTarget(String), + NonSupportedTarget(String), #[error("invalid mask description")] InvalidDescriptor, } @@ -51,13 +51,13 @@ impl std::str::FromStr for MaskOperand { let c = content.trim(); if c.starts_with(">=") { Ok(Self::GreaterEquals) - } else if c.starts_with(">") { + } else if c.starts_with('>') { Ok(Self::GreaterThan) } else if c.starts_with("<=") { Ok(Self::LowerEquals) - } else if c.starts_with("<") { + } else if c.starts_with('<') { Ok(Self::LowerThan) - } else if c.starts_with("=") { + } else if c.starts_with('=') { Ok(Self::Equals) } else if c.starts_with("!=") { Ok(Self::NotEquals) @@ -256,14 +256,14 @@ impl std::str::FromStr for MaskFilter { let float_offset = operand_offset + operand.formatted_len() + 2; Ok(Self { operand, - item: TargetItem::from_elevation(&cleanedup[float_offset..].trim())?, + item: TargetItem::from_elevation(cleanedup[float_offset..].trim())?, }) } else if content[0..1].eq("a") { // --> Azimuth Mask case let float_offset = operand_offset + operand.formatted_len() + 2; Ok(Self { operand, - item: TargetItem::from_azimuth(&cleanedup[float_offset..].trim())?, + item: TargetItem::from_azimuth(cleanedup[float_offset..].trim())?, }) } else { // We're only left with SNR mask case @@ -271,10 +271,10 @@ impl std::str::FromStr for MaskFilter { if content[0..3].eq("snr") { Ok(Self { operand, - item: TargetItem::from_snr(&cleanedup[float_offset..].trim())?, + item: TargetItem::from_snr(cleanedup[float_offset..].trim())?, }) } else { - Err(Error::InvalidTarget( + Err(Error::NonSupportedTarget( cleanedup[..operand_offset].to_string(), )) } @@ -289,7 +289,7 @@ impl std::str::FromStr for MaskFilter { Ok(Self { operand, - item: TargetItem::from_str(&cleanedup[offset..].trim_start())?, + item: TargetItem::from_str(cleanedup[offset..].trim_start())?, }) } } @@ -303,7 +303,7 @@ mod test { use std::str::FromStr; #[test] fn mask_operand() { - for (descriptor, opposite_desc) in vec![ + for (descriptor, opposite_desc) in [ (">=", "<="), (">", "<"), ("=", "!="), @@ -349,7 +349,7 @@ mod test { } #[test] fn mask_elev() { - for desc in vec![ + for desc in [ "e< 40.0", "e != 30", " e<40.0", @@ -368,7 +368,7 @@ mod test { } #[test] fn mask_gnss() { - for (descriptor, opposite_desc) in vec![ + for (descriptor, opposite_desc) in [ (" = GPS", "!= GPS"), ("= GAL,GPS", "!= GAL,GPS"), (" =GLO,GAL", "!= GLO,GAL"), @@ -412,9 +412,7 @@ mod test { } #[test] fn mask_sv() { - for (descriptor, opposite_desc) in - vec![(" = G01", "!= G01"), ("= R03, G31", "!= R03, G31")] - { + for (descriptor, opposite_desc) in [(" = G01", "!= G01"), ("= R03, G31", "!= R03, G31")] { let mask = MaskFilter::from_str(descriptor); assert!( mask.is_ok(), diff --git a/rinex/src/algorithm/filters/mod.rs b/rinex/src/algorithm/filters/mod.rs index 4d4116d4f..6f28c2574 100644 --- a/rinex/src/algorithm/filters/mod.rs +++ b/rinex/src/algorithm/filters/mod.rs @@ -18,15 +18,15 @@ pub enum Error { #[error("unknown filter type \"{0}\"")] 
UnknownFilterType(String), #[error("invalid mask filter")] - MaskFilterParsingError(#[from] mask::Error), + MaskFilterParsing(#[from] mask::Error), #[error("invalid decimation filter")] - DecimationFilterParsingError(#[from] decim::Error), + DecimationFilterParsing(#[from] decim::Error), #[error("invalid smoothing filter")] - SmoothingFilterParsingError(#[from] smoothing::Error), + SmoothingFilterParsing(#[from] smoothing::Error), #[error("invalid filter target")] - TargetItemError(#[from] super::target::Error), + TargetItem(#[from] super::target::Error), #[error("failed to apply filter")] - FilterError, + Filter, } /// Preprocessing filters, to preprocess RINEX data prior further analysis. @@ -75,7 +75,7 @@ impl From for Filter { impl std::str::FromStr for Filter { type Err = Error; fn from_str(content: &str) -> Result { - let items: Vec<&str> = content.split(":").collect(); + let items: Vec<&str> = content.split(':').collect(); let identifier = items[0].trim(); if identifier.eq("decim") { @@ -118,7 +118,7 @@ mod test { /* * MASK FILTER description */ - for descriptor in vec![ + for descriptor in [ "GPS", "=GPS", " != GPS", @@ -141,7 +141,7 @@ mod test { /* * DECIMATION FILTER description */ - for desc in vec![ + for desc in [ "decim:10", "decim:10 min", "decim:1 hour", @@ -154,7 +154,7 @@ mod test { /* * SMOOTHING FILTER description */ - for desc in vec![ + for desc in [ "smooth:mov:10 min", "smooth:mov:1 hour", "smooth:mov:1 hour:l1c", diff --git a/rinex/src/algorithm/filters/smoothing.rs b/rinex/src/algorithm/filters/smoothing.rs index bf777dd9e..2cd1d5ef7 100644 --- a/rinex/src/algorithm/filters/smoothing.rs +++ b/rinex/src/algorithm/filters/smoothing.rs @@ -23,19 +23,19 @@ pub struct SmoothingFilter { #[derive(Error, Debug)] pub enum Error { #[error("invalid description \"{0}\"")] - InvalidDescriptionError(String), + InvalidDescription(String), #[error("unknown smoothing filter \"{0}\"")] UnknownFilter(String), - #[error("unknown smoothing target")] - TargetError(#[from] crate::algorithm::target::Error), + #[error("invalid target")] + InvalidTarget(#[from] crate::algorithm::target::Error), #[error("failed to parse duration")] - DurationParsingError(#[from] hifitime::Errors), + DurationParsing(#[from] hifitime::Errors), } impl std::str::FromStr for SmoothingFilter { type Err = Error; fn from_str(content: &str) -> Result { - let items: Vec<&str> = content.trim().split(":").collect(); + let items: Vec<&str> = content.trim().split(':').collect(); if items[0].trim().eq("hatch") { Ok(Self { target: { @@ -50,7 +50,7 @@ impl std::str::FromStr for SmoothingFilter { }) } else if items[0].trim().eq("mov") { if items.len() < 2 { - return Err(Error::InvalidDescriptionError(format!("{:?}", items))); + return Err(Error::InvalidDescription(format!("{:?}", items))); } let dt = Duration::from_str(items[1].trim())?; Ok(Self { @@ -87,7 +87,7 @@ mod test { use std::str::FromStr; #[test] fn from_str() { - for desc in vec!["hatch", "hatch:C1C", "hatch:c1c,c2p"] { + for desc in ["hatch", "hatch:C1C", "hatch:c1c,c2p"] { let filter = SmoothingFilter::from_str(desc); assert!( filter.is_ok(), @@ -95,7 +95,7 @@ mod test { desc ); } - for desc in vec![ + for desc in [ "mov:10 min", "mov:1 hour", "mov:10 min:clk", diff --git a/rinex/src/algorithm/target.rs b/rinex/src/algorithm/target.rs index 0165c181d..c96ded211 100644 --- a/rinex/src/algorithm/target.rs +++ b/rinex/src/algorithm/target.rs @@ -13,7 +13,7 @@ pub enum Error { #[error("unknown target \"{0}\"")] UnknownTarget(String), #[error("type guessing error 
\"{0}\"")] - TypeGuessingError(String), + TypeGuessing(String), #[error("expecting two epochs when describing a duration")] InvalidDuration, #[error("bad epoch description")] @@ -29,9 +29,9 @@ pub enum Error { #[error("constellation parsing error")] ConstellationParing(#[from] constellation::ParsingError), #[error("failed to parse epoch flag")] - EpochFlagParsingError(#[from] crate::epoch::flag::Error), + EpochFlagParsing(#[from] crate::epoch::flag::Error), #[error("failed to parse constellation")] - ConstellationParsingError, + ConstellationParsing, #[error("invalid nav item")] InvalidNavItem(#[from] crate::navigation::Error), #[error("observable parsing error")] @@ -236,7 +236,7 @@ impl std::str::FromStr for TargetItem { * when operand comes first in description. * Otherwise, we muse use other methods */ - let items: Vec<&str> = c.split(",").collect(); + let items: Vec<&str> = c.split(',').collect(); /* * Epoch and Durations */ @@ -309,11 +309,11 @@ impl std::str::FromStr for TargetItem { .map(|s| s.to_string()) .collect(); - if matched_orbits.len() > 0 { + if !matched_orbits.is_empty() { Ok(Self::OrbitItem(matched_orbits)) } else { // not a single match - Err(Error::TypeGuessingError(c.to_string())) + Err(Error::TypeGuessing(c.to_string())) } } } @@ -452,7 +452,7 @@ mod test { assert_eq!(target, TargetItem::DurationItem(dt)); // test Matching NAV orbits - for descriptor in vec![ + for descriptor in [ "iode", "crc", "crs", @@ -469,7 +469,7 @@ mod test { } // test non matching NAV orbits - for descriptor in vec!["oide", "ble", "blah, oide"] { + for descriptor in ["oide", "ble", "blah, oide"] { let target = TargetItem::from_str(descriptor); assert!( target.is_err(), diff --git a/rinex/src/antex/frequency.rs b/rinex/src/antex/frequency.rs index fcb3461cd..251a391a8 100644 --- a/rinex/src/antex/frequency.rs +++ b/rinex/src/antex/frequency.rs @@ -19,10 +19,7 @@ impl Default for Pattern { impl Pattern { /// Returns true if this phase pattern is azimuth dependent pub fn is_azimuth_dependent(&self) -> bool { - match self { - Self::AzimuthDependent(_) => true, - _ => false, - } + matches!(self, Self::AzimuthDependent(_)) } /// Unwraps pattern values, whether it is /// Azimuth dependent or not @@ -124,7 +121,7 @@ impl Frequency { } pub fn with_carrier(&self, carrier: Carrier) -> Self { let mut f = self.clone(); - f.carrier = carrier.clone(); + f.carrier = carrier; f } pub fn with_northern_eccentricity(&self, north: f64) -> Self { @@ -156,7 +153,7 @@ mod test { fn test_pattern() { let default = Pattern::default(); assert_eq!(default, Pattern::NonAzimuthDependent(Vec::new())); - assert_eq!(default.is_azimuth_dependent(), false); + assert!(!default.is_azimuth_dependent()); } #[test] fn test_frequency() { diff --git a/rinex/src/antex/pcv.rs b/rinex/src/antex/pcv.rs index ee52a1d6d..5ea4ab113 100644 --- a/rinex/src/antex/pcv.rs +++ b/rinex/src/antex/pcv.rs @@ -33,10 +33,7 @@ impl std::str::FromStr for Pcv { impl Pcv { pub fn is_relative(&self) -> bool { - match self { - Self::Relative(_) => true, - _ => false, - } + matches!(self, Self::Relative(_)) } pub fn is_absolute(&self) -> bool { !self.is_relative() @@ -58,7 +55,7 @@ mod test { fn test_pcv() { assert_eq!(Pcv::default(), Pcv::Absolute); assert!(Pcv::Absolute.is_absolute()); - assert_eq!(Pcv::Relative(String::from("AOAD/M_T")).is_absolute(), false); + assert!(!Pcv::Relative(String::from("AOAD/M_T")).is_absolute()); let pcv = Pcv::from_str("A"); assert!(pcv.is_ok()); diff --git a/rinex/src/antex/record.rs b/rinex/src/antex/record.rs index 
97af7ed2d..245f8d8ab 100644 --- a/rinex/src/antex/record.rs +++ b/rinex/src/antex/record.rs @@ -181,16 +181,16 @@ mod test { #[test] fn test_new_epoch() { let content = " START OF ANTENNA"; - assert_eq!(is_new_epoch(content), true); + assert!(is_new_epoch(content)); let content = "TROSAR25.R4 LEIT727259 TYPE / SERIAL NO"; - assert_eq!(is_new_epoch(content), false); + assert!(!is_new_epoch(content)); let content = " 26 # OF FREQUENCIES"; - assert_eq!(is_new_epoch(content), false); + assert!(!is_new_epoch(content)); let content = " G01 START OF FREQUENCY"; - assert_eq!(is_new_epoch(content), false); + assert!(!is_new_epoch(content)); } } diff --git a/rinex/src/carrier.rs b/rinex/src/carrier.rs index 715f1b9ef..ad2ed60d2 100644 --- a/rinex/src/carrier.rs +++ b/rinex/src/carrier.rs @@ -637,9 +637,14 @@ impl Carrier { Constellation::Glonass => Self::from_glo_observable(observable), Constellation::Galileo => Self::from_gal_observable(observable), Constellation::QZSS => Self::from_qzss_observable(observable), - Constellation::Geo | Constellation::SBAS(_) => Self::from_geo_observable(observable), Constellation::IRNSS => Self::from_irnss_observable(observable), - _ => todo!("from_\"{}:{}\"_observable()", constellation, observable), + c => { + if c.is_sbas() { + Self::from_geo_observable(observable) + } else { + unreachable!("observable for {}", constellation); + } + }, } } @@ -668,7 +673,7 @@ impl Carrier { 8 => Ok(Self::E5), _ => Ok(Self::E1), }, - Constellation::SBAS(_) | Constellation::Geo => match sv.prn { + Constellation::SBAS => match sv.prn { 1 => Ok(Self::L1), 5 => Ok(Self::L5), _ => Ok(Self::L1), @@ -695,10 +700,7 @@ impl Carrier { 9 => Ok(Self::S), _ => Ok(Self::L1), }, - _ => panic!( - "non supported conversion from {}", - sv.constellation.to_3_letter_code() - ), + _ => panic!("non supported conversion from {:?}", sv.constellation), } } } @@ -717,9 +719,9 @@ mod test { assert_eq!(l1.frequency_mhz(), 1575.42_f64); assert_eq!(l1.wavelength(), 299792458.0 / 1_575_420_000.0_f64); - for constell in vec![ + for constell in [ Constellation::GPS, - Constellation::Geo, + Constellation::SBAS, Constellation::Glonass, Constellation::Galileo, Constellation::BeiDou, @@ -746,9 +748,9 @@ mod test { assert_eq!(Carrier::from_observable(constell, &obs), Ok(Carrier::L5),); } /* - * Geo + * SBAS */ - } else if constell == Constellation::Geo { + } else if constell == Constellation::SBAS { let codes = vec!["C1", "L1C", "D1", "S1", "S1C", "D1C"]; for code in codes { let obs = Observable::from_str(code).unwrap(); diff --git a/rinex/src/clocks/mod.rs b/rinex/src/clocks/mod.rs index 949fef590..039d02c26 100644 --- a/rinex/src/clocks/mod.rs +++ b/rinex/src/clocks/mod.rs @@ -1,16 +1,16 @@ //! 
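The `Constellation::Geo` to `Constellation::SBAS` migration also touches carrier identification. A hedged sketch of the `Carrier` API as exercised by the tests in `carrier.rs`, assuming `Carrier`, `Observable` and `Constellation` are in scope (for instance through the crate prelude):

```rust
use std::str::FromStr;

// assumes Carrier, Observable, Constellation are in scope (e.g. rinex::prelude)
fn main() {
    // L1 reference frequency and wavelength (c / f)
    let l1 = Carrier::L1;
    assert_eq!(l1.frequency_mhz(), 1575.42_f64);
    assert_eq!(l1.wavelength(), 299_792_458.0_f64 / 1_575_420_000.0_f64);

    // SBAS vehicles now resolve their carrier through the is_sbas() branch
    let obs = Observable::from_str("C1").unwrap();
    assert!(Carrier::from_observable(Constellation::SBAS, &obs).is_ok());
}
```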
RINEX Clock files parser & analysis use hifitime::TimeScale; pub mod record; -pub use record::{Data, DataType, Error, Record, System}; +pub use record::{ClockData, ClockDataType, Error, Record, System}; /// Clocks `RINEX` specific header fields #[derive(Clone, Debug, Default, PartialEq)] #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] pub struct HeaderFields { /// Types of observation in this file - pub codes: Vec, + pub codes: Vec, /// Clock Data analysis production center - pub agency: Option, + pub agency: Option, /// Reference station pub station: Option, /// Reference clock descriptor @@ -40,7 +40,7 @@ impl HeaderFields { s } /// Set production agency - pub fn with_agency(&self, agency: Agency) -> Self { + pub fn with_agency(&self, agency: ClockAnalysisAgency) -> Self { let mut s = self.clone(); s.agency = Some(agency); s @@ -57,10 +57,10 @@ pub struct Station { pub id: String, } -/// Describes a clock analysis center / agency +/// Describes a clock analysis agency #[derive(Clone, PartialEq, Debug)] #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] -pub struct Agency { +pub struct ClockAnalysisAgency { /// IGS AC 3 letter code pub code: String, /// agency name diff --git a/rinex/src/clocks/record.rs b/rinex/src/clocks/record.rs index d4115ca3e..e19d4dd28 100644 --- a/rinex/src/clocks/record.rs +++ b/rinex/src/clocks/record.rs @@ -68,20 +68,25 @@ pub enum Error { /// Clocks file payload #[derive(Clone, Debug, PartialEq, Default)] #[cfg_attr(feature = "serde", derive(Serialize))] -pub struct Data { - /// Clock bias +pub struct ClockData { + /// Clock bias [s] pub bias: f64, - pub bias_sigma: Option, - pub rate: Option, - pub rate_sigma: Option, - pub accel: Option, - pub accel_sigma: Option, + /// Clock bias deviation + pub bias_dev: Option, + /// Clock drift [s/s] + pub drift: Option, + /// Clock drift deviation + pub drift_dev: Option, + /// Clock drift change [s/s^2] + pub drift_change: Option, + /// Clock drift change deviation + pub drift_change_dev: Option, } /// Clock data observables #[derive(Debug, PartialEq, Eq, Hash, Clone, EnumString)] #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] -pub enum DataType { +pub enum ClockDataType { /// Data analysis results for receiver clocks /// derived from a set of network receivers and satellites AR, @@ -96,7 +101,7 @@ pub enum DataType { MS, } -impl std::fmt::Display for DataType { +impl std::fmt::Display for ClockDataType { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { match self { Self::AR => f.write_str("AR"), @@ -109,7 +114,6 @@ impl std::fmt::Display for DataType { } /// Clocks RINEX record content. -/// Data is sorted by [Epoch], by [DataType] and by [System]. /* TODO /// Example of Clock record browsing: /// ``` @@ -128,12 +132,12 @@ impl std::fmt::Display for DataType { /// } /// ``` */ -pub type Record = BTreeMap>>; +pub type Record = BTreeMap>>; pub(crate) fn is_new_epoch(line: &str) -> bool { - // first 2 bytes match a DataType code + // first 2 bytes match a ClockDataType code let content = line.split_at(2).0; - DataType::from_str(content).is_ok() + ClockDataType::from_str(content).is_ok() } /// Builds `RINEX` record entry for `Clocks` data files. 
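The renaming from `Data`/`DataType` to `ClockData`/`ClockDataType` (and from `rate`/`accel` to `drift`/`drift_change`) is cosmetic. A minimal construction sketch, assuming the optional fields are `Option<f64>` and that both types are reachable through the public `clocks` module re-export shown above:

```rust
use std::str::FromStr;
use rinex::clocks::{ClockData, ClockDataType}; // assumed public module path

fn main() {
    // clock data type codes are parsed from the first two characters of a record line
    assert!(ClockDataType::from_str("AR").is_ok());

    // only the clock bias is mandatory; drift terms are optional
    let data = ClockData {
        bias: -0.123456789012E+00,
        bias_dev: Some(1.0E-9),
        ..Default::default()
    };
    assert!(data.drift.is_none());
    assert!(data.drift_change.is_none());
}
```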
@@ -142,12 +146,12 @@ pub(crate) fn is_new_epoch(line: &str) -> bool { pub(crate) fn parse_epoch( version: Version, content: &str, -) -> Result<(Epoch, DataType, System, Data), Error> { +) -> Result<(Epoch, ClockDataType, System, ClockData), Error> { let mut lines = content.lines(); let line = lines.next().unwrap(); // Data type code let (dtype, rem) = line.split_at(3); - let data_type = DataType::from_str(dtype.trim())?; // must pass + let data_type = ClockDataType::from_str(dtype.trim())?; // must pass let mut rem = rem.clone(); let limit = Version { major: 3, minor: 4 }; @@ -191,15 +195,15 @@ pub(crate) fn parse_epoch( // nb of data fields let (n, _) = rem.split_at(4); - let n = u8::from_str_radix(n.trim(), 10)?; + let n = n.trim().parse::()?; // data fields - let mut data = Data::default(); + let mut data = ClockData::default(); let items: Vec<&str> = line.split_ascii_whitespace().collect(); - data.bias = f64::from_str(items[9].trim())?; // bias must pass + data.bias = items[9].trim().parse::()?; // bias must pass if n > 1 { - if let Ok(f) = f64::from_str(items[10].trim()) { - data.bias_sigma = Some(f) + if let Ok(f) = items[10].trim().parse::() { + data.bias_dev = Some(f) } } @@ -207,51 +211,50 @@ pub(crate) fn parse_epoch( if let Some(l) = lines.next() { let line = l.clone(); let items: Vec<&str> = line.split_ascii_whitespace().collect(); - for i in 0..items.len() { - if let Ok(f) = f64::from_str(items[i].trim()) { + for (i, item) in items.iter().enumerate() { + if let Ok(f) = item.trim().parse::() { if i == 0 { - data.rate = Some(f); + data.drift = Some(f); } else if i == 1 { - data.rate_sigma = Some(f); + data.drift_dev = Some(f); } else if i == 2 { - data.accel = Some(f); + data.drift_change = Some(f); } else if i == 3 { - data.accel_sigma = Some(f); + data.drift_change_dev = Some(f); } } } } } - Ok((epoch, data_type, system, data)) } /// Writes epoch into stream pub(crate) fn fmt_epoch( epoch: &Epoch, - data: &HashMap>, + data: &HashMap>, ) -> Result { let mut lines = String::with_capacity(128); for (dtype, data) in data.iter() { for (system, data) in data.iter() { lines.push_str(&format!("{} {} {} ", dtype, system, epoch)); lines.push_str(&format!("{:.13E} ", data.bias)); - if let Some(sigma) = data.bias_sigma { + if let Some(sigma) = data.bias_dev { lines.push_str(&format!("{:.13E} ", sigma)); } - if let Some(rate) = data.rate { - lines.push_str(&format!("{:.13E} ", rate)); + if let Some(drift) = data.drift { + lines.push_str(&format!("{:.13E} ", drift)); } - if let Some(sigma) = data.rate_sigma { + if let Some(sigma) = data.drift_dev { lines.push_str(&format!("{:.13E} ", sigma)); } - if let Some(accel) = data.accel { - lines.push_str(&format!("{:.13E} ", accel)); + if let Some(drift_change) = data.drift_change { + lines.push_str(&format!("{:.13E} ", drift_change)); } - if let Some(sigma) = data.accel_sigma { + if let Some(sigma) = data.drift_change_dev { lines.push_str(&format!("{:.13E} ", sigma)); } - lines.push_str("\n"); + lines.push('\n'); } } Ok(lines) @@ -263,22 +266,22 @@ mod test { #[test] fn test_is_new_epoch() { let c = "AR AREQ 1994 07 14 20 59 0.000000 6 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), true); + assert!(is_new_epoch(c)); let c = "RA AREQ 1994 07 14 20 59 0.000000 6 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), false); + assert!(!is_new_epoch(c)); let c = "DR AREQ 1994 07 14 20 59 0.000000 6 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), true); + assert!(is_new_epoch(c)); let c 
= "CR AREQ 1994 07 14 20 59 0.000000 6 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), true); + assert!(is_new_epoch(c)); let c = "AS AREQ 1994 07 14 20 59 0.000000 6 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), true); + assert!(is_new_epoch(c)); let c = "CR USNO 1995 07 14 20 59 50.000000 2 0.123456789012E+00 -0.123456789012E-01"; - assert_eq!(is_new_epoch(c), true); + assert!(is_new_epoch(c)); let c = "AS G16 1994 07 14 20 59 0.000000 2 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), true); + assert!(is_new_epoch(c)); let c = "A G16 1994 07 14 20 59 0.000000 2 -0.123456789012E+00 -0.123456789012E+01"; - assert_eq!(is_new_epoch(c), false); + assert!(!is_new_epoch(c)); } } @@ -298,29 +301,29 @@ impl Merge for Record { for (system, data) in systems.iter() { if let Some(ddata) = ssystems.get_mut(system) { // provide only previously omitted fields - if let Some(data) = data.bias_sigma { - if ddata.bias_sigma.is_none() { - ddata.bias_sigma = Some(data); + if let Some(data) = data.bias_dev { + if ddata.bias_dev.is_none() { + ddata.bias_dev = Some(data); } } - if let Some(data) = data.rate { - if ddata.rate.is_none() { - ddata.rate = Some(data); + if let Some(data) = data.drift { + if ddata.drift.is_none() { + ddata.drift = Some(data); } } - if let Some(data) = data.rate_sigma { - if ddata.rate_sigma.is_none() { - ddata.rate_sigma = Some(data); + if let Some(data) = data.drift_dev { + if ddata.drift_dev.is_none() { + ddata.drift_dev = Some(data); } } - if let Some(data) = data.accel { - if ddata.accel.is_none() { - ddata.accel = Some(data); + if let Some(data) = data.drift_change { + if ddata.drift_change.is_none() { + ddata.drift_change = Some(data); } } - if let Some(data) = data.accel_sigma { - if ddata.accel_sigma.is_none() { - ddata.accel_sigma = Some(data); + if let Some(data) = data.drift_change_dev { + if ddata.drift_change_dev.is_none() { + ddata.drift_change_dev = Some(data); } } } else { @@ -348,7 +351,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k <= &epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -358,7 +361,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k > &epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -395,9 +398,9 @@ impl Mask for Record { true // retain other system types } }); - systems.len() > 0 + !systems.is_empty() }); - dtypes.len() > 0 + !dtypes.is_empty() }); }, _ => {}, // TargetItem:: diff --git a/rinex/src/constellation/mod.rs b/rinex/src/constellation/mod.rs index 2ce18cfb0..37478222b 100644 --- a/rinex/src/constellation/mod.rs +++ b/rinex/src/constellation/mod.rs @@ -2,28 +2,22 @@ use hifitime::TimeScale; use thiserror::Error; -mod augmentation; -pub use augmentation::Augmentation; +//#[cfg(feature = "serde")] +//use serde::{Deserialize, Serialize}; -#[cfg(feature = "serde")] -use serde::{Deserialize, Serialize}; +mod sbas; #[cfg(feature = "sbas")] -pub use augmentation::sbas_selection_helper; +pub use sbas::sbas_selection_helper; /// Constellation parsing & identification related errors #[derive(Error, Clone, Debug, PartialEq)] pub enum ParsingError { #[error("unknown constellation \"{0}\"")] Unknown(String), - #[error("unrecognized constellation \"{0}\"")] - Unrecognized(String), - #[error("unknown constellation format \"{0}\"")] - Format(String), } /// Describes all known `GNSS` constellations -/// when manipulating `RINEX` #[derive(Default, Clone, Copy, Debug, PartialEq, Eq, PartialOrd, 
Ord, Hash)] #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] pub enum Constellation { @@ -38,14 +32,37 @@ pub enum Constellation { QZSS, /// `Galileo` european constellation Galileo, - /// `Geo` : stationnary satellite, - /// also serves as SBAS with unknown augmentation system - Geo, - /// `SBAS` - SBAS(Augmentation), - /// `IRNSS` constellation, - /// now officially renamed "NavIC" + /// `IRNSS` constellation, renamed "NavIC" IRNSS, + /// American augmentation system, + WAAS, + /// European augmentation system + EGNOS, + /// Japanese MTSAT Space Based augmentation system + MSAS, + /// Indian augmentation system + GAGAN, + /// Chinese augmentation system + BDSBAS, + /// South Korean augmentation system + KASS, + /// Russian augmentation system + SDCM, + /// South African augmentation system + ASBAS, + /// Autralia / NZ augmentation system + SPAN, + /// SBAS is used to describe SBAS (augmentation) + /// vehicles without much more information + SBAS, + /// Australia-NZ Geoscience system + AusNZ, + /// Group Based SBAS + GBAS, + /// Nigerian SBAS + NSAS, + /// Algerian SBAS + ASAL, /// `Mixed` for Mixed constellations /// RINEX files description Mixed, @@ -53,168 +70,156 @@ pub enum Constellation { impl std::fmt::Display for Constellation { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { - f.write_str(self.to_3_letter_code()) + write!(f, "{:X}", self) } } impl Constellation { - /* - * Identifies GNSS constellation from standard 1 letter code - * but can insensitive. - * Mostly used in Self::from_str (public method) - */ - pub(crate) fn from_1_letter_code(code: &str) -> Result { - if code.len() != 1 { - return Err(ParsingError::Format(code.to_string())); - } - - let lower = code.to_lowercase(); - if lower.eq("g") { - Ok(Self::GPS) - } else if lower.eq("r") { - Ok(Self::Glonass) - } else if lower.eq("c") { - Ok(Self::BeiDou) - } else if lower.eq("e") { - Ok(Self::Galileo) - } else if lower.eq("j") { - Ok(Self::QZSS) - } else if lower.eq("s") { - Ok(Self::Geo) - } else if lower.eq("i") { - Ok(Self::IRNSS) - } else if lower.eq("m") { - Ok(Self::Mixed) - } else { - Err(ParsingError::Unknown(code.to_string())) + /// Returns true if Self is an augmentation system + pub fn is_sbas(&self) -> bool { + match *self { + Constellation::WAAS + | Constellation::KASS + | Constellation::BDSBAS + | Constellation::EGNOS + | Constellation::GAGAN + | Constellation::SDCM + | Constellation::ASBAS + | Constellation::SPAN + | Constellation::MSAS + | Constellation::NSAS + | Constellation::ASAL + | Constellation::AusNZ + | Constellation::SBAS => true, + _ => false, } } - /* - * Identifies Constellation from stanadrd 3 letter code, case insensitive. - * Used in public Self::from_str, or some place else in that crate. - */ - pub(crate) fn from_3_letter_code(code: &str) -> Result { - if code.len() != 3 { - return Err(ParsingError::Format(code.to_string())); + pub(crate) fn is_mixed(&self) -> bool { + *self == Constellation::Mixed + } + /// Returns associated time scale. Returns None + /// if related time scale is not supported. 
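With the `Augmentation` enum gone, every augmentation system is now a plain `Constellation` variant and `is_sbas()` is how they are grouped. A small sketch, using the `rinex::prelude::Constellation` re-export that `sbas.rs` relies on below:

```rust
use rinex::prelude::Constellation; // prelude re-export, as used by sbas.rs

fn main() {
    assert!(Constellation::WAAS.is_sbas());
    assert!(Constellation::EGNOS.is_sbas());
    assert!(Constellation::SBAS.is_sbas()); // generic, unspecified SBAS vehicle
    assert!(!Constellation::GPS.is_sbas());
    assert!(!Constellation::Mixed.is_sbas());
}
```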
+ pub fn timescale(&self) -> Option { + match self { + Self::GPS | Self::QZSS => Some(TimeScale::GPST), + Self::Galileo => Some(TimeScale::GST), + Self::BeiDou => Some(TimeScale::BDT), + Self::Glonass => Some(TimeScale::UTC), + c => { + if c.is_sbas() { + Some(TimeScale::GPST) + } else { + None + } + }, } + } +} - let lower = code.to_lowercase(); - if lower.eq("gps") { +impl std::str::FromStr for Constellation { + type Err = ParsingError; + fn from_str(string: &str) -> Result { + let s = string.trim().to_lowercase(); + if s.eq("g") || s.contains("gps") { Ok(Self::GPS) - } else if lower.eq("glo") { + } else if s.eq("r") || s.contains("glo") || s.contains("glonass") { Ok(Self::Glonass) - } else if lower.eq("bds") { + } else if s.eq("bdsbas") { + Ok(Self::BDSBAS) + } else if s.eq("c") || s.contains("bds") || s.contains("beidou") { Ok(Self::BeiDou) - } else if lower.eq("gal") { + } else if s.eq("e") || s.contains("gal") || s.contains("galileo") { Ok(Self::Galileo) - } else if lower.eq("qzs") { + } else if s.eq("j") || s.contains("qzss") { Ok(Self::QZSS) - } else if lower.eq("sbs") | lower.eq("geo") { - Ok(Self::Geo) - } else if lower.eq("irn") { + } else if s.eq("i") || s.contains("irnss") || s.contains("navic") { Ok(Self::IRNSS) - } else { - Err(ParsingError::Unknown(code.to_string())) - } - } - /* - * Identifies `gnss` constellation from given standard plain name, - * like "GPS", or "Galileo". This method is not case sensitive. - * Used in public Self::from_str, or some place else in that crate. - */ - pub(crate) fn from_plain_name(code: &str) -> Result { - let lower = code.to_lowercase(); - if lower.contains("gps") { - Ok(Self::GPS) - } else if lower.contains("glonass") { - Ok(Self::Glonass) - } else if lower.contains("galileo") { - Ok(Self::Galileo) - } else if lower.contains("qzss") { - Ok(Self::QZSS) - } else if lower.contains("beidou") { - Ok(Self::BeiDou) - } else if lower.contains("sbas") { - Ok(Self::Geo) - } else if lower.contains("geo") { - Ok(Self::Geo) - } else if lower.contains("irnss") { - Ok(Self::IRNSS) - } else if lower.contains("mixed") { + } else if s.eq("m") || s.contains("mixed") { Ok(Self::Mixed) + } else if s.eq("ausnz") { + Ok(Self::AusNZ) + } else if s.eq("egnos") { + Ok(Self::EGNOS) + } else if s.eq("waas") { + Ok(Self::WAAS) + } else if s.eq("kass") { + Ok(Self::KASS) + } else if s.eq("gagan") { + Ok(Self::GAGAN) + } else if s.eq("asbas") { + Ok(Self::ASBAS) + } else if s.eq("nsas") { + Ok(Self::NSAS) + } else if s.eq("asal") { + Ok(Self::ASAL) + } else if s.eq("msas") { + Ok(Self::MSAS) + } else if s.eq("span") { + Ok(Self::SPAN) + } else if s.eq("gbas") { + Ok(Self::GBAS) + } else if s.eq("sdcm") { + Ok(Self::SDCM) + } else if s.eq("s") || s.contains("geo") || s.contains("sbas") { + Ok(Self::SBAS) } else { - Err(ParsingError::Unrecognized(code.to_string())) - } - } - /// Converts self into time scale - pub fn to_timescale(&self) -> Option { - match self { - Self::GPS | Self::QZSS => Some(TimeScale::GPST), - Self::Galileo => Some(TimeScale::GST), - Self::BeiDou => Some(TimeScale::BDT), - Self::Geo | Self::SBAS(_) => Some(TimeScale::GPST), - // this is wrong but we can't do better - Self::Glonass | Self::IRNSS => Some(TimeScale::UTC), - _ => None, - } - } - /// Converts self to 1 letter code (RINEX standard code) - pub(crate) fn to_1_letter_code(&self) -> &str { - match self { - Self::GPS => "G", - Self::Glonass => "R", - Self::Galileo => "E", - Self::BeiDou => "C", - Self::SBAS(_) | Self::Geo => "S", - Self::QZSS => "J", - Self::IRNSS => "I", - Self::Mixed => "M", 
- } - } - /* Converts self to 3 letter code (RINEX standard code) */ - pub(crate) fn to_3_letter_code(&self) -> &str { - match self { - Self::GPS => "GPS", - Self::Glonass => "GLO", - Self::Galileo => "GAL", - Self::BeiDou => "BDS", - Self::SBAS(_) | Self::Geo => "GEO", - Self::QZSS => "QZS", - Self::IRNSS => "IRN", - Self::Mixed => "MIX", + Err(ParsingError::Unknown(string.to_string())) } } +} - /// Returns associated time scale. Returns None - /// if related time scale is not supported. - pub fn timescale(&self) -> Option { +impl std::fmt::LowerHex for Constellation { + /* + * {:x}: formats Self as single letter standard code + */ + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { match self { - Self::GPS | Self::QZSS => Some(TimeScale::GPST), - Self::Galileo => Some(TimeScale::GST), - Self::BeiDou => Some(TimeScale::BDT), - Self::Geo | Self::SBAS(_) => Some(TimeScale::GPST), // this is correct ? - _ => None, + Self::GPS => write!(f, "G"), + Self::Glonass => write!(f, "R"), + Self::Galileo => write!(f, "E"), + Self::BeiDou => write!(f, "C"), + Self::QZSS => write!(f, "J"), + Self::IRNSS => write!(f, "I"), + c => { + if c.is_sbas() { + write!(f, "S") + } else if c.is_mixed() { + write!(f, "M") + } else { + Err(std::fmt::Error) + } + }, } } } -impl std::str::FromStr for Constellation { - type Err = ParsingError; - /// Identifies `gnss` constellation from given code. - /// Code should be standard constellation name, - /// or official 1/3 letter RINEX code. - /// This method is case insensitive - fn from_str(code: &str) -> Result { - if code.len() == 3 { - Ok(Self::from_3_letter_code(code)?) - } else if code.len() == 1 { - Ok(Self::from_1_letter_code(code)?) - } else if let Ok(s) = Self::from_plain_name(code) { - Ok(s) - } else if let Ok(sbas) = Augmentation::from_str(code) { - Ok(Self::SBAS(sbas)) - } else { - Err(ParsingError::Unknown(code.to_string())) +impl std::fmt::UpperHex for Constellation { + /* + * {:X} formats Self as 3 letter standard code + */ + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + match self { + Self::GPS => write!(f, "GPS"), + Self::Glonass => write!(f, "GLO"), + Self::Galileo => write!(f, "GAL"), + Self::BeiDou => write!(f, "BDS"), + Self::QZSS => write!(f, "QZSS"), + Self::IRNSS => write!(f, "IRNSS"), + Self::WAAS => write!(f, "WAAS"), + Self::EGNOS => write!(f, "EGNOS"), + Self::BDSBAS => write!(f, "BDSBAS"), + Self::AusNZ => write!(f, "AUSNZ"), + Self::MSAS => write!(f, "MSAS"), + Self::NSAS => write!(f, "NSAS"), + Self::GBAS => write!(f, "GBAS"), + Self::SPAN => write!(f, "SPAN"), + Self::GAGAN => write!(f, "GAGAN"), + Self::KASS => write!(f, "KASS"), + Self::ASBAS => write!(f, "ASBAS"), + Self::ASAL => write!(f, "ASAL"), + Self::SDCM => write!(f, "SDCM"), + Self::Mixed => write!(f, "MIXED"), + Self::SBAS => write!(f, "SBAS"), } } } @@ -225,50 +230,48 @@ mod tests { use hifitime::TimeScale; use std::str::FromStr; #[test] - fn from_1_letter_code() { - let c = Constellation::from_1_letter_code("G"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Constellation::GPS); - - let c = Constellation::from_1_letter_code("R"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Constellation::Glonass); - - let c = Constellation::from_1_letter_code("M"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Constellation::Mixed); - - let c = Constellation::from_1_letter_code("J"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Constellation::QZSS); + fn from_str() { + for (desc, expected) in vec![ + ("G", 
Ok(Constellation::GPS)), + ("GPS", Ok(Constellation::GPS)), + ("R", Ok(Constellation::Glonass)), + ("GLO", Ok(Constellation::Glonass)), + ("J", Ok(Constellation::QZSS)), + ("M", Ok(Constellation::Mixed)), + ("WAAS", Ok(Constellation::WAAS)), + ("KASS", Ok(Constellation::KASS)), + ("GBAS", Ok(Constellation::GBAS)), + ("NSAS", Ok(Constellation::NSAS)), + ("SPAN", Ok(Constellation::SPAN)), + ("EGNOS", Ok(Constellation::EGNOS)), + ("ASBAS", Ok(Constellation::ASBAS)), + ("MSAS", Ok(Constellation::MSAS)), + ("GAGAN", Ok(Constellation::GAGAN)), + ("BDSBAS", Ok(Constellation::BDSBAS)), + ("ASAL", Ok(Constellation::ASAL)), + ("SDCM", Ok(Constellation::SDCM)), + ] { + assert_eq!( + Constellation::from_str(desc), + expected, + "failed to parse constellation from \"{}\"", + desc + ); + } - let c = Constellation::from_1_letter_code("X"); - assert_eq!(c.is_err(), true); - } - #[test] - fn from_3_letter_code() { - let c = Constellation::from_3_letter_code("GPS"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Constellation::GPS); - let c = Constellation::from_3_letter_code("GLO"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Constellation::Glonass); - let c = Constellation::from_3_letter_code("GPX"); - assert_eq!(c.is_err(), true); - let c = Constellation::from_3_letter_code("X"); - assert_eq!(c.is_err(), true); + for desc in ["X", "x", "GPX", "gpx", "unknown", "blah"] { + assert!(Constellation::from_str(desc).is_err()); + } } #[test] - fn augmentation() { - let c = Augmentation::from_str("WAAS"); - assert_eq!(c.is_ok(), true); - assert_eq!(c.unwrap(), Augmentation::WAAS); - let c = Augmentation::from_str("WASS"); - assert_eq!(c.is_err(), true); + fn test_sbas() { + for sbas in ["WAAS", "KASS", "EGNOS", "ASBAS", "MSAS", "GAGAN", "ASAL"] { + assert!(Constellation::from_str(sbas).unwrap().is_sbas()); + } } #[test] fn timescale() { - for (gnss, expected) in vec![ + for (gnss, expected) in [ (Constellation::GPS, TimeScale::GPST), (Constellation::Galileo, TimeScale::GST), (Constellation::BeiDou, TimeScale::BDT), diff --git a/rinex/src/constellation/augmentation.rs b/rinex/src/constellation/sbas.rs similarity index 63% rename from rinex/src/constellation/augmentation.rs rename to rinex/src/constellation/sbas.rs index b8edfa3c6..c9c9ce24a 100644 --- a/rinex/src/constellation/augmentation.rs +++ b/rinex/src/constellation/sbas.rs @@ -1,37 +1,8 @@ -//! `GNSS` geostationary augmentation systems, -//! mainly used for high precision positioning -use strum_macros::EnumString; - -#[cfg(feature = "serde")] -use serde::{Deserialize, Serialize}; - -#[derive(Default, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Debug, EnumString)] -#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] -/// GNSS Augmentation systems, -/// must be used based on current location -pub enum Augmentation { - /// Augmentation Unknown - #[default] - Unknown, - /// American augmentation system, - WAAS, - /// European augmentation system - EGNOS, - /// Japanese augmentation system - MSAS, - /// Indian augmentation system - GAGAN, - /// Chinese augmentation system - BDSBAS, - /// South Korean augmentation system - KASS, - /// Russian augmentation system - SDCM, - /// South African augmentation system - ASBAS, - /// Autralia / NZ augmentation system - SPAN, -} +//! 
Geostationary augmentation systems +use crate::prelude::Constellation; + +//#[cfg(feature = "serde")] +//use serde::{Deserialize, Serialize}; #[cfg(feature = "sbas")] use geo::{point, Contains, LineString}; @@ -68,8 +39,8 @@ where } #[cfg(feature = "sbas")] -fn load_database() -> Vec<(Augmentation, geo::Polygon)> { - let mut db: Vec<(Augmentation, geo::Polygon)> = Vec::new(); +fn load_database() -> Vec<(Constellation, geo::Polygon)> { + let mut db: Vec<(Constellation, geo::Polygon)> = Vec::new(); let db_path = env!("CARGO_MANIFEST_DIR").to_owned() + "/db/SBAS/"; let db_path = std::path::PathBuf::from(db_path); for entry in std::fs::read_dir(db_path).unwrap() { @@ -83,7 +54,7 @@ fn load_database() -> Vec<(Augmentation, geo::Polygon)> { line_string(fullpath), // exterior boundaries vec![], ); // dont care about interior - if let Ok(sbas) = Augmentation::from_str(&name.to_uppercase()) { + if let Ok(sbas) = Constellation::from_str(&name.to_uppercase()) { db.push((sbas, poly)) } } @@ -101,18 +72,18 @@ fn load_database() -> Vec<(Augmentation, geo::Polygon)> { /// /// let paris = (48.808378, 2.382682); // lat, lon [ddeg] /// let sbas = sbas_selection_helper(paris.0, paris.1); -/// assert_eq!(sbas, Some(Augmentation::EGNOS)); +/// assert_eq!(sbas, Some(Constellation::EGNOS)); /// /// let antartica = (-77.490631, 91.435181); // lat, lon [ddeg] /// let sbas = sbas_selection_helper(antartica.0, antartica.1); /// assert_eq!(sbas.is_none(), true); ///``` -pub fn sbas_selection_helper(lat: f64, lon: f64) -> Option { +pub fn sbas_selection_helper(lat: f64, lon: f64) -> Option { let db = load_database(); let point: geo::Point = point!(x: lon, y: lat,); for (sbas, area) in db { if area.contains(&point) { - return Some(sbas.clone()); + return Some(sbas); } } None @@ -127,63 +98,63 @@ mod test { fn sbas_helper() { // PARIS --> EGNOS let sbas = sbas_selection_helper(48.808378, 2.382682); - assert_eq!(sbas.is_some(), true); - assert_eq!(sbas.unwrap(), Augmentation::EGNOS); + assert!(sbas.is_some()); + assert_eq!(sbas.unwrap(), Constellation::EGNOS); // ANTARICA --> NONE let sbas = sbas_selection_helper(-77.490631, 91.435181); - assert_eq!(sbas.is_none(), true); + assert!(sbas.is_none()); // LOS ANGELES --> WAAS let sbas = sbas_selection_helper(33.981431, -118.193601); - assert_eq!(sbas.is_some(), true); - assert_eq!(sbas.unwrap(), Augmentation::WAAS); + assert!(sbas.is_some()); + assert_eq!(sbas.unwrap(), Constellation::WAAS); // ARGENTINA --> NONE let sbas = sbas_selection_helper(-23.216639, -63.170983); - assert_eq!(sbas.is_none(), true); + assert!(sbas.is_none()); // NIGER --> ASBAS let sbas = sbas_selection_helper(10.714217, 17.087263); - assert_eq!(sbas.is_some(), true); - assert_eq!(sbas.unwrap(), Augmentation::ASBAS); + assert!(sbas.is_some()); + assert_eq!(sbas.unwrap(), Constellation::ASBAS); // South AFRICA --> None let sbas = sbas_selection_helper(-32.473320, 21.112770); - assert_eq!(sbas.is_none(), true); + assert!(sbas.is_none()); // India --> GAGAN let sbas = sbas_selection_helper(19.314290, 76.798953); - assert_eq!(sbas.is_some(), true); - assert_eq!(sbas.unwrap(), Augmentation::GAGAN); + assert!(sbas.is_some()); + assert_eq!(sbas.unwrap(), Constellation::GAGAN); // South Indian Ocean --> None let sbas = sbas_selection_helper(-29.349172, 72.773447); - assert_eq!(sbas.is_none(), true); + assert!(sbas.is_none()); // Australia --> SPAN let sbas = sbas_selection_helper(-27.579847, 131.334992); - assert_eq!(sbas.is_some(), true); - assert_eq!(sbas.unwrap(), Augmentation::SPAN); + 
assert!(sbas.is_some()); + assert_eq!(sbas.unwrap(), Constellation::SPAN); // NZ --> SPAN let sbas = sbas_selection_helper(-45.113525, 169.864842); - assert_eq!(sbas.is_some(), true); - assert_eq!(sbas.unwrap(), Augmentation::SPAN); + assert!(sbas.is_some()); + assert_eq!(sbas.unwrap(), Constellation::SPAN); // Central China: BDSBAS let sbas = sbas_selection_helper(34.462967, 98.172480); - assert_eq!(sbas, Some(Augmentation::BDSBAS)); + assert_eq!(sbas, Some(Constellation::BDSBAS)); // South Korea: KASS let sbas = sbas_selection_helper(37.067846, 128.34); - assert_eq!(sbas, Some(Augmentation::KASS)); + assert_eq!(sbas, Some(Constellation::KASS)); // Japan: MSAS let sbas = sbas_selection_helper(36.081095, 138.274859); - assert_eq!(sbas, Some(Augmentation::MSAS)); + assert_eq!(sbas, Some(Constellation::MSAS)); // Russia: SDCM let sbas = sbas_selection_helper(60.004390, 89.090326); - assert_eq!(sbas, Some(Augmentation::SDCM)); + assert_eq!(sbas, Some(Constellation::SDCM)); } } diff --git a/rinex/src/epoch/mod.rs b/rinex/src/epoch/mod.rs index d88588cfa..d045a052d 100644 --- a/rinex/src/epoch/mod.rs +++ b/rinex/src/epoch/mod.rs @@ -36,7 +36,7 @@ pub enum ParsingError { * Infaillible `Epoch::now()` call. */ pub(crate) fn now() -> Epoch { - Epoch::now().unwrap_or(Epoch::from_gregorian_utc_at_midnight(2000, 01, 01)) + Epoch::now().unwrap_or(Epoch::from_gregorian_utc_at_midnight(2000, 1, 1)) } /* @@ -141,13 +141,13 @@ pub(crate) fn parse_in_timescale( let mut mm = 0_u8; let mut ss = 0_u8; let mut ns = 0_u32; - let mut epoch = Epoch::default(); let mut flag = EpochFlag::default(); for (field_index, item) in content.split_ascii_whitespace().enumerate() { match field_index { 0 => { - y = i32::from_str_radix(item, 10) + y = item + .parse::() .map_err(|_| ParsingError::YearField(item.to_string()))?; /* old RINEX problem: YY is sometimes encoded on two digits */ @@ -160,29 +160,37 @@ pub(crate) fn parse_in_timescale( } }, 1 => { - m = u8::from_str_radix(item, 10) + m = item + .parse::() .map_err(|_| ParsingError::MonthField(item.to_string()))?; }, 2 => { - d = u8::from_str_radix(item, 10) + d = item + .parse::() .map_err(|_| ParsingError::DayField(item.to_string()))?; }, 3 => { - hh = u8::from_str_radix(item, 10) + hh = item + .parse::() .map_err(|_| ParsingError::HoursField(item.to_string()))?; }, 4 => { - mm = u8::from_str_radix(item, 10) + mm = item + .parse::() .map_err(|_| ParsingError::MinutesField(item.to_string()))?; }, 5 => { - if let Some(dot) = item.find(".") { + if let Some(dot) = item.find('.') { let is_nav = item.trim().len() < 7; - ss = u8::from_str_radix(item[..dot].trim(), 10) + ss = item[..dot] + .trim() + .parse::() .map_err(|_| ParsingError::SecondsField(item.to_string()))?; - ns = u32::from_str_radix(item[dot + 1..].trim(), 10) + ns = item[dot + 1..] 
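Since `sbas_selection_helper()` now returns `Option<Constellation>`, callers match directly on constellation variants. A usage sketch based on the doc example and tests above; it requires the `sbas` compilation feature, and the import path is an assumption (the function is re-exported by the constellation module):

```rust
use rinex::prelude::Constellation;
use rinex::constellation::sbas_selection_helper; // assumed public path, "sbas" feature

fn main() {
    // Paris, France falls within the EGNOS service contour
    let paris = (48.808378, 2.382682); // lat, lon [ddeg]
    assert_eq!(sbas_selection_helper(paris.0, paris.1), Some(Constellation::EGNOS));

    // no augmentation system covers Antarctica
    let antarctica = (-77.490631, 91.435181);
    assert!(sbas_selection_helper(antarctica.0, antarctica.1).is_none());
}
```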
+ .trim() + .parse::() .map_err(|_| ParsingError::NanosecondsField(item.to_string()))?; if is_nav { @@ -193,7 +201,9 @@ pub(crate) fn parse_in_timescale( ns *= 100; } } else { - ss = u8::from_str_radix(item.trim(), 10) + ss = item + .trim() + .parse::() .map_err(|_| ParsingError::SecondsField(item.to_string()))?; } }, @@ -214,7 +224,9 @@ pub(crate) fn parse_in_timescale( if y == 0 { return Err(ParsingError::FormatError); } - epoch = Epoch::from_gregorian_utc(y, m, d, hh, mm, ss, ns); + + let epoch = Epoch::from_gregorian_utc(y, m, d, hh, mm, ss, ns); + Ok((epoch, flag)) }, _ => { // in case provided content is totally invalid, @@ -222,14 +234,20 @@ pub(crate) fn parse_in_timescale( if y == 0 { return Err(ParsingError::FormatError); } - epoch = Epoch::from_str(&format!( + let epoch = Epoch::from_str(&format!( "{:04}-{:02}-{:02}T{:02}:{:02}:{:02}.{:09} {}", - y, m, d, hh, mm, ss, ns, ts + y, + m, + d, + hh, + mm, + ss, + ns / 100_000_000, + ts ))?; + Ok((epoch, flag)) }, } - - Ok((epoch, flag)) } pub(crate) fn parse_utc(s: &str) -> Result<(Epoch, EpochFlag), ParsingError> { @@ -243,7 +261,7 @@ mod test { #[test] fn epoch_parse_nav_v2() { let e = parse_utc("20 12 31 23 45 0.0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2020); @@ -261,7 +279,7 @@ mod test { ); let e = parse_utc("21 1 1 16 15 0.0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -281,7 +299,7 @@ mod test { #[test] fn epoch_parse_nav_v2_nanos() { let e = parse_utc("20 12 31 23 45 0.1"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (_, _, _, _, _, ss, ns) = e.to_gregorian_utc(); assert_eq!(ss, 0); @@ -294,7 +312,7 @@ mod test { #[test] fn epoch_parse_nav_v3() { let e = parse_utc("2021 01 01 00 00 00 "); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -311,7 +329,7 @@ mod test { ); let e = parse_utc("2021 01 01 09 45 00 "); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -327,7 +345,7 @@ mod test { ); let e = parse_utc("2020 06 25 00 00 00"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2020); @@ -343,7 +361,7 @@ mod test { ); let e = parse_utc("2020 06 25 09 49 04"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2020); @@ -361,7 +379,7 @@ mod test { #[test] fn epoch_parse_obs_v2() { let e = parse_utc(" 21 12 21 0 0 0.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -379,7 +397,7 @@ mod test { ); let e = parse_utc(" 21 12 21 0 0 30.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -396,38 +414,38 @@ mod test { ); let e = parse_utc(" 21 12 21 0 0 30.0000000 1"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (_e, flag) = e.unwrap(); assert_eq!(flag, EpochFlag::PowerFailure); //assert_eq!(format!("{:o}", e), "21 12 21 0 0 
30.0000000 1"); let e = parse_utc(" 21 12 21 0 0 30.0000000 2"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (_e, flag) = e.unwrap(); assert_eq!(flag, EpochFlag::AntennaBeingMoved); let e = parse_utc(" 21 12 21 0 0 30.0000000 3"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (_e, flag) = e.unwrap(); assert_eq!(flag, EpochFlag::NewSiteOccupation); let e = parse_utc(" 21 12 21 0 0 30.0000000 4"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (_e, flag) = e.unwrap(); assert_eq!(flag, EpochFlag::HeaderInformationFollows); let e = parse_utc(" 21 12 21 0 0 30.0000000 5"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (_e, flag) = e.unwrap(); assert_eq!(flag, EpochFlag::ExternalEvent); let e = parse_utc(" 21 12 21 0 0 30.0000000 6"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (_e, flag) = e.unwrap(); assert_eq!(flag, EpochFlag::CycleSlip); let e = parse_utc(" 21 1 1 0 0 0.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -441,7 +459,7 @@ mod test { //assert_eq!(format!("{:o}", e), "21 1 1 0 0 0.0000000 0"); let e = parse_utc(" 21 1 1 0 7 30.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2021); @@ -457,7 +475,7 @@ mod test { #[test] fn epoch_parse_obs_v3() { let e = parse_utc(" 2022 01 09 00 00 0.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2022); @@ -471,7 +489,7 @@ mod test { //assert_eq!(format!("{}", e), "2022 01 09 00 00 0.0000000 0"); let e = parse_utc(" 2022 01 09 00 13 30.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2022); @@ -485,7 +503,7 @@ mod test { //assert_eq!(format!("{}", e), "2022 01 09 00 13 30.0000000 0"); let e = parse_utc(" 2022 03 04 00 52 30.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2022); @@ -499,7 +517,7 @@ mod test { //assert_eq!(format!("{}", e), "2022 03 04 00 52 30.0000000 0"); let e = parse_utc(" 2022 03 04 00 02 30.0000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, flag) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2022); @@ -515,7 +533,7 @@ mod test { #[test] fn epoch_parse_obs_v2_nanos() { let e = parse_utc(" 21 1 1 0 7 39.1234567 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (_, _, _, _, _, ss, ns) = e.to_gregorian_utc(); assert_eq!(ss, 39); @@ -524,7 +542,7 @@ mod test { #[test] fn epoch_parse_obs_v3_nanos() { let e = parse_utc("2022 01 09 00 00 0.1000000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (_, _, _, _, _, ss, ns) = e.to_gregorian_utc(); assert_eq!(ss, 0); @@ -532,7 +550,7 @@ mod test { //assert_eq!(format!("{}", e), "2022 01 09 00 00 0.1000000 0"); let e = parse_utc(" 2022 01 09 00 00 0.1234000 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (_, _, _, _, _, ss, ns) = e.to_gregorian_utc(); assert_eq!(ss, 0); @@ -540,7 +558,7 @@ mod test { //assert_eq!(format!("{}", e), "2022 01 09 00 00 0.1234000 0"); let e = parse_utc(" 2022 01 09 00 
00 8.7654321 0"); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (_, _, _, _, _, ss, ns) = e.to_gregorian_utc(); assert_eq!(ss, 8); @@ -550,7 +568,7 @@ mod test { #[test] fn epoch_parse_meteo_v2() { let e = parse_utc(" 22 1 4 0 0 0 "); - assert_eq!(e.is_ok(), true); + assert!(e.is_ok()); let (e, _) = e.unwrap(); let (y, m, d, hh, mm, ss, ns) = e.to_gregorian_utc(); assert_eq!(y, 2022); diff --git a/rinex/src/ground_position.rs b/rinex/src/ground_position.rs index 6e9d613e2..64dad7410 100644 --- a/rinex/src/ground_position.rs +++ b/rinex/src/ground_position.rs @@ -13,9 +13,9 @@ impl From<(f64, f64, f64)> for GroundPosition { } } -impl Into<(f64, f64, f64)> for GroundPosition { - fn into(self) -> (f64, f64, f64) { - (self.0, self.1, self.2) +impl From for (f64, f64, f64) { + fn from(val: GroundPosition) -> Self { + (val.0, val.1, val.2) } } diff --git a/rinex/src/hatanaka/compressor.rs b/rinex/src/hatanaka/compressor.rs index 95b9bd83d..55c87f0e6 100644 --- a/rinex/src/hatanaka/compressor.rs +++ b/rinex/src/hatanaka/compressor.rs @@ -1,6 +1,6 @@ //! RINEX compression module use super::{numdiff::NumDiff, textdiff::TextDiff, Error}; -use crate::is_comment; +use crate::is_rinex_comment; use crate::{Constellation, Observable, Sv}; use std::collections::HashMap; use std::str::FromStr; @@ -49,17 +49,16 @@ pub struct Compressor { fn format_epoch_descriptor(content: &str) -> String { let mut result = String::new(); - result.push_str("&"); + result.push('&'); for line in content.lines() { result.push_str(line.trim()) // removes all \tab } - result.push_str("\n"); + result.push('\n'); result } -impl Compressor { - /// Creates a new compression structure - pub fn new() -> Self { +impl Default for Compressor { + fn default() -> Self { Self { first_epoch: true, epoch_ptr: 0, @@ -75,7 +74,9 @@ impl Compressor { forced_init: HashMap::new(), } } +} +impl Compressor { /// Identifies amount of vehicles to be provided in next iterations /// by analyzing epoch descriptor fn determine_nb_vehicles(&self, content: &str) -> Result { @@ -83,7 +84,7 @@ impl Compressor { Err(Error::MalformedEpochDescriptor) } else { let nb = &content[30..32]; - if let Ok(u) = u16::from_str_radix(nb.trim(), 10) { + if let Ok(u) = nb.trim().parse::() { //println!("Identified {} vehicles", u); //DEBUG Ok(u.into()) } else { @@ -104,9 +105,9 @@ impl Compressor { if constell_id.is_ascii_digit() { // in old RINEX + mono constell context // it is possible that constellation ID is omitted.. 
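The `Into` to `From` rewrite above keeps `GroundPosition` convertible in both directions. A short sketch with sample values; the import path is an assumption and the triplet ordering follows the crate's own convention:

```rust
use rinex::prelude::GroundPosition; // assumed re-export

fn main() {
    // tuple -> GroundPosition
    let pos = GroundPosition::from((4696989.6880, 723994.1970, 4239678.3040));
    // GroundPosition -> tuple, now via From rather than a hand written Into
    let (x, y, z): (f64, f64, f64) = pos.into();
    println!("x={:.4} y={:.4} z={:.4}", x, y, z);
}
```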
- vehicle.insert_str(0, constellation.to_1_letter_code()); + vehicle.insert_str(0, &format!("{:x}", constellation)); } - let sv = Sv::from_str(&vehicle)?; + let sv = Sv::from_str(vehicle)?; //println!("VEHICULE: {}", sv); //DEBUG Ok(sv) } else { @@ -120,10 +121,10 @@ impl Compressor { //println!(">>> VEHICULE CONCLUDED"); //DEBUG // conclude line with lli/ssi flags let flags = self.flags_descriptor.trim_end(); - if flags.len() > 0 { + if !flags.is_empty() { result.push_str(flags); } - result.push_str("\n"); + result.push('\n'); self.flags_descriptor.clear(); // move to next vehicle self.obs_ptr = 0; @@ -177,7 +178,7 @@ impl Compressor { loop { let line: &str = match lines.next() { Some(l) => { - if l.trim().len() == 0 { + if l.trim().is_empty() { // line completely empty // ==> determine if we were expecting content if self.state == State::Body { @@ -185,14 +186,14 @@ impl Compressor { if self.obs_ptr > 0 { // previously active // identify current Sv - if let Ok(sv) = self.current_vehicle(&constellation) { + if let Ok(sv) = self.current_vehicle(constellation) { // nb of obs for this constellation let sv_nb_obs = observables[&sv.constellation].len(); let nb_missing = std::cmp::min(5, sv_nb_obs - self.obs_ptr); //println!("Early empty line - missing {} field(s)", nb_missing); //DEBUG for i in 0..nb_missing { - result.push_str(" "); // empty whitespace, on each missing observable - // to remain retro compatible with official tools + result.push(' '); // empty whitespace, on each missing observable + // to remain retro compatible with official tools self.flags_descriptor.push_str(" "); // both missing self.schedule_kernel_init(&sv, self.obs_ptr + i); } @@ -217,7 +218,7 @@ impl Compressor { // println!("\nWorking from LINE : \"{}\"", line); //DEBUG // [0] : COMMENTS (special case) - if is_comment!(line) { + if is_rinex_comment(line) { if line.contains("RINEX FILE SPLICE") { // [0*] SPLICE special comments // merged RINEX Files @@ -227,7 +228,7 @@ impl Compressor { result // feed content as is .push_str(line); result // \n dropped by .lines() - .push_str("\n"); + .push('\n'); continue; } @@ -251,7 +252,7 @@ impl Compressor { // if we did have clock offset, // append in a new line // otherwise append a BLANK - self.epoch_descriptor.push_str("\n"); + self.epoch_descriptor.push('\n'); let nb_lines = num_integer::div_ceil(self.nb_vehicles, 12) as u8; if self.epoch_ptr == nb_lines { @@ -267,19 +268,19 @@ impl Compressor { //missing clock offset field here //next line should not always be empty ///////////////////////////////////// - result.push_str("\n"); + result.push('\n'); self.first_epoch = false; } else { result.push_str( - &self.epoch_diff.compress(&self.epoch_descriptor).trim_end(), + self.epoch_diff.compress(&self.epoch_descriptor).trim_end(), ); - result.push_str("\n"); + result.push('\n'); ///////////////////////////////////// //TODO //missing clock offset field here //next line should not always be empty ///////////////////////////////////// - result.push_str("\n"); + result.push('\n'); } self.obs_ptr = 0; @@ -292,7 +293,7 @@ impl Compressor { // nb of obs in this line let nb_obs_line = num_integer::div_ceil(line.len(), 17); // identify current satellite using stored epoch description - if let Ok(sv) = self.current_vehicle(&constellation) { + if let Ok(sv) = self.current_vehicle(constellation) { // nb of obs for this constellation let sv_nb_obs = observables[&sv.constellation].len(); if self.obs_ptr + nb_obs_line > sv_nb_obs { @@ -302,9 +303,9 @@ impl Compressor { //println!("SV {} final 
fields were omitted", sv); //DEBUG for index in self.obs_ptr..sv_nb_obs + 1 { self.schedule_kernel_init(&sv, index); - result.push_str(" "); // put an empty space on missing observables - // this is how RNX2CRX (official) behaves, - // if we don't do this we break retro compatibility + result.push(' '); // put an empty space on missing observables + // this is how RNX2CRX (official) behaves, + // if we don't do this we break retro compatibility self.flags_descriptor.push_str(" "); } result = self.conclude_vehicle(&result); @@ -314,7 +315,7 @@ impl Compressor { self.nb_vehicles = self.determine_nb_vehicles(line)?; self.epoch_ptr = 1; // we already have a new descriptor self.epoch_descriptor.push_str(line); - self.epoch_descriptor.push_str("\n"); + self.epoch_descriptor.push('\n'); continue; // avoid end of this loop, // as this vehicle is now concluded } @@ -329,9 +330,9 @@ impl Compressor { let (data, rem) = observables.split_at(index); let (obsdata, flags) = data.split_at(14); observables = rem.clone(); - if let Ok(obsdata) = f64::from_str(obsdata.trim()) { + if let Ok(obsdata) = obsdata.trim().parse::() { let obsdata = f64::round(obsdata * 1000.0) as i64; - if flags.trim().len() == 0 { + if flags.trim().is_empty() { // Both Flags ommited //println!("OBS \"{}\" LLI \"X\" SSI \"X\"", obsdata); //DEBUG // data compression @@ -357,7 +358,7 @@ impl Compressor { break; } } - if indexes.len() == 0 { + if indexes.is_empty() { self.forced_init.remove(&sv); } } else { @@ -432,7 +433,7 @@ impl Compressor { break; } } - if indexes.len() == 0 { + if indexes.is_empty() { self.forced_init.remove(&sv); } } else { @@ -460,17 +461,17 @@ impl Compressor { diff.1.init(lli); diff.2.init(ssi); result.push_str(&format!("3&{} ", obsdata)); //append obs - if lli.len() > 0 { + if !lli.is_empty() { self.flags_descriptor.push_str(lli); } else { - self.flags_descriptor.push_str(" "); + self.flags_descriptor.push(' '); } - if ssi.len() > 0 { + if !ssi.is_empty() { self.flags_descriptor.push_str(ssi); } else { // SSI omitted - self.flags_descriptor.push_str(" "); + self.flags_descriptor.push(' '); } sv_diffs.insert(self.obs_ptr, diff); } @@ -486,12 +487,12 @@ impl Compressor { diff.1.init(lli); diff.2.init(ssi); self.flags_descriptor.push_str(lli); - if ssi.len() > 0 { + if !ssi.is_empty() { self.flags_descriptor.push_str(ssi); } else { // SSI omitted diff.2.init(" "); // BLANK - self.flags_descriptor.push_str(" "); + self.flags_descriptor.push(' '); } let mut map: HashMap = HashMap::new(); @@ -503,9 +504,9 @@ impl Compressor { //obsdata::f64::from_str() // when floating point parsing is in failure, // we know this observable is omitted - result.push_str(" "); // put an empty space on missing observables - // this is how RNX2CRX (official) behaves, - // if we don't do this we break retro compatibility + result.push(' '); // put an empty space on missing observables + // this is how RNX2CRX (official) behaves, + // if we don't do this we break retro compatibility self.flags_descriptor.push_str(" "); self.schedule_kernel_init(&sv, self.obs_ptr); } diff --git a/rinex/src/hatanaka/decompressor.rs b/rinex/src/hatanaka/decompressor.rs index a82945e39..aa3c2da4a 100644 --- a/rinex/src/hatanaka/decompressor.rs +++ b/rinex/src/hatanaka/decompressor.rs @@ -1,6 +1,6 @@ //! 
RINEX decompression module use super::{numdiff::NumDiff, textdiff::TextDiff, Error}; -use crate::{is_comment, prelude::*}; +use crate::{is_rinex_comment, prelude::*}; use std::collections::HashMap; use std::str::FromStr; @@ -55,7 +55,7 @@ fn format_epoch( } let (epoch, systems) = content.split_at(32); // grab epoch - result.push_str(&epoch.replace("&", " ")); // rework + result.push_str(&epoch.replace('&', " ")); // rework //CRINEX has systems squashed in a single line // we just split it to match standard definitions @@ -99,7 +99,7 @@ fn format_epoch( return Err(Error::FaultyRecoveredEpoch); } let (epoch, _) = content.split_at(35); - result.push_str(&epoch.replace("&", " ")); + result.push_str(&epoch.replace('&', " ")); //TODO clock offset if let Some(value) = clock_offset { result.push_str(&format!(" {:3.12}", (value as f64) / 1000.0_f64)) @@ -183,10 +183,10 @@ impl Decompressor { ) -> Option { let epoch = &self.epoch_descriptor; let offset: usize = match crx_major { - 1 => std::cmp::min((32 + 3 * (sv_ptr + 1)).into(), epoch.len()), // overflow protection - _ => std::cmp::min((41 + 3 * (sv_ptr + 1)).into(), epoch.len()), // overflow protection + 1 => std::cmp::min(32 + 3 * (sv_ptr + 1), epoch.len()), // overflow protection + _ => std::cmp::min(41 + 3 * (sv_ptr + 1), epoch.len()), // overflow protection }; - let system = epoch.split_at(offset.into()).0; + let system = epoch.split_at(offset).0; let (_, svnn) = system.split_at(system.len() - 3); // last 3 XXX let svnn = svnn.trim(); match crx_major > 2 { @@ -203,7 +203,7 @@ impl Decompressor { }, constellation => { // OLD + FIXED: constellation might be omitted....... - if let Ok(prn) = u8::from_str_radix(&svnn[1..].trim(), 10) { + if let Ok(prn) = u8::from_str_radix(svnn[1..].trim(), 10) { Some(Sv { prn, constellation: *constellation, @@ -250,7 +250,7 @@ impl Decompressor { //println!("state: {:?}", self.state); // [0] : COMMENTS (special case) - if is_comment!(line) { + if is_rinex_comment(line) { //if line.contains("RINEX FILE SPLICE") { // [0*] SPLICE special comments // merged RINEX Files @@ -258,8 +258,7 @@ impl Decompressor { //} result // feed content as is .push_str(line); - result // \n dropped by .lines() - .push_str("\n"); + result.push('\n'); continue; // move to next line } @@ -269,8 +268,7 @@ impl Decompressor { if line.starts_with("> ") && !self.first_epoch { result // feed content as is .push_str(line); - result // \n dropped by .lines() - .push_str("\n"); + result.push('\n'); continue; // move to next line } @@ -279,12 +277,12 @@ impl Decompressor { if self.first_epoch { match crx_major { 1 => { - if !line.starts_with("&") { + if !line.starts_with('&') { return Err(Error::FaultyCrx1FirstEpoch); } }, 3 => { - if !line.starts_with(">") { + if !line.starts_with('>') { return Err(Error::FaultyCrx3FirstEpoch); } }, @@ -312,7 +310,7 @@ impl Decompressor { * this line is dedicated to clock offset description */ let mut clock_offset: Option = None; - if line.contains("&") { + if line.contains('&') { // clock offset kernel (re)init let (n, rem) = line.split_at(1); if let Ok(order) = u8::from_str_radix(n, 10) { @@ -368,8 +366,7 @@ impl Decompressor { /* * identify satellite we're dealing with */ - if let Some(sv) = self.current_satellite(crx_major, &crx_constell, self.sv_ptr) - { + if let Some(sv) = self.current_satellite(crx_major, crx_constell, self.sv_ptr) { //println!("SV: {:?}", sv); //DEBUG self.sv_ptr += 1; // increment for next time // vehicles are always described in a single line @@ -384,7 +381,11 @@ impl Decompressor { 
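The `current_satellite()` logic shown above slices the recovered epoch descriptor to find the vehicle currently being decompressed. A standalone illustration of that slicing, not the crate API: `nth_vehicle` is a hypothetical helper, and the CRINEX V1 column offsets follow the match above.

```rust
// CRINEX V1: 32 epoch columns, then one 3-character "Xnn" code per vehicle.
// (V3 uses a 41-column offset, per the other branch of the match above.)
fn nth_vehicle(epoch_descriptor: &str, sv_ptr: usize) -> Option<&str> {
    let offset = std::cmp::min(32 + 3 * (sv_ptr + 1), epoch_descriptor.len());
    let (system, _) = epoch_descriptor.split_at(offset);
    if system.len() < 3 {
        return None;
    }
    let (_, svnn) = system.split_at(system.len() - 3);
    Some(svnn.trim())
}

fn main() {
    // hypothetical recovered descriptor: epoch fields padded to 32 columns,
    // followed by three vehicles
    let descriptor = format!("{:<32}G01G07R22", "21 12 21  0  0 30.0000000  0  3");
    assert_eq!(nth_vehicle(&descriptor, 0), Some("G01"));
    assert_eq!(nth_vehicle(&descriptor, 2), Some("R22"));
}
```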
let mut inner: Vec<(NumDiff, TextDiff, TextDiff)> = Vec::with_capacity(16); // this protects from malformed Headers or malformed Epoch descriptions - if let Some(codes) = observables.get(&sv.constellation) { + let codes = match sv.constellation.is_sbas() { + true => observables.get(&Constellation::SBAS), + false => observables.get(&sv.constellation), + }; + if let Some(codes) = codes { for _ in codes { let mut kernels = ( NumDiff::new(NumDiff::MAX_COMPRESSION_ORDER)?, @@ -402,12 +403,16 @@ impl Decompressor { * iterate over entire line */ let mut line = line.trim_end(); - if let Some(codes) = observables.get(&sv.constellation) { + let codes = match sv.constellation.is_sbas() { + true => observables.get(&Constellation::SBAS), + false => observables.get(&sv.constellation), + }; + if let Some(codes) = codes { while obs_ptr < codes.len() { if let Some(pos) = line.find(' ') { let content = &line[..pos]; //println!("OBS \"{}\" - CONTENT \"{}\"", codes[obs_ptr], content); //DEBUG - if content.len() == 0 { + if content.is_empty() { /* * missing observation */ @@ -417,7 +422,7 @@ impl Decompressor { * regular progression */ if let Some(sv_diff) = self.sv_diff.get_mut(&sv) { - if let Some(marker) = content.find("&") { + if let Some(marker) = content.find('&') { // kernel (re)initialization let (order, rem) = content.split_at(marker); let order = u8::from_str_radix(order.trim(), 10)?; @@ -453,7 +458,7 @@ impl Decompressor { */ //println!("OBS \"{}\" - CONTENT \"{}\"", codes[obs_ptr], line); //DEBUG if let Some(sv_diff) = self.sv_diff.get_mut(&sv) { - if let Some(marker) = line.find("&") { + if let Some(marker) = line.find('&') { // kernel (re)initliaization let (order, rem) = line.split_at(marker); let order = u8::from_str_radix(order.trim(), 10)?; @@ -482,7 +487,7 @@ impl Decompressor { /* * Flags field */ - if line.len() > 0 { + if !line.is_empty() { // can parse at least 1 flag self.parse_flags(&sv, line); } @@ -522,7 +527,7 @@ impl Decompressor { // old RINEX if (index + 1).rem_euclid(5) == 0 { // maximal nb of OBS per line - result.push_str("\n") + result.push('\n') } } } diff --git a/rinex/src/hatanaka/textdiff.rs b/rinex/src/hatanaka/textdiff.rs index a22710936..1c0f20b99 100644 --- a/rinex/src/hatanaka/textdiff.rs +++ b/rinex/src/hatanaka/textdiff.rs @@ -43,7 +43,7 @@ impl TextDiff { if s1_len > s0_len { // got new bytes to latch - let new_slice = &data[min..s1_len].replace("&", " "); + let new_slice = &data[min..s1_len].replace('&', " "); self.buffer.push_str(new_slice); } @@ -63,7 +63,7 @@ impl TextDiff { if c != &inner[i] { result.push_str(&c.to_string()); } else { - result.push_str(" "); + result.push(' '); } } } @@ -71,8 +71,8 @@ impl TextDiff { for i in inner.len()..data.len() { if let Some(c) = to_compress.get(i) { if c.is_ascii_whitespace() { - self.buffer.push_str("&"); - result.push_str("&"); + self.buffer.push('&'); + result.push('&'); } else { self.buffer.push_str(&c.to_string()); result.push_str(&c.to_string()); diff --git a/rinex/src/header.rs b/rinex/src/header.rs index d75046869..8e8238d10 100644 --- a/rinex/src/header.rs +++ b/rinex/src/header.rs @@ -3,6 +3,7 @@ use super::*; use crate::{ antex, clocks, + clocks::{ClockAnalysisAgency, ClockDataType}, ground_position::GroundPosition, hardware::{Antenna, Rcvr, SvAntenna}, ionex, leap, meteo, observation, @@ -19,6 +20,8 @@ use std::str::FromStr; use strum_macros::EnumString; use thiserror::Error; +use crate::{fmt_comment, fmt_rinex}; + #[cfg(feature = "serde")] use serde::{Deserialize, Serialize}; @@ -348,11 +351,11 @@ impl 
Header { )); } - let date: Vec<&str> = items[0].split("-").collect(); - let time: Vec<&str> = items[1].split(":").collect(); + let date: Vec<&str> = items[0].split('-').collect(); + let time: Vec<&str> = items[1].split(':').collect(); let day = date[0].trim(); - let day = u8::from_str_radix(day, 10).or(Err(ParsingError::DateTimeParsing( + let day = day.parse::().or(Err(ParsingError::DateTimeParsing( String::from("day"), day.to_string(), )))?; @@ -361,19 +364,19 @@ impl Header { let month = parse_formatted_month(month)?; let y = date[2].trim(); - let mut y = i32::from_str_radix(y, 10).or(Err(ParsingError::DateTimeParsing( + let mut y = y.parse::().or(Err(ParsingError::DateTimeParsing( String::from("year"), y.to_string(), )))?; let h = time[0].trim(); - let h = u8::from_str_radix(h, 10).or(Err(ParsingError::DateTimeParsing( + let h = h.parse::().or(Err(ParsingError::DateTimeParsing( String::from("hour"), h.to_string(), )))?; let m = time[1].trim(); - let m = u8::from_str_radix(m, 10).or(Err(ParsingError::DateTimeParsing( + let m = m.parse::().or(Err(ParsingError::DateTimeParsing( String::from("minute"), m.to_string(), )))?; @@ -405,13 +408,13 @@ impl Header { if let Ok(mut pcv) = antex::Pcv::from_str(pcv_str.trim()) { if pcv.is_relative() { // try to parse "Relative Type" - if rel_type.trim().len() > 0 { + if !rel_type.trim().is_empty() { pcv = pcv.with_relative_type(rel_type.trim()); } } antex = antex.with_pcv(pcv); } - if ref_sn.trim().len() > 0 { + if !ref_sn.trim().is_empty() { antex = antex.with_serial_number(ref_sn.trim()) } } else if marker.contains("TYPE / SERIAL NO") { @@ -481,17 +484,17 @@ impl Header { rinex_type = Type::from_str(type_str.trim())?; if type_str.contains("GLONASS") { // old GLONASS NAV : no constellation field - constellation = Some(Constellation::Glonass) + constellation = Some(Constellation::Glonass); } else if type_str.contains("GPS NAV DATA") { // old GPS NAV: no constellation field - constellation = Some(Constellation::GPS) + constellation = Some(Constellation::GPS); } else if type_str.contains("METEOROLOGICAL DATA") { // these files are not tied to a constellation system, // therefore, do not have this field } else { // regular files if let Ok(constell) = Constellation::from_str(constell_str.trim()) { - constellation = Some(constell) + constellation = Some(constell); } } /* @@ -499,7 +502,7 @@ impl Header { */ let vers = vers.trim(); version = Version::from_str(vers).or(Err(ParsingError::VersionParsing( - format!("RINEX VERSION / TYPE \"{}\"", vers.to_string()), + format!("RINEX VERSION / TYPE \"{}\"", vers), )))?; if !version.is_supported() { @@ -514,7 +517,7 @@ impl Header { false => rb.trim().to_string(), }; let (date_str, _) = rem.split_at(20); - date = date_str.trim().to_string() + date = date_str.trim().to_string(); } else if marker.contains("MARKER NAME") { station = content.split_at(20).0.trim().to_string() } else if marker.contains("MARKER NUMBER") { @@ -522,9 +525,7 @@ impl Header { } else if marker.contains("MARKER TYPE") { let code = content.split_at(20).0.trim(); if let Ok(marker) = MarkerType::from_str(code) { - marker_type = Some(marker) - } else { - return Err(ParsingError::MarkerType(code.to_string())); + marker_type = Some(marker); } } else if marker.contains("OBSERVER / AGENCY") { let (obs, ag) = content.split_at(20); @@ -532,7 +533,7 @@ impl Header { agency = ag.trim().to_string(); } else if marker.contains("REC # / TYPE / VERS") { if let Ok(receiver) = Rcvr::from_str(content) { - rcvr = Some(receiver) + rcvr = Some(receiver); } } else if 
marker.contains("SYS / PCVS APPLIED") { let (gnss, rem) = content.split_at(2); @@ -551,7 +552,7 @@ impl Header { program.to_string() } }, - constellation: gnss.clone(), + constellation: gnss, url: { let url = url.trim(); if url.eq("") { @@ -580,7 +581,7 @@ impl Header { program.to_string() } }, - constellation: gnss.clone(), + constellation: gnss, url: { let url = url.trim(); if url.eq("") { @@ -601,7 +602,8 @@ impl Header { let (factor, rem) = rem.split_at(6); let factor = factor.trim(); - let scaling = u16::from_str_radix(factor, 10) + let scaling = factor + .parse::() .or(Err(parse_int_error!("SYS / SCALE FACTOR", factor)))?; let (_num, rem) = rem.split_at(3); @@ -800,10 +802,30 @@ impl Header { // {}, + Some(c) => { + // in case of OLD RINEX : fixed constellation + // use that information, as it may be omitted in the TIME OF OBS header + time_of_first_obs.time_scale = c + .timescale() + .ok_or(ParsingError::TimescaleParsing(c.to_string()))?; + }, + } observation = observation.with_time_of_first_obs(time_of_first_obs); } else if marker.contains("TIME OF LAST OBS") { - let time_of_last_obs = Self::parse_time_of_obs(content)?; + let mut time_of_last_obs = Self::parse_time_of_obs(content)?; + match constellation { + Some(Constellation::Mixed) | None => {}, + Some(c) => { + // in case of OLD RINEX : fixed constellation + // use that information, as it may be omitted in the TIME OF OBS header + time_of_last_obs.time_scale = c + .timescale() + .ok_or(ParsingError::TimescaleParsing(c.to_string()))?; + }, + } observation = observation.with_time_of_last_obs(time_of_last_obs); } else if marker.contains("TYPES OF OBS") { // these observations can serve both Observation & Meteo RINEX @@ -814,17 +836,17 @@ impl Header { match constellation { Some(Constellation::Mixed) => { lazy_static! 
{ - static ref KNOWN_CONSTELLS: Vec = vec![ + static ref KNOWN_CONSTELLS: [Constellation; 6] = [ Constellation::GPS, Constellation::Glonass, Constellation::Galileo, Constellation::BeiDou, Constellation::QZSS, - Constellation::Geo, + Constellation::SBAS, ]; } for c in KNOWN_CONSTELLS.iter() { - if let Some(codes) = observation.codes.get_mut(&c) { + if let Some(codes) = observation.codes.get_mut(c) { codes.push(observable.clone()); } else { observation.codes.insert(*c, vec![observable.clone()]); @@ -849,10 +871,10 @@ impl Header { } } } else if marker.contains("SYS / # / OBS TYPES") { - let (possible_content, content) = content.split_at(6); - if possible_content.len() > 0 { - let code = &possible_content[..1]; - if let Ok(c) = Constellation::from_1_letter_code(code) { + let (possible_counter, content) = content.split_at(6); + if !possible_counter.is_empty() { + let code = &possible_counter[..1]; + if let Ok(c) = Constellation::from_str(code) { current_constell = Some(c); } } @@ -863,7 +885,7 @@ impl Header { let obscode = &content[i * 4..std::cmp::min((i + 1) * 4, content.len())].trim(); if let Ok(observable) = Observable::from_str(obscode) { - if obscode.len() > 0 { + if !obscode.is_empty() { if let Some(codes) = observation.codes.get_mut(&constell) { codes.push(observable); } else { @@ -875,20 +897,21 @@ impl Header { } } else if marker.contains("ANALYSIS CENTER") { let (code, agency) = content.split_at(3); - clocks = clocks.with_agency(clocks::Agency { + clocks = clocks.with_agency(ClockAnalysisAgency { code: code.trim().to_string(), name: agency.trim().to_string(), }); } else if marker.contains("# / TYPES OF DATA") { let (n, r) = content.split_at(6); let n = n.trim(); - let n = - u8::from_str_radix(n, 10).or(Err(parse_int_error!("# / TYPES OF DATA", n)))?; + let n = n + .parse::() + .or(Err(parse_int_error!("# / TYPES OF DATA", n)))?; let mut rem = r.clone(); for _ in 0..n { let (code, r) = rem.split_at(6); - if let Ok(c) = clocks::DataType::from_str(code.trim()) { + if let Ok(c) = ClockDataType::from_str(code.trim()) { clocks.codes.push(c); } rem = r.clone() @@ -914,18 +937,22 @@ impl Header { } } } else if marker.contains("GLONASS SLOT / FRQ #") { + //TODO + // This should be used when dealing with Glonass carriers + let slots = content.split_at(4).1.trim(); for i in 0..num_integer::div_ceil(slots.len(), 7) { let svnn = &slots[i * 7..i * 7 + 4]; let chx = &slots[i * 7 + 4..std::cmp::min(i * 7 + 4 + 3, slots.len())]; if let Ok(svnn) = Sv::from_str(svnn.trim()) { - if let Ok(chx) = i8::from_str_radix(chx.trim(), 10) { + if let Ok(chx) = chx.trim().parse::() { glo_channels.insert(svnn, chx); } } } } else if marker.contains("GLONASS COD/PHS/BIS") { //TODO + // This will help RTK solving against GLONASS SV } else if marker.contains("ION ALPHA") { //TODO //0.7451D-08 -0.1490D-07 -0.5960D-07 0.1192D-06 ION ALPHA @@ -975,19 +1002,19 @@ impl Header { } } else if marker.contains("# OF STATIONS") { // IONEX - if let Ok(u) = u32::from_str_radix(content.trim(), 10) { + if let Ok(u) = content.trim().parse::() { ionex = ionex.with_nb_stations(u) } } else if marker.contains("# OF SATELLITES") { // IONEX - if let Ok(u) = u32::from_str_radix(content.trim(), 10) { + if let Ok(u) = content.trim().parse::() { ionex = ionex.with_nb_satellites(u) } /* * Initial TEC map scaling */ } else if marker.contains("EXPONENT") { - if let Ok(e) = i8::from_str_radix(content.trim(), 10) { + if let Ok(e) = content.trim().parse::() { ionex = ionex.with_exponent(e); } @@ -1157,411 +1184,6 @@ impl Header { }) } - /// 
Combines self and rhs header into a new header. - /// Self's attribute are always preferred. - /// Behavior: - /// - self's attributes are always preferred (in case of unique attributes) - /// - observables are concatenated - /// This fails if : - /// - RINEX types do not match - /// - IONEX: map dimensions do not match and grid definitions do not strictly match - pub fn merge(&self, header: &Self) -> Result { - if self.rinex_type != header.rinex_type { - return Err(merge::Error::FileTypeMismatch); - } - if self.rinex_type == Type::IonosphereMaps { - if let Some(i0) = &self.ionex { - if let Some(i1) = &header.ionex { - if i0.map_dimension != i1.map_dimension { - panic!("can only merge ionex files with identical map dimensions") - } - } - } - } - Ok(Self { - version: { - // retains oldest rev - if self.version < header.version { - self.version.clone() - } else { - header.version.clone() - } - }, - rinex_type: self.rinex_type.clone(), - comments: { - self.comments.clone() //TODO: append rhs too! - }, - leap: { - if let Some(leap) = self.leap { - Some(leap.clone()) - } else if let Some(leap) = header.leap { - Some(leap.clone()) - } else { - None - } - }, - glo_channels: { - let mut channels = self.glo_channels.clone(); - for (svnn, channel) in &header.glo_channels { - channels.insert(*svnn, *channel); - } - channels - }, - run_by: self.run_by.clone(), - program: self.program.clone(), - observer: self.observer.clone(), - date: self.date.clone(), - station: self.station.clone(), - station_id: self.station_id.clone(), - station_url: self.station_url.clone(), - agency: self.agency.clone(), - license: self.license.clone(), - doi: self.doi.clone(), - dcb_compensations: { - /* - * DCBs compensations are marked, only if - * both compensated for in A & B. - * In this case, resulting data, for a given constellation, - * is still 100% compensated for. 
- */ - if self.dcb_compensations.len() == 0 || header.dcb_compensations.len() == 0 { - Vec::new() // drop everything - } else { - let rhs_constellations: Vec<_> = header - .dcb_compensations - .iter() - .map(|dcb| dcb.constellation.clone()) - .collect(); - let dcbs: Vec = self - .dcb_compensations - .clone() - .iter() - .filter(|dcb| rhs_constellations.contains(&dcb.constellation)) - .map(|dcb| dcb.clone()) - .collect(); - dcbs - } - }, - pcv_compensations: { - /* - * Same logic as .dcb_compensations - */ - if self.pcv_compensations.len() == 0 || header.pcv_compensations.len() == 0 { - Vec::new() // drop everything - } else { - let rhs_constellations: Vec<_> = header - .pcv_compensations - .iter() - .map(|pcv| pcv.constellation.clone()) - .collect(); - let pcvs: Vec = self - .pcv_compensations - .clone() - .iter() - .filter(|pcv| rhs_constellations.contains(&pcv.constellation)) - .map(|pcv| pcv.clone()) - .collect(); - pcvs - } - }, - marker_type: { - if let Some(mtype) = &self.marker_type { - Some(mtype.clone()) - } else if let Some(mtype) = &header.marker_type { - Some(mtype.clone()) - } else { - None - } - }, - gps_utc_delta: { - if let Some(d) = self.gps_utc_delta { - Some(d) - } else if let Some(d) = header.gps_utc_delta { - Some(d) - } else { - None - } - }, - data_scaling: { - if let Some(d) = self.data_scaling { - Some(d) - } else if let Some(d) = header.data_scaling { - Some(d) - } else { - None - } - }, - constellation: { - if let Some(c0) = self.constellation { - if let Some(c1) = header.constellation { - if c0 != c1 { - Some(Constellation::Mixed) - } else { - Some(c0.clone()) - } - } else { - Some(c0.clone()) - } - } else if let Some(constellation) = header.constellation { - Some(constellation.clone()) - } else { - None - } - }, - rcvr: { - if let Some(rcvr) = &self.rcvr { - Some(rcvr.clone()) - } else if let Some(rcvr) = &header.rcvr { - Some(rcvr.clone()) - } else { - None - } - }, - rcvr_antenna: { - if let Some(a) = &self.rcvr_antenna { - Some(a.clone()) - } else if let Some(a) = &header.rcvr_antenna { - Some(a.clone()) - } else { - None - } - }, - sv_antenna: { - if let Some(a) = &self.sv_antenna { - Some(a.clone()) - } else if let Some(a) = &header.sv_antenna { - Some(a.clone()) - } else { - None - } - }, - wavelengths: { - if let Some(wv) = &self.wavelengths { - Some(wv.clone()) - } else if let Some(wv) = &header.wavelengths { - Some(wv.clone()) - } else { - None - } - }, - sampling_interval: { - if let Some(interval) = self.sampling_interval { - Some(interval.clone()) - } else if let Some(interval) = header.sampling_interval { - Some(interval.clone()) - } else { - None - } - }, - ground_position: { - if let Some(pos) = &self.ground_position { - Some(pos.clone()) - } else if let Some(pos) = &header.ground_position { - Some(pos.clone()) - } else { - None - } - }, - obs: { - if let Some(d0) = &self.obs { - if let Some(d1) = &header.obs { - Some(observation::HeaderFields { - time_of_first_obs: std::cmp::min( - d0.time_of_first_obs, - d1.time_of_first_obs, - ), - time_of_last_obs: std::cmp::max( - d0.time_of_last_obs, - d1.time_of_last_obs, - ), - crinex: d0.crinex.clone(), - codes: { - let mut map = d0.codes.clone(); - for (constellation, obscodes) in d1.codes.iter() { - if let Some(codes) = map.get_mut(&constellation) { - for obs in obscodes { - if !codes.contains(&obs) { - codes.push(obs.clone()); - } - } - } else { - map.insert(constellation.clone(), obscodes.clone()); - } - } - map - }, - clock_offset_applied: d0.clock_offset_applied - && d1.clock_offset_applied, - 
scalings: HashMap::new(), //TODO - }) - } else { - Some(d0.clone()) - } - } else if let Some(data) = &header.obs { - Some(data.clone()) - } else { - None - } - }, - meteo: { - if let Some(m0) = &self.meteo { - if let Some(m1) = &header.meteo { - Some(meteo::HeaderFields { - sensors: { - let mut sensors = m0.sensors.clone(); - for sens in m1.sensors.iter() { - if !sensors.contains(&sens) { - sensors.push(sens.clone()) - } - } - sensors - }, - codes: { - let mut observables = m0.codes.clone(); - for obs in m1.codes.iter() { - if !observables.contains(&obs) { - observables.push(obs.clone()) - } - } - observables - }, - }) - } else { - Some(m0.clone()) - } - } else if let Some(meteo) = &header.meteo { - Some(meteo.clone()) - } else { - None - } - }, - clocks: { - if let Some(d0) = &self.clocks { - if let Some(d1) = &header.clocks { - Some(clocks::HeaderFields { - codes: { - let mut codes = d0.codes.clone(); - for code in d1.codes.iter() { - if !codes.contains(&code) { - codes.push(code.clone()) - } - } - codes - }, - agency: { - if let Some(agency) = &d0.agency { - Some(agency.clone()) - } else if let Some(agency) = &d1.agency { - Some(agency.clone()) - } else { - None - } - }, - station: { - if let Some(station) = &d0.station { - Some(station.clone()) - } else if let Some(station) = &d1.station { - Some(station.clone()) - } else { - None - } - }, - clock_ref: { - if let Some(clk) = &d0.clock_ref { - Some(clk.clone()) - } else if let Some(clk) = &d1.clock_ref { - Some(clk.clone()) - } else { - None - } - }, - timescale: { - if let Some(ts) = &d0.timescale { - Some(ts.clone()) - } else if let Some(ts) = &d1.timescale { - Some(ts.clone()) - } else { - None - } - }, - }) - } else { - Some(d0.clone()) - } - } else if let Some(d1) = &header.clocks { - Some(d1.clone()) - } else { - None - } - }, - antex: { - if let Some(d0) = &self.antex { - Some(d0.clone()) - } else if let Some(data) = &header.antex { - Some(data.clone()) - } else { - None - } - }, - ionex: { - if let Some(d0) = &self.ionex { - if let Some(d1) = &header.ionex { - Some(ionex::HeaderFields { - reference: d0.reference.clone(), - description: { - if let Some(description) = &d0.description { - Some(description.clone()) - } else if let Some(description) = &d1.description { - Some(description.clone()) - } else { - None - } - }, - exponent: std::cmp::min(d0.exponent, d1.exponent), // TODO: this is not correct, - mapping: { - if let Some(map) = &d0.mapping { - Some(map.clone()) - } else if let Some(map) = &d1.mapping { - Some(map.clone()) - } else { - None - } - }, - map_dimension: d0.map_dimension, - base_radius: d0.base_radius, - grid: d0.grid.clone(), - elevation_cutoff: d0.elevation_cutoff, - observables: { - if let Some(obs) = &d0.observables { - Some(obs.clone()) - } else if let Some(obs) = &d1.observables { - Some(obs.clone()) - } else { - None - } - }, - nb_stations: std::cmp::max(d0.nb_stations, d1.nb_stations), - nb_satellites: std::cmp::max(d0.nb_satellites, d1.nb_satellites), - dcbs: { - let mut dcbs = d0.dcbs.clone(); - for (b, dcb) in &d1.dcbs { - dcbs.insert(b.clone(), *dcb); - } - dcbs - }, - }) - } else { - Some(d0.clone()) - } - } else if let Some(d1) = &header.ionex { - Some(d1.clone()) - } else { - None - } - }, - }) - } - /// Returns true if self is a `Compressed RINEX` pub fn is_crinex(&self) -> bool { if let Some(obs) = &self.obs { @@ -1675,315 +1297,452 @@ impl Header { let (ns, rem) = rem.split_at(8); // println!("Y \"{}\" M \"{}\" D \"{}\" HH \"{}\" MM \"{}\" SS \"{}\" NS \"{}\"", y, m, d, hh, mm, ss, ns); // DEBUG 
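// Each field above is a fixed-width slice of the TIME OF FIRST / LAST OBS
// payload; the lines below convert each slice with str::parse and map any
// failure into ParsingError::DateTimeParsing, tagging the faulty field by
// name. A minimal sketch of that pattern -- the helper name and simplified
// error type are illustrative only, not part of this file:
fn parse_fixed_field(field: &str, what: &str) -> Result<u32, String> {
    field
        .trim()
        .parse::<u32>()
        .map_err(|_| format!("failed to parse {} from \"{}\"", what, field))
}
// e.g. parse_fixed_field("  2023", "year") == Ok(2023)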
- let y = u32::from_str_radix(y.trim(), 10) + let y = y + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("year"), y.to_string()))?; - let m = u8::from_str_radix(m.trim(), 10) + let m = m + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("months"), m.to_string()))?; - let d = u8::from_str_radix(d.trim(), 10) + let d = d + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("days"), d.to_string()))?; - let hh = u8::from_str_radix(hh.trim(), 10) + let hh = hh + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("hours"), hh.to_string()))?; - let mm = u8::from_str_radix(mm.trim(), 10) + let mm = mm + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("minutes"), mm.to_string()))?; - let ss = u8::from_str_radix(ss.trim(), 10) + let ss = ss + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("seconds"), ss.to_string()))?; - let ns = u32::from_str_radix(ns.trim(), 10) + let ns = ns + .trim() + .parse::() .map_err(|_| ParsingError::DateTimeParsing(String::from("nanos"), ns.to_string()))?; /* timescale might be missing in OLD RINEX: we handle that externally */ let mut ts = TimeScale::TAI; let rem = rem.trim(); - if rem.len() > 0 { - // println!("TS \"{}\"", rem); // DBEUG + if !rem.is_empty() { + // println!("TS \"{}\"", rem); // DBEUGts = TimeScale::from_str(rem.trim()).map_err(|_| { ts = TimeScale::from_str(rem.trim()).map_err(|_| { ParsingError::DateTimeParsing(String::from("timescale"), rem.to_string()) })?; } - Ok(Epoch::from_str(&format!( + Epoch::from_str(&format!( "{:04}-{:02}-{:02}T{:02}:{:02}:{:02}.{:08} {}", y, m, d, hh, mm, ss, ns, ts )) - .map_err(|_| ParsingError::DateTimeParsing(String::from("timescale"), rem.to_string()))?) + .map_err(|_| ParsingError::DateTimeParsing(String::from("timescale"), rem.to_string())) } -} -impl std::fmt::Display for Header { - /// `Header` formatter, mainly for RINEX file production purposes - fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { - // start with CRINEX attributes, if need be - if let Some(obs) = &self.obs { - if let Some(crinex) = &obs.crinex { - write!(f, "{}\n", crinex)?; - } - } - // RINEX VERSION / TYPE - write!( - f, - "{:6}.{:02} ", - self.version.major, self.version.minor - )?; + /* + * Format VERSION/TYPE field + */ + pub(crate) fn fmt_rinex_version_type(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + let major = self.version.major; + let minor = self.version.minor; match self.rinex_type { - Type::NavigationData => { - match self.constellation { - Some(Constellation::Glonass) => { - // Glonass Special case - write!(f, "{:<20}", "G: GLONASS NAV DATA")?; - write!(f, "{:<20}", "")?; - write!(f, "{}", "RINEX VERSION / TYPE\n")? - }, - Some(c) => { - write!(f, "{:<20}", "NAVIGATION DATA")?; - write!(f, "{:<20}", c.to_1_letter_code())?; - write!(f, "{:<20}", "RINEX VERSION / TYPE\n")? 
- }, - _ => panic!("constellation must be specified when formatting a NavigationData"), - } + Type::NavigationData => match self.constellation { + Some(Constellation::Glonass) => { + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:6}.{:02} G: GLONASS NAV DATA", major, minor), + "RINEX VERSION / TYPE" + ) + ) + }, + Some(c) => { + writeln!( + f, + "{}", + fmt_rinex( + &format!( + "{:6}.{:02} NAVIGATION DATA {:X<20}", + major, minor, c + ), + "RINEX VERSION / TYPE" + ) + ) + }, + _ => panic!("constellation must be specified when formatting a NavigationData"), }, Type::ObservationData => match self.constellation { Some(c) => { - write!(f, "{:<20}", "OBSERVATION DATA")?; - write!(f, "{:<20}", c.to_1_letter_code())?; - write!(f, "{:<20}", "RINEX VERSION / TYPE\n")? + writeln!( + f, + "{}", + fmt_rinex( + &format!( + "{:6}.{:02} OBSERVATION DATA {:x<20}", + major, minor, c + ), + "RINEX VERSION / TYPE" + ) + ) }, _ => panic!("constellation must be specified when formatting ObservationData"), }, Type::MeteoData => { - write!(f, "{:<20}", "METEOROLOGICAL DATA")?; - write!(f, "{:<20}", "")?; - write!(f, "{:<20}", "RINEX VERSION / TYPE\n")?; + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:6}.{:02} METEOROLOGICAL DATA", major, minor), + "RINEX VERSION / TYPE" + ) + ) }, Type::ClockData => { - write!(f, "{:<20}", "CLOCK DATA")?; - write!(f, "{:<20}", "")?; - write!(f, "{:<20}", "RINEX VERSION / TYPE\n")?; + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:6}.{:02} CLOCK DATA", major, minor), + "RINEX VERSION / TYPE" + ) + ) }, Type::AntennaData => todo!(), Type::IonosphereMaps => todo!(), } - // COMMENTS - for comment in self.comments.iter() { - write!(f, "{:<60}", comment)?; - write!(f, "COMMENT\n")? + } + /* + * Format rinex type dependent stuff + */ + pub(crate) fn fmt_rinex_dependent(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + match self.rinex_type { + Type::ObservationData => self.fmt_observation_rinex(f), + Type::MeteoData => self.fmt_meteo_rinex(f), + Type::NavigationData => Ok(()), + Type::ClockData => self.fmt_clock_rinex(f), + Type::IonosphereMaps => self.fmt_ionex(f), + Type::AntennaData => Ok(()), } - // PGM / RUN BY / DATE - write!(f, "{:<20}", self.program)?; - write!(f, "{:<20}", self.run_by)?; - write!(f, "{:<20}", self.date)?; //TODO - write!(f, "{}", "PGM / RUN BY / DATE\n")?; - // OBSERVER / AGENCY - if self.observer.len() + self.agency.len() > 0 { - write!(f, "{:<20}", self.observer)?; - write!(f, "{:<40}", self.agency)?; - write!(f, "OBSERVER / AGENCY\n")?; + } + /* + * Clock Data fields formatting + */ + fn fmt_clock_rinex(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + if let Some(clocks) = &self.clocks { + // Types of data: observables equivalent + let mut descriptor = String::new(); + descriptor.push_str(&format!("{:6}", clocks.codes.len())); + for (i, observable) in clocks.codes.iter().enumerate() { + if (i % 9) == 0 && i > 0 { + descriptor.push_str(" "); // TAB + } + descriptor.push_str(&format!("{:6}", observable)); + } + writeln!(f, "{}", fmt_rinex(&descriptor, "# / TYPES OF DATA"))?; + + // possible timescale + if let Some(ts) = clocks.timescale { + writeln!( + f, + "{}", + fmt_rinex(&format!(" {:x}", ts), "TIME SYSTEM ID") + )?; + } + + // possible agency + if let Some(agency) = &clocks.agency { + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:<5} {}", agency.code, agency.name), + "ANALYSIS CENTER" + ) + )?; + } } - // MARKER NAME - if self.station.len() > 0 { - write!(f, "{:<20}", self.station)?; - write!(f, "{:<40}", " ")?; - 
write!(f, "{}", "MARKER NAME\n")?; + Ok(()) + } + /* + * IONEX fields formatting + */ + fn fmt_ionex(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + if let Some(ionex) = &self.ionex { + writeln!( + f, + "{}", + fmt_rinex(&format!("{:6}", ionex.map_dimension), "MAP DIMENSION") + )?; + // h grid + let (start, end, spacing) = ( + ionex.grid.height.start, + ionex.grid.height.end, + ionex.grid.height.spacing, + ); + writeln!( + f, + "{}", + fmt_rinex( + &format!("{} {} {}", start, end, spacing), + "HGT1 / HGT2 / DHGT" + ) + )?; + // lat grid + let (start, end, spacing) = ( + ionex.grid.latitude.start, + ionex.grid.latitude.end, + ionex.grid.latitude.spacing, + ); + writeln!( + f, + "{}", + fmt_rinex( + &format!("{} {} {}", start, end, spacing), + "LAT1 / LAT2 / DLAT" + ) + )?; + // lon grid + let (start, end, spacing) = ( + ionex.grid.longitude.start, + ionex.grid.longitude.end, + ionex.grid.longitude.spacing, + ); + writeln!( + f, + "{}", + fmt_rinex( + &format!("{} {} {}", start, end, spacing), + "LON1 / LON2 / DLON" + ) + )?; + // elevation cutoff + writeln!( + f, + "{}", + fmt_rinex(&format!("{}", ionex.elevation_cutoff), "ELEVATION CUTOFF") + )?; + // mapping func + if let Some(func) = &ionex.mapping { + writeln!( + f, + "{}", + fmt_rinex(&format!("{:?}", func), "MAPPING FUNCTION") + )?; + } else { + writeln!(f, "{}", fmt_rinex("NONE", "MAPPING FUNCTION"))?; + } + // time of first map + writeln!(f, "{}", fmt_rinex("TODO", "EPOCH OF FIRST MAP"))?; + // time of last map + writeln!(f, "{}", fmt_rinex("TODO", "EPOCH OF LAST MAP"))?; } - // MARKER NUMBER - if self.station_id.len() > 0 { - // has been parsed - write!(f, "{:<20}", self.station_id)?; - write!(f, "{:<40}", " ")?; - write!(f, "{}", "MARKER NUMBER\n")?; + Ok(()) + } + /* + * Meteo Data fields formatting + */ + fn fmt_meteo_rinex(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + if let Some(meteo) = &self.meteo { + /* + * List of observables + */ + let mut descriptor = String::new(); + descriptor.push_str(&format!("{:6}", meteo.codes.len())); + for (i, observable) in meteo.codes.iter().enumerate() { + if (i % 9) == 0 && i > 0 { + descriptor.push_str(" "); // TAB + } + descriptor.push_str(&format!(" {}", observable)); + } + writeln!(f, "{}", fmt_rinex(&descriptor, "# / TYPES OF OBSERV"))?; + for sensor in &meteo.sensors { + write!(f, "{}", sensor)?; + } } + Ok(()) + } + /* + * Observation Data fields formatting + */ + fn fmt_observation_rinex(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + if let Some(obs) = &self.obs { + if let Some(e) = obs.time_of_first_obs { + //TODO: hifitime does not have a gregorian decomposition method at the moment + //let offset = match time_of_first_obs.time_scale { + // TimeScale::GPST => Duration::from_seconds(19.0), + // TimeScale::GST => Duration::from_seconds(35.0), + // TimeScale::BDT => Duration::from_seconds(35.0), + // _ => Duration::default(), + //}; + let (y, m, d, hh, mm, ss, nanos) = e.to_gregorian_utc(); + writeln!( + f, + "{}", + fmt_rinex( + &format!( + " {:04} {:02} {:02} {:02} {:02} {:02}.{:07} {:x}", + y, m, d, hh, mm, ss, nanos, e.time_scale + ), + "TIME OF FIRST OBS" + ) + )?; + } + if let Some(e) = obs.time_of_last_obs { + let (y, m, d, hh, mm, ss, nanos) = e.to_gregorian_utc(); + writeln!( + f, + "{}", + fmt_rinex( + &format!( + " {:04} {:02} {:02} {:02} {:02} {:02}.{:07} {:x}", + y, m, d, hh, mm, ss, nanos, e.time_scale + ), + "TIME OF LAST OBS" + ) + )?; + } + /* + * Form the observables list + */ + match self.version.major { + 1 | 2 => { + /* + * 
List of observables + */ + let mut descriptor = String::new(); + if let Some((_constell, observables)) = obs.codes.iter().next() { + descriptor.push_str(&format!("{:6}", observables.len())); + for (i, observable) in observables.iter().enumerate() { + if (i % 9) == 0 && i > 0 { + descriptor.push_str(" "); // TAB + } + descriptor.push_str(&format!("{:>6}", observable)); + } + writeln!(f, "{}", fmt_rinex(&descriptor, "# / TYPES OF OBSERV"))?; + } + }, + _ => {}, + } + // must take place after list of observables: + // TODO scaling factor + // TODO DCBS compensations + // TODO PCVs compensations + } + Ok(()) + } + /* + * Format all comments + */ + pub(crate) fn fmt_comments(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + for comment in self.comments.iter() { + writeln!(f, "{}", fmt_comment(comment))?; + } + Ok(()) + } +} + +impl std::fmt::Display for Header { + /// `Header` formatter, mainly for RINEX file production purposes + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + // start with CRINEX attributes, if need be + if let Some(obs) = &self.obs { + if let Some(crinex) = &obs.crinex { + writeln!(f, "{}", crinex)?; + } + } + + self.fmt_rinex_version_type(f)?; + self.fmt_comments(f)?; + + // PGM / RUN BY / DATE + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:<20}{:<20}{:<20}", self.program, self.run_by, self.date), + "PGM / RUN BY / DATE" + ) + )?; + + // OBSERVER / AGENCY + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:<20}{}", self.observer, self.agency), + "OBSERVER /AGENCY" + ) + )?; + + writeln!(f, "{}", fmt_rinex(&self.station, "MARKER NAME"))?; + writeln!(f, "{}", fmt_rinex(&self.station_id, "MARKER NUMBER"))?; + // ANT if let Some(antenna) = &self.rcvr_antenna { - write!(f, "{:<20}", antenna.model)?; - write!(f, "{:<40}", antenna.sn)?; - write!(f, "{}", "ANT # / TYPE\n")?; + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:<20}{}", antenna.model, antenna.sn), + "ANT # / TYPE" + ) + )?; if let Some(coords) = &antenna.coords { - write!(f, "{:14.4}", coords.0)?; - write!(f, "{:14.4}", coords.1)?; - write!(f, "{:14.4}", coords.2)?; - write!(f, "{}", "APPROX POSITION XYZ\n")? - } - if let Some(h) = &antenna.height { - write!(f, "{:14.4}", h)?; - if let Some(e) = &antenna.eastern { - write!(f, "{:14.4}", e)?; - } else { - write!(f, "{:14.4}", 0.0)?; - } - if let Some(n) = &antenna.northern { - write!(f, "{:14.4}", n)?; - } else { - write!(f, "{:14.4}", 0.0)?; - } - write!(f, "{:18}", "")?; - write!(f, "{}", "ANTENNA: DELTA H/E/N\n")? + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:14.4}{:14.4}{:14.4}", coords.0, coords.1, coords.2), + "APPROX POSITION XYZ" + ) + )?; } + writeln!( + f, + "{}", + fmt_rinex( + &format!( + "{:14.4}{:14.4}{:14.4}", + antenna.height.unwrap_or(0.0), + antenna.eastern.unwrap_or(0.0), + antenna.northern.unwrap_or(0.0) + ), + "ANTENNA: DELTA H/E/N" + ) + )?; } // RCVR if let Some(rcvr) = &self.rcvr { - write!(f, "{:<20}", rcvr.sn)?; - write!(f, "{:<20}", rcvr.model)?; - write!(f, "{:<20}", rcvr.firmware)?; - write!(f, "REC # / TYPE / VERS\n")? + writeln!( + f, + "{}", + fmt_rinex( + &format!("{:<20}{:<20}{}", rcvr.sn, rcvr.model, rcvr.firmware), + "REC # / TYPE / VERS" + ) + )?; } // INTERVAL if let Some(interval) = &self.sampling_interval { - write!(f, "{:6}", interval.to_seconds())?; - write!(f, "{:<54}", "")?; - write!(f, "INTERVAL\n")? 
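// The old hand-rolled formatting being removed here (the INTERVAL padding just
// above, the "# / TYPES OF OBSERV" layout, and the custom Meteo, Clock and
// IONEX fields that follow) is superseded by the fmt_rinex()-based helpers
// introduced above: fmt_observation_rinex, fmt_meteo_rinex, fmt_clock_rinex
// and fmt_ionex.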
- } - // List of Observables - match self.rinex_type { - Type::ObservationData => { - if let Some(obs) = &self.obs { - if let Some(time_of_first_obs) = obs.time_of_first_obs { - //TODO: hifitime does not have a gregorian decomposition method at the moment - let offset = match time_of_first_obs.time_scale { - TimeScale::GPST => Duration::from_seconds(19.0), - TimeScale::GST => Duration::from_seconds(35.0), - TimeScale::BDT => Duration::from_seconds(35.0), - _ => Duration::default(), - }; - let (y, m, d, hh, mm, ss, nanos) = (time_of_first_obs).to_gregorian_utc(); - let mut descriptor = format!( - " {:04} {:02} {:02} {:02} {:02} {:02}.{:07} {:x}", - y, m, d, hh, mm, ss, nanos, time_of_first_obs.time_scale - ); - descriptor.push_str(&format!( - "{: Duration::from_seconds(19.0), - TimeScale::GST => Duration::from_seconds(35.0), - TimeScale::BDT => Duration::from_seconds(35.0), - _ => Duration::default(), - }; - let (y, m, d, hh, mm, ss, nanos) = - (time_of_last_obs + offset).to_gregorian_utc(); - let mut descriptor = format!( - " {:04} {:02} {:02} {:02} {:02} {:02}.{:08} {:x}", - y, m, d, hh, mm, ss, nanos, time_of_last_obs.time_scale - ); - descriptor.push_str(&format!( - "{: { - // old revisions - for (_, observables) in obs.codes.iter() { - write!(f, "{:6}", observables.len())?; - let mut descriptor = String::new(); - for i in 0..observables.len() { - if (i % 9) == 0 && i > 0 { - //ADD LABEL - descriptor.push_str("# / TYPES OF OBSERV\n"); - descriptor.push_str(&format!("{:<6}", "")); - //TAB - } - // this will not work if observable - // does not fit on 2 characters - descriptor.push_str(&format!(" {}", observables[i])); - } - //ADD BLANK on last line - if observables.len() <= 9 { - // fits on one line - descriptor.push_str(&format!( - "{: { - // modern revisions - for (constell, codes) in obs.codes.iter() { - let mut line = format!("{:<4}", constell.to_1_letter_code()); - line.push_str(&format!("{:2}", codes.len())); - for i in 0..codes.len() { - if (i + 1) % 14 == 0 { - line.push_str(&format!( - "{: { - if let Some(obs) = &self.meteo { - write!(f, "{:6}", obs.codes.len())?; - let mut description = String::new(); - for i in 0..obs.codes.len() { - if (i % 9) == 0 && i > 0 { - description.push_str("# / TYPES OF OBSERV\n"); - write!(f, "{}", description)?; - description.clear(); - description.push_str(&format!("{:<6}", "")); //TAB - } - description.push_str(&format!(" {}", obs.codes[i])); - } - description.push_str(&format!( - "{: {}, + writeln!( + f, + "{}", + fmt_rinex(&format!("{:6}", interval.to_seconds()), "INTERVAL") + )?; } - // Must take place after list of Observables: - //TODO: scale factor, if any - //TODO: DCBS compensation, if any - //TODO: PCVs compensation, if any + // LEAP if let Some(leap) = &self.leap { let mut line = String::new(); @@ -2005,83 +1764,35 @@ impl std::fmt::Display for Header { )); write!(f, "{}", line)? } - // Custom Meteo fields - if let Some(meteo) = &self.meteo { - let sensors = &meteo.sensors; - for sensor in sensors { - write!(f, "{}", sensor)? 
- } - } - // Custom Clock fields - if let Some(clocks) = &self.clocks { - // Types of data: is the equivalent of Observation codes - write!(f, "{:6}", clocks.codes.len())?; - for code in &clocks.codes { - write!(f, " {}", code)?; - } - write!( - f, - "{:>width$}\n", - "# / TYPES OF DATA\n", - width = 80 - 6 - 6 * clocks.codes.len() - 2 - )?; - // possible timescale - if let Some(ts) = clocks.timescale { - write!( - f, - " {:x} TIME SYSTEM ID\n", - ts - )?; - } - // possible reference agency - if let Some(agency) = &clocks.agency { - write!(f, "{:<5} ", agency.code)?; - write!(f, "{}", agency.name)?; - write!(f, "ANALYSIS CENTER\n")?; - } - // possible reference clock information - } - // Custom IONEX fields - if let Some(ionex) = &self.ionex { - //TODO: - // EPOCH OF FIRST and LAST MAP - // with epoch::format(Ionex) - let _ = write!(f, "{:6} MAP DIMENSION\n", ionex.map_dimension); - let h = &ionex.grid.height; - let _ = write!( - f, - "{} {} {} HGT1 / HGT2 / DHGT\n", - h.start, h.end, h.spacing - ); - let lat = &ionex.grid.latitude; - let _ = write!( - f, - "{} {} {} LAT1 / LON2 / DLAT\n", - lat.start, lat.end, lat.spacing - ); - let lon = &ionex.grid.longitude; - let _ = write!( - f, - "{} {} {} LON1 / LON2 / DLON\n", - lon.start, lon.end, lon.spacing - ); - let _ = write!(f, "{} ELEVATION CUTOFF\n", ionex.elevation_cutoff); - if let Some(func) = &ionex.mapping { - let _ = write!(f, "{:?} MAPPING FUNCTION\n", func); - } else { - let _ = write!(f, "NONE MAPPING FUNCTION\n"); - } - let _ = write!(f, "{} EXPONENT\n", ionex.exponent); - if let Some(desc) = &ionex.description { - for line in 0..desc.len() / 60 { - let max = std::cmp::min((line + 1) * 60, desc.len()); - let _ = write!(f, "{} COMMENT\n", &desc[line * 60..max]); - } - } - } - // END OF HEADER - write!(f, "{:>74}", "END OF HEADER\n") + // RINEX Type dependent header + self.fmt_rinex_dependent(f)?; + + //TODO + // things that could be nice to squeeze in: + // [+] SBAS contained (detailed vehicles) + // [+] RINEX 3 -> 2 observables conversion (see OBS/V2/rovn as an example) + writeln!(f, "{}", fmt_rinex("", "END OF HEADER")) + } +} + +impl Header { + /* + * Macro to be used when marking Self as Merged file + */ + fn merge_comment(timestamp: Epoch) -> String { + let (y, m, d, hh, mm, ss, _) = timestamp.to_gregorian_utc(); + format!( + "rustrnx-{:<11} FILE MERGE {}{}{} {}{}{} {:x}", + env!("CARGO_PKG_VERSION"), + y, + m, + d, + hh, + mm, + ss, + timestamp.time_scale + ) } } @@ -2111,9 +1822,22 @@ impl Merge for Header { let (a_rev, b_rev) = (self.version, rhs.version); self.version = std::cmp::min(a_rev, b_rev); + // sampling interval special case + match self.sampling_interval { + None => { + if rhs.sampling_interval.is_some() { + self.sampling_interval = rhs.sampling_interval; + } + }, + Some(lhs) => { + if let Some(rhs) = rhs.sampling_interval { + self.sampling_interval = Some(std::cmp::min(lhs, rhs)); + } + }, + } + merge::merge_mut_vec(&mut self.comments, &rhs.comments); merge::merge_mut_option(&mut self.marker_type, &rhs.marker_type); - merge::merge_mut_option(&mut self.sampling_interval, &rhs.sampling_interval); merge::merge_mut_option(&mut self.license, &rhs.license); merge::merge_mut_option(&mut self.data_scaling, &rhs.data_scaling); merge::merge_mut_option(&mut self.doi, &rhs.doi); @@ -2124,6 +1848,41 @@ impl Merge for Header { merge::merge_mut_option(&mut self.sv_antenna, &rhs.sv_antenna); merge::merge_mut_option(&mut self.ground_position, &rhs.ground_position); merge::merge_mut_option(&mut self.wavelengths, 
&rhs.wavelengths); + merge::merge_mut_option(&mut self.gps_utc_delta, &rhs.gps_utc_delta); + + // DCBS compensation is preserved, only if both A&B both have it + if self.dcb_compensations.is_empty() || rhs.dcb_compensations.is_empty() { + self.dcb_compensations.clear(); // drop everything + } else { + let rhs_constellations: Vec<_> = rhs + .dcb_compensations + .iter() + .map(|dcb| dcb.constellation) + .collect(); + self.dcb_compensations + .iter_mut() + .filter(|dcb| rhs_constellations.contains(&dcb.constellation)) + .count(); + } + + // PCV compensation : same logic + // only preserve compensations present in both A & B + if self.pcv_compensations.is_empty() || rhs.pcv_compensations.is_empty() { + self.pcv_compensations.clear(); // drop everything + } else { + let rhs_constellations: Vec<_> = rhs + .pcv_compensations + .iter() + .map(|pcv| pcv.constellation) + .collect(); + self.dcb_compensations + .iter_mut() + .filter(|pcv| rhs_constellations.contains(&pcv.constellation)) + .count(); + } + + //TODO : + //merge::merge_mut(&mut self.glo_channels, &rhs.glo_channels); // RINEX specific operation if let Some(lhs) = &mut self.antex { @@ -2151,6 +1910,7 @@ impl Merge for Header { if let Some(rhs) = &rhs.obs { merge::merge_mut_option(&mut lhs.crinex, &rhs.crinex); merge::merge_mut_unique_map2d(&mut lhs.codes, &rhs.codes); + // TODO: manage that lhs.clock_offset_applied |= rhs.clock_offset_applied; } } @@ -2174,6 +1934,10 @@ impl Merge for Header { if lhs.base_radius != rhs.base_radius { return Err(merge::Error::IonexBaseRadiusMismatch); } + + //TODO: this is not enough, need to take into account and rescale.. + lhs.exponent = std::cmp::min(lhs.exponent, rhs.exponent); + merge::merge_mut_option(&mut lhs.description, &rhs.description); merge::merge_mut_option(&mut lhs.mapping, &rhs.mapping); if lhs.elevation_cutoff == 0.0 { @@ -2188,6 +1952,10 @@ impl Merge for Header { } } } + // add special comment + let now = Epoch::now()?; + let merge_comment = Self::merge_comment(now); + self.comments.push(merge_comment); Ok(()) } } @@ -2260,7 +2028,7 @@ mod test { use super::parse_formatted_month; #[test] fn formatted_month_parser() { - for (desc, expected) in vec![("Jan", 1), ("Feb", 2), ("Mar", 3), ("Nov", 11), ("Dec", 12)] { + for (desc, expected) in [("Jan", 1), ("Feb", 2), ("Mar", 3), ("Nov", 11), ("Dec", 12)] { let month = parse_formatted_month(desc); assert!(month.is_ok(), "failed to parse month from \"{}\"", desc); let month = month.unwrap(); diff --git a/rinex/src/ionex/grid.rs b/rinex/src/ionex/grid.rs index eecd067a3..2947ec2c8 100644 --- a/rinex/src/ionex/grid.rs +++ b/rinex/src/ionex/grid.rs @@ -112,6 +112,6 @@ mod test { ); let grid = GridLinspace::new(1.0, 10.0, 1.0).unwrap(); assert_eq!(grid.length(), 10); - assert_eq!(grid.is_single_point(), false); + assert!(!grid.is_single_point()); } } diff --git a/rinex/src/ionex/mod.rs b/rinex/src/ionex/mod.rs index 600fbd9fc..d9694a3ce 100644 --- a/rinex/src/ionex/mod.rs +++ b/rinex/src/ionex/mod.rs @@ -1,10 +1,11 @@ //! 
IONEX module use super::Sv; +use hifitime::Epoch; use std::collections::HashMap; use strum_macros::EnumString; pub mod record; -pub use record::{Map, Record}; +pub use record::{Record, TECPlane, TEC}; pub mod grid; pub use grid::{Grid, GridLinspace}; @@ -41,6 +42,10 @@ pub enum BiasSource { #[derive(Debug, Clone, PartialEq)] #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] pub struct HeaderFields { + /// Epoch of first map + pub epoch_of_first_map: Epoch, + /// Epoch of last map + pub epoch_of_last_map: Epoch, /// Reference system used for following TEC maps, /// cf. [system::RefSystem]. pub reference: RefSystem, @@ -76,6 +81,8 @@ pub struct HeaderFields { impl Default for HeaderFields { fn default() -> Self { Self { + epoch_of_first_map: Epoch::default(), + epoch_of_last_map: Epoch::default(), reference: RefSystem::default(), exponent: -1, // very important: allows missing EXPONENT fields map_dimension: 2, // 2D map by default @@ -109,7 +116,7 @@ impl HeaderFields { pub fn with_description(&self, desc: &str) -> Self { let mut s = self.clone(); if let Some(ref mut d) = s.description { - d.push_str(" "); + d.push(' '); d.push_str(desc) } else { s.description = Some(desc.to_string()) @@ -129,7 +136,7 @@ impl HeaderFields { } pub fn with_observables(&self, o: &str) -> Self { let mut s = self.clone(); - if o.len() > 0 { + if !o.is_empty() { s.observables = Some(o.to_string()) } s @@ -197,14 +204,14 @@ mod test { fn test_mapping_func() { let content = "COSZ"; let func = MappingFunction::from_str(content); - assert_eq!(func.is_ok(), true); + assert!(func.is_ok()); assert_eq!(func.unwrap(), MappingFunction::CosZ); let content = "QFAC"; let func = MappingFunction::from_str(content); - assert_eq!(func.is_ok(), true); + assert!(func.is_ok()); assert_eq!(func.unwrap(), MappingFunction::QFac); let content = "DONT"; let func = MappingFunction::from_str(content); - assert_eq!(func.is_err(), true); + assert!(func.is_err()); } } diff --git a/rinex/src/ionex/record.rs b/rinex/src/ionex/record.rs index db29edd57..a1abeae75 100644 --- a/rinex/src/ionex/record.rs +++ b/rinex/src/ionex/record.rs @@ -1,71 +1,41 @@ use crate::{merge, merge::Merge, prelude::*, split, split::Split}; -use super::{grid, GridLinspace}; +use super::grid; +use crate::epoch; use hifitime::Duration; -use std::collections::BTreeMap; +use std::collections::{BTreeMap, HashMap}; use std::str::FromStr; use thiserror::Error; -pub(crate) fn is_new_tec_map(line: &str) -> bool { +pub(crate) fn is_new_tec_plane(line: &str) -> bool { line.contains("START OF TEC MAP") } -pub(crate) fn is_new_rms_map(line: &str) -> bool { +pub(crate) fn is_new_rms_plane(line: &str) -> bool { line.contains("START OF RMS MAP") } -pub(crate) fn is_new_height_map(line: &str) -> bool { - line.contains("START OF HEIGHT MAP") -} - -/// Returns true if given content describes the start of -/// a Ionosphere map. 
-pub(crate) fn is_new_map(line: &str) -> bool { - is_new_tec_map(line) || is_new_rms_map(line) || is_new_height_map(line) -} +/* + * Don't know what Height maps are actually + */ +// pub(crate) fn is_new_height_map(line: &str) -> bool { +// line.contains("START OF HEIGHT MAP") +// } -/// A Map is a list of estimates for -/// a given Latitude, Longitude, Altitude #[derive(Debug, Clone, PartialEq, PartialOrd)] #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))] -pub struct MapPoint { - /// Latitude of this estimate - pub latitude: f64, - /// Longitude of this estimate - pub longitude: f64, - /// Altitude of this estimate - pub altitude: f64, - /// Actual estimate (scaling applied) - pub value: f64, +pub struct TEC { + /// TEC value + pub tec: f64, + /// RMS(tec) + pub rms: Option, } -pub type Map = Vec; +pub type TECPlane = HashMap<(i32, i32), TEC>; -/* - * Merges `rhs` into `lhs` in up to 3 dimensions - */ -fn map_merge3d_mut(lhs: &mut Map, rhs: &Map) { - for rhs_p in rhs { - let mut found = false; - for lhs_p in lhs.into_iter() { - found |= (lhs_p.latitude == rhs_p.latitude) - && (lhs_p.longitude == rhs_p.longitude) - && (lhs_p.altitude == rhs_p.altitude); - if found { - break; - } - } - if !found { - lhs.push(rhs_p.clone()); - } - } -} - -/// `IONEX` record is sorted by epoch. -/// For each epoch, a TEC map is always given. -/// Possible RMS map and Height map may exist at a given epoch. -/// Ionosphere maps are always given in Earth fixed reference frames. +/// IONEX contains 2D (fixed altitude) or 3D Ionosphere Maps. +/// See [Rinex::ionex] and related feature for more information. /// ``` /// use rinex::prelude::*; /// use rinex::ionex::*; @@ -87,33 +57,32 @@ fn map_merge3d_mut(lhs: &mut Map, rhs: &Map) { /// assert_eq!(params.elevation_cutoff, 0.0); /// assert_eq!(params.mapping, None); // no mapping function /// } -/// let record = rinex.record.as_ionex() -/// .unwrap(); -/// for (epoch, (tec, rms, height)) in record { -/// // RMS map never provided in this file -/// assert_eq!(rms.is_none(), true); -/// // 2D IONEX: height maps never provided -/// assert_eq!(height.is_none(), true); -/// // We only get TEC maps -/// // when using TEC values, we previously applied all required scalings -/// for point in tec { -/// let lat = point.latitude; // in ddeg -/// let lon = point.longitude; // in ddeg -/// let alt = point.altitude; // in km -/// let value = point.value; // correctly scaled ("exponent") -/// } -/// } /// ``` -pub type Record = BTreeMap, Option)>; +pub type Record = BTreeMap<(Epoch, i32), TECPlane>; #[derive(Debug, Error)] pub enum Error { - #[error("failed to parse map index")] - ParseIndexError, + #[error("failed to parse map index from \"{0}\"")] + MapIndexParsing(String), #[error("faulty epoch description")] EpochDescriptionError, - #[error("faulty longitude range definition")] - LongitudeRangeError(#[from] grid::Error), + #[error("bad grid definition")] + BadGridDefinition(#[from] grid::Error), + #[error("failed to parse {0} coordinates from \"{1}\"")] + CoordinatesParsing(String, String), + #[error("failed to parse epoch")] + EpochParsing(#[from] epoch::ParsingError), +} + +/* + * Merges `rhs` into `lhs` + */ +fn merge_plane_mut(lhs: &mut TECPlane, rhs: &TECPlane) { + for (coord, tec) in rhs { + if lhs.get(coord).is_none() { + lhs.insert(*coord, tec.clone()); + } + } } /* @@ -121,20 +90,29 @@ pub enum Error { * - a TEC map * - an RMS tec map * - an height map - * defined for returned Epoch + * Returns: Epoth(t), nth Map index, latitude, altitude and TEC plane 
accross longitudes */ -pub(crate) fn parse_map(header: &mut Header, content: &str) -> Result<(usize, Epoch, Map), Error> { +pub(crate) fn parse_plane( + content: &str, + header: &mut Header, + is_rms_plane: bool, +) -> Result<(Epoch, i32, TECPlane), Error> { let lines = content.lines(); let mut epoch = Epoch::default(); - let mut map = Map::with_capacity(128); // result - let mut latitude: f64 = 0.0; // current latitude - let mut altitude: f64 = 0.0; // current altitude - let mut ptr: usize = 0; // pointer in longitude space - let mut linspace = GridLinspace::default(); // (longitude) linspace + let mut plane = TECPlane::with_capacity(128); + + // this can't fail at this point let ionex = header .ionex .as_mut() .expect("faulty ionex context: missing specific header definitions"); + + // current {lat, lon} within current grid def. + let mut latitude = 0_i32; + let mut longitude = 0_i32; + let mut altitude = 0_i32; + let mut dlon = (ionex.grid.longitude.spacing * 1000.0) as i32; + for line in lines { if line.len() > 60 { let (content, marker) = line.split_at(60); @@ -142,90 +120,126 @@ pub(crate) fn parse_map(header: &mut Header, content: &str) -> Result<(usize, Ep continue; // skip that one } else if marker.contains("END OF") && marker.contains("MAP") { let index = content.split_at(6).0; - if let Ok(u) = u32::from_str_radix(index.trim(), 10) { - return Ok((u as usize, epoch, map)); - } else { - return Err(Error::ParseIndexError); - } + let index = index.trim(); + let _map_index = index + .parse::() + .or(Err(Error::MapIndexParsing(index.to_string())))?; + + return Ok((epoch, altitude, plane)); } else if marker.contains("LAT/LON1/LON2/DLON/H") { - // space coordinates definition for next block + // grid definition for next block let (_, rem) = content.split_at(2); + let (lat, rem) = rem.split_at(6); + let lat = lat.trim(); + let lat = f64::from_str(lat).or(Err(Error::CoordinatesParsing( + String::from("latitude"), + lat.to_string(), + )))?; + let (lon1, rem) = rem.split_at(6); - let (lon2, rem) = rem.split_at(6); - let (dlon, rem) = rem.split_at(6); + let lon1 = lon1.trim(); + let lon1 = f64::from_str(lon1).or(Err(Error::CoordinatesParsing( + String::from("longitude"), + lon1.to_string(), + )))?; + + let (_lon2, rem) = rem.split_at(6); + //let lon2 = lon2.trim(); + //let lon2 = f64::from_str(lon2).or(Err(Error::CoordinatesParsing( + // String::from("longitude"), + // lon2.to_string(), + //)))?; + + let (dlon_str, rem) = rem.split_at(6); + let dlon_str = dlon_str.trim(); + let dlon_f64 = f64::from_str(dlon_str).or(Err(Error::CoordinatesParsing( + String::from("longitude"), + dlon_str.to_string(), + )))?; + let (h, _) = rem.split_at(6); - latitude = - f64::from_str(lat.trim()).expect("failed to parse grid latitude start point"); - let lon1 = - f64::from_str(lon1.trim()).expect("failed to parse longitude start point"); - let lon2 = f64::from_str(lon2.trim()).expect("failed to parse longitude end point"); - let dlon = - f64::from_str(dlon.trim()).expect("failed to parse longitude grid spacing"); - altitude = f64::from_str(h.trim()).expect("failed to parse next grid altitude"); - linspace = GridLinspace::new(lon1, lon2, dlon)?; - ptr = 0; + let h = h.trim(); + let alt = f64::from_str(h).or(Err(Error::CoordinatesParsing( + String::from("altitude"), + h.to_string(), + )))?; + + altitude = (alt.round() * 100.0_f64) as i32; + latitude = (lat.round() * 1000.0_f64) as i32; + longitude = (lon1.round() * 1000.0_f64) as i32; + dlon = (dlon_f64.round() * 1000.0_f64) as i32; + + // debug + // println!("NEW 
GRID : h: {} lat : {} lon : {}, dlon: {}", altitude, latitude, longitude, dlon); } else if marker.contains("EPOCH OF CURRENT MAP") { - // time definition - let items: Vec<&str> = content.split_ascii_whitespace().collect(); - if items.len() != 6 { - return Err(Error::EpochDescriptionError); - } - if let Ok(y) = i32::from_str_radix(items[0].trim(), 10) { - if let Ok(m) = u8::from_str_radix(items[1].trim(), 10) { - if let Ok(d) = u8::from_str_radix(items[2].trim(), 10) { - if let Ok(hh) = u8::from_str_radix(items[3].trim(), 10) { - if let Ok(mm) = u8::from_str_radix(items[4].trim(), 10) { - if let Ok(ss) = u8::from_str_radix(items[5].trim(), 10) { - epoch = Epoch::from_gregorian_utc(y, m, d, hh, mm, ss, 0); - } - } - } - } - } - } + epoch = epoch::parse_utc(content)?.0; } else if marker.contains("EXPONENT") { - // scaling redefinition - if let Ok(e) = i8::from_str_radix(content.trim(), 10) { - *ionex = ionex.with_exponent(e); // scaling update + // update current scaling + if let Ok(e) = content.trim().parse::() { + ionex.exponent = e; } } else { // parsing TEC values - for item in line.split_ascii_whitespace().into_iter() { - if let Ok(v) = i32::from_str_radix(item.trim(), 10) { - // parse & apply correct scaling + for item in line.split_ascii_whitespace() { + if let Ok(v) = item.trim().parse::() { let mut value = v as f64; + // current scaling value *= 10.0_f64.powf(ionex.exponent as f64); - map.push(MapPoint { - latitude, - longitude: linspace.start + linspace.spacing * ptr as f64, - altitude, - value, - }); - ptr += 1; + + let tec = match is_rms_plane { + true => { + TEC { + tec: 0.0_f64, // DONT CARE + rms: Some(value), + } + }, + false => TEC { + tec: value, + rms: None, + }, + }; + + plane.insert((latitude, longitude), tec); } + + longitude += dlon; + //debug + //println!("longitude: {}", longitude); } } } else { // less than 60 characters // parsing TEC values - for item in line.split_ascii_whitespace().into_iter() { - if let Ok(v) = i32::from_str_radix(item.trim(), 10) { - // parse & apply correct scaling + for item in line.split_ascii_whitespace() { + if let Ok(v) = item.trim().parse::() { let mut value = v as f64; + // current scaling value *= 10.0_f64.powf(ionex.exponent as f64); - map.push(MapPoint { - latitude, - longitude: linspace.start + linspace.spacing * ptr as f64, - altitude, - value, - }); - ptr += 1; + + let tec = match is_rms_plane { + true => { + TEC { + tec: 0.0_f64, // DONT CARE + rms: Some(value), + } + }, + false => TEC { + tec: value, + rms: None, + }, + }; + + plane.insert((latitude, longitude), tec); } + + longitude += dlon; + //debug + //println!("longitude: {}", longitude); } } } - Ok((0, epoch, map)) + Ok((epoch, altitude, plane)) } #[cfg(test)] @@ -233,150 +247,25 @@ mod test { use super::*; #[test] fn test_new_tec_map() { - assert_eq!( - is_new_tec_map( - "1 START OF TEC MAP" - ), - true - ); - assert_eq!( - is_new_tec_map( - "1 START OF RMS MAP" - ), - false - ); - assert_eq!( - is_new_rms_map( - "1 START OF RMS MAP" - ), - true - ); - assert_eq!( - is_new_height_map( - "1 START OF HEIGHT MAP" - ), - true - ); - } - #[test] - fn test_merge_map2d() { - let mut lhs = vec![ - MapPoint { - latitude: 0.0, - longitude: 0.0, - altitude: 0.0, - value: 1.0, - }, - MapPoint { - latitude: 0.0, - longitude: 10.0, - altitude: 0.0, - value: 2.0, - }, - MapPoint { - latitude: 0.0, - longitude: 20.0, - altitude: 0.0, - value: 3.0, - }, - MapPoint { - latitude: 10.0, - longitude: 0.0, - altitude: 0.0, - value: 4.0, - }, - MapPoint { - latitude: 10.0, - longitude: 10.0, - 
altitude: 0.0, - value: 5.0, - }, - MapPoint { - latitude: 10.0, - longitude: 20.0, - altitude: 0.0, - value: 6.0, - }, - ]; - let rhs = vec![ - MapPoint { - latitude: 0.0, - longitude: 0.0, - altitude: 0.0, - value: 0.0, - }, - MapPoint { - latitude: 5.0, - longitude: 0.0, - altitude: 0.0, - value: 1.0, - }, - MapPoint { - latitude: 10.0, - longitude: 0.0, - altitude: 0.0, - value: 0.0, - }, - MapPoint { - latitude: 10.0, - longitude: 25.0, - altitude: 0.0, - value: 6.0, - }, - ]; - let expected = vec![ - MapPoint { - latitude: 0.0, - longitude: 0.0, - altitude: 0.0, - value: 1.0, - }, - MapPoint { - latitude: 0.0, - longitude: 10.0, - altitude: 0.0, - value: 2.0, - }, - MapPoint { - latitude: 0.0, - longitude: 20.0, - altitude: 0.0, - value: 3.0, - }, - MapPoint { - latitude: 10.0, - longitude: 0.0, - altitude: 0.0, - value: 4.0, - }, - MapPoint { - latitude: 10.0, - longitude: 10.0, - altitude: 0.0, - value: 5.0, - }, - MapPoint { - latitude: 10.0, - longitude: 20.0, - altitude: 0.0, - value: 6.0, - }, - MapPoint { - latitude: 5.0, - longitude: 0.0, - altitude: 0.0, - value: 1.0, - }, - MapPoint { - latitude: 10.0, - longitude: 25.0, - altitude: 0.0, - value: 6.0, - }, - ]; - map_merge3d_mut(&mut lhs, &rhs); - assert_eq!(&lhs, &expected); + assert!(is_new_tec_plane( + "1 START OF TEC MAP" + )); + assert!(!is_new_tec_plane( + "1 START OF RMS MAP" + )); + assert!(is_new_rms_plane( + "1 START OF RMS MAP" + )); + // assert_eq!( + // is_new_height_map( + // "1 START OF HEIGHT MAP" + // ), + // true + // ); } + //#[test] + //fn test_merge_map2d() { + //} } impl Merge for Record { @@ -388,31 +277,21 @@ impl Merge for Record { } /// Merges `rhs` into `Self` fn merge_mut(&mut self, rhs: &Self) -> Result<(), merge::Error> { - for (epoch, maps) in rhs { - let (tec, rms, h) = maps; - if let Some(lhs_maps) = self.get_mut(epoch) { - let (lhs_tec, lhs_rms, lhs_h) = lhs_maps; - - map_merge3d_mut(&mut lhs_tec.to_vec(), tec); - - if let Some(map) = rms { - if let Some(lhs_map) = lhs_rms { - map_merge3d_mut(&mut lhs_map.to_vec(), map); - } else { - *lhs_rms = Some(map.to_vec()); // RMS map now provided - } - } - - if let Some(map) = h { - if let Some(lhs_map) = lhs_h { - map_merge3d_mut(&mut lhs_map.to_vec(), map); + for (eh, plane) in rhs { + if let Some(lhs_plane) = self.get_mut(eh) { + for (latlon, plane) in plane { + if let Some(tec) = lhs_plane.get_mut(latlon) { + if let Some(rms) = plane.rms { + if tec.rms.is_none() { + tec.rms = Some(rms); + } + } } else { - *lhs_h = Some(map.to_vec()); // H map now provided + lhs_plane.insert(*latlon, plane.clone()); } } } else { - // new epoch - self.insert(*epoch, (tec.to_vec(), rms.clone(), h.clone())); + self.insert(*eh, plane.clone()); } } Ok(()) @@ -421,27 +300,27 @@ impl Merge for Record { impl Split for Record { fn split(&self, epoch: Epoch) -> Result<(Self, Self), split::Error> { - let r0 = self + let before = self .iter() - .flat_map(|(k, v)| { - if *k < epoch { - Some((k.clone(), v.clone())) + .flat_map(|((e, h), plane)| { + if *e < epoch { + Some(((*e, *h), plane.clone())) } else { None } }) .collect(); - let r1 = self + let after = self .iter() - .flat_map(|(k, v)| { - if *k >= epoch { - Some((k.clone(), v.clone())) + .flat_map(|((e, h), plane)| { + if *e >= epoch { + Some(((*e, *h), plane.clone())) } else { None } }) .collect(); - Ok((r0, r1)) + Ok((before, after)) } fn split_dt(&self, _duration: Duration) -> Result, split::Error> { Ok(Vec::new()) @@ -461,27 +340,27 @@ impl Mask for Record { fn mask_mut(&mut self, mask: MaskFilter) { match 
mask.operand { MaskOperand::Equals => match mask.item { - TargetItem::EpochItem(epoch) => self.retain(|e, _| *e == epoch), + TargetItem::EpochItem(epoch) => self.retain(|(e, _), _| *e == epoch), _ => {}, // TargetItem:: does not apply }, MaskOperand::NotEquals => match mask.item { - TargetItem::EpochItem(epoch) => self.retain(|e, _| *e != epoch), + TargetItem::EpochItem(epoch) => self.retain(|(e, _), _| *e != epoch), _ => {}, // TargetItem:: does not apply }, MaskOperand::GreaterEquals => match mask.item { - TargetItem::EpochItem(epoch) => self.retain(|e, _| *e >= epoch), + TargetItem::EpochItem(epoch) => self.retain(|(e, _), _| *e >= epoch), _ => {}, // TargetItem:: does not apply }, MaskOperand::GreaterThan => match mask.item { - TargetItem::EpochItem(epoch) => self.retain(|e, _| *e > epoch), + TargetItem::EpochItem(epoch) => self.retain(|(e, _), _| *e > epoch), _ => {}, // TargetItem:: does not apply }, MaskOperand::LowerEquals => match mask.item { - TargetItem::EpochItem(epoch) => self.retain(|e, _| *e <= epoch), + TargetItem::EpochItem(epoch) => self.retain(|(e, _), _| *e <= epoch), _ => {}, // TargetItem:: does not apply }, MaskOperand::LowerThan => match mask.item { - TargetItem::EpochItem(epoch) => self.retain(|e, _| *e < epoch), + TargetItem::EpochItem(epoch) => self.retain(|(e, _), _| *e < epoch), _ => {}, // TargetItem:: does not apply }, } diff --git a/rinex/src/leap.rs b/rinex/src/leap.rs index ae1b6be1f..2a90e8427 100644 --- a/rinex/src/leap.rs +++ b/rinex/src/leap.rs @@ -59,7 +59,7 @@ impl std::str::FromStr for Leap { match items.len() > 2 { false => { // [1] simple format: basic - ls.leap = u32::from_str_radix(items[0].trim(), 10)? + ls.leap = items[0].trim().parse::()?; }, true => { // [2] complex format: advanced infos @@ -68,10 +68,10 @@ impl std::str::FromStr for Leap { let (week, rem) = rem.split_at(5); let (day, rem) = rem.split_at(5); let system = rem.trim(); - ls.leap = u32::from_str_radix(leap.trim(), 10)?; - ls.delta_tls = Some(u32::from_str_radix(tls.trim(), 10)?); - ls.week = Some(u32::from_str_radix(week.trim(), 10)?); - ls.day = Some(u32::from_str_radix(day.trim(), 10)?); + ls.leap = leap.trim().parse::()?; + ls.delta_tls = Some(tls.trim().parse::()?); + ls.week = Some(week.trim().parse::()?); + ls.day = Some(day.trim().parse::()?); if system.eq("") { ls.timescale = None } else { @@ -91,7 +91,7 @@ mod test { fn basic_format() { let content = "18"; let leap = Leap::from_str(content); - assert_eq!(leap.is_ok(), true); + assert!(leap.is_ok()); let leap = leap.unwrap(); assert_eq!(leap.leap, 18); } @@ -99,7 +99,7 @@ mod test { fn standard_format() { let content = "18 18 2185 7"; let leap = Leap::from_str(content); - assert_eq!(leap.is_ok(), true); + assert!(leap.is_ok()); let leap = leap.unwrap(); assert_eq!(leap.leap, 18); assert_eq!(leap.week, Some(2185)); @@ -109,7 +109,7 @@ mod test { fn parse_with_timescale() { let content = "18 18 2185 7GPS"; let leap = Leap::from_str(content); - assert_eq!(leap.is_ok(), true); + assert!(leap.is_ok()); let leap = leap.unwrap(); assert_eq!(leap.leap, 18); assert_eq!(leap.week, Some(2185)); diff --git a/rinex/src/lib.rs b/rinex/src/lib.rs index de93b605b..519f20b0e 100644 --- a/rinex/src/lib.rs +++ b/rinex/src/lib.rs @@ -1,6 +1,7 @@ #![doc(html_logo_url = "https://raw.githubusercontent.com/georust/meta/master/logo/logo.png")] #![doc = include_str!("../README.md")] #![cfg_attr(docrs, feature(doc_cfg))] +#![allow(clippy::type_complexity)] pub mod antex; pub mod carrier; @@ -52,13 +53,14 @@ use std::collections::{BTreeMap, 
HashMap}; use thiserror::Error; use hifitime::Duration; +use ionex::TECPlane; use observable::Observable; use observation::Crinex; use version::Version; /// Package to include all basic structures pub mod prelude { - pub use crate::constellation::{Augmentation, Constellation}; + pub use crate::constellation::Constellation; pub use crate::epoch::EpochFlag; pub use crate::ground_position::GroundPosition; pub use crate::header::Header; @@ -116,6 +118,46 @@ macro_rules! hourly_session { #[cfg(docrs)] pub use bibliography::Bibliography; +/* + * returns true if given line is a comment + */ +pub(crate) fn is_rinex_comment(content: &str) -> bool { + content.len() > 60 && content.trim_end().ends_with("COMMENT") +} + +/* + * macro to format one header line or a comment + */ +pub(crate) fn fmt_rinex(content: &str, marker: &str) -> String { + if content.len() < 60 { + format!("{: String { + fmt_rinex(content, "COMMENT") +} + #[derive(Clone, Default, Debug, PartialEq)] /// `Rinex` describes a `RINEX` file, it comprises a [Header] section, /// and a [record::Record] file body. @@ -335,12 +377,12 @@ impl Rinex { /// IONEX specific filename convention fn ionex_filename(&self) -> String { let mut ret: String = "ccc".to_string(); // 3 figue Analysis center - ret.push_str("e"); // extension or region code "G" for global ionosphere maps + ret.push('e'); // extension or region code "G" for global ionosphere maps ret.push_str("ddd"); // day of the year of first record - ret.push_str("h"); // file sequence number (1,2,...) or hour (A, B.., Z) within day + ret.push('h'); // file sequence number (1,2,...) or hour (A, B.., Z) within day ret.push_str("yy"); // 2 digit year - ret.push_str("I"); // ionex - //ret.to_uppercase(); //TODO + ret.push('I'); // ionex + //ret.to_uppercase(); //TODO ret } @@ -418,7 +460,7 @@ impl Rinex { //NB - _FFU is omitted for files containing navigation data let uf = String::from("Z"); let c: String = match header.constellation { - Some(c) => c.to_1_letter_code().to_uppercase(), + Some(c) => format!("{:x}", c).to_uppercase(), _ => String::from("X"), }; let t: String = match rtype { @@ -522,56 +564,6 @@ impl Rinex { } } - /// Converts 2D Ionex to 3D ionex by - /// providing some height maps. 
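The header formatting helpers introduced above pad the content field so that the label ("COMMENT", or any other header marker) starts at column 60, and long comments are wrapped every 60 bytes (see the `fmt_wrapped_comments` test further below). A minimal sketch consistent with those tests, not the crate's exact implementation:

```rust
/// Pads a header content field to 60 columns and appends the marker,
/// so the marker starts at column 60 (sketch, consistent with the tests below).
fn fmt_rinex(content: &str, marker: &str) -> String {
    format!("{:<60}{}", content, marker)
}

/// Wraps a comment every 60 bytes, one "COMMENT" line per chunk.
/// Assumes ASCII content; non-UTF8 chunk boundaries are simply dropped here.
fn fmt_comment(content: &str) -> String {
    content
        .as_bytes()
        .chunks(60)
        .map(|chunk| fmt_rinex(std::str::from_utf8(chunk).unwrap_or(""), "COMMENT"))
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let line = fmt_comment("just a basic comment");
    assert!(line.len() >= 60);
    assert_eq!(line.find("COMMENT"), Some(60));
}
```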
- pub fn with_height_maps(&self, height: BTreeMap) -> Self { - let mut s = self.clone(); - s.to_ionex_3d(height); - s - } - - /// Add RMS maps to self, for epochs - /// where such map was not previously provided - pub fn with_rms_maps(&self, rms: BTreeMap) -> Self { - let mut s = self.clone(); - if let Some(r) = s.record.as_mut_ionex() { - for (e, (_, rms_map, _)) in r.iter_mut() { - if let Some(m) = rms.get(e) { - *rms_map = Some(m.to_vec()); - } - } - } - s - } - - /// Provide Height maps for epochs where such map was not previously provided - pub fn to_ionex_3d(&mut self, height: BTreeMap) { - if let Some(ionex) = self.header.ionex.as_mut() { - ionex.map_dimension = 3; - } - if let Some(r) = self.record.as_mut_ionex() { - for (e, (_, _, map_h)) in r.iter_mut() { - if let Some(m) = height.get(e) { - *map_h = Some(m.to_vec()); - } - } - } - } - - /// Returns ionex map borders, as North Eastern - /// and South Western latitude longitude coordinates, - /// expressed in ddeg° - pub fn ionex_map_borders(&self) -> Option<((f64, f64), (f64, f64))> { - if let Some(params) = &self.header.ionex { - Some(( - (params.grid.latitude.start, params.grid.longitude.start), - (params.grid.latitude.end, params.grid.longitude.end), - )) - } else { - None - } - } - /// Returns true if this is a METEO RINEX pub fn is_meteo_rinex(&self) -> bool { self.header.rinex_type == types::Type::MeteoData @@ -602,7 +594,6 @@ impl Rinex { .count() > 0 } - /// Returns true if Antenna Phase Center variations are compensated /// for in this file. Useful for high precision application. pub fn pcv_compensation(&self, constellation: Constellation) -> bool { @@ -617,60 +608,30 @@ impl Rinex { /// meaning, this file is the combination of two RINEX files merged together. /// This is determined by the presence of a custom yet somewhat standardized `FILE MERGE` comments pub fn is_merged(&self) -> bool { - for (_, content) in self.comments.iter() { - for c in content { - if c.contains("FILE MERGE") { - return true; - } + let special_comment = String::from("FILE MERGE"); + for comment in self.header.comments.iter() { + if comment.contains(&special_comment) { + return true; } } false } - //TODO: move to ObsverationIter - // /// Returns [`Epoch`]s where a loss of lock event happened. - // /// This is only relevant on OBS RINEX. - // pub fn epoch_lock_loss(&self) -> Vec { - // self.lli_and_mask(observation::LliFlags::LOCK_LOSS).epoch() - // } - /// Removes all observations where receiver phase lock was lost. /// This is only relevant on OBS RINEX. pub fn lock_loss_filter_mut(&mut self) { self.lli_and_mask_mut(observation::LliFlags::LOCK_LOSS) } - pub fn retain_best_elevation_angles_mut(&mut self) { - unimplemented!("retain_best_elev: use preprocessing toolkit instead"); - //let best_vehicles = self.space_vehicles_best_elevation_angle(); - //if let Some(record) = self.record.as_mut_nav() { - // record.retain(|e, classes| { - // let best = best_vehicles.get(e).unwrap(); - // classes.retain(|class, frames| { - // if *class == navigation::FrameClass::Ephemeris { - // frames.retain(|fr| { - // let (_, sv, _) = fr.as_eph().unwrap(); - // best.contains(sv) - // }); - // frames.len() > 0 - // } else { - // false - // } - // }); - // classes.len() > 0 - // }); - //} - } - /// List [clocks::record::System] (reference systems) contained in this CLK RINEX. /// Reference systems can either be an Sv or a ground station. 
pub fn clock_ref_systems(&self) -> Vec { let mut map: Vec = Vec::new(); if let Some(r) = self.record.as_clock() { - for (_, dtypes) in r { - for (_dtype, systems) in dtypes { - for (system, _) in systems { - if !map.contains(&system) { + for dtypes in r.values() { + for systems in dtypes.values() { + for system in systems.keys() { + if !map.contains(system) { map.push(system.clone()); } } @@ -686,9 +647,9 @@ impl Rinex { pub fn clock_ref_stations(&self) -> Vec { let mut ret: Vec = Vec::with_capacity(32); if let Some(r) = self.record.as_clock() { - for (_, dtypes) in r { - for (_, systems) in dtypes { - for (system, _) in systems { + for dtypes in r.values() { + for systems in dtypes.values() { + for system in systems.keys() { if let clocks::System::Station(station) = system { if !ret.contains(station) { ret.push(station.clone()); @@ -730,101 +691,6 @@ impl Rinex { c.lli_and_mask_mut(mask); c } - - /// Extracts signal strength as (min, max) duplet, - /// accross all vehicles. - /// Only relevant on Observation RINEX. - pub fn observation_ssi_minmax(&self) -> Option<(observation::Snr, observation::Snr)> { - let mut ret: Option<(observation::Snr, observation::Snr)> = None; - if let Some(r) = self.record.as_obs() { - for (_, (_, vehicles)) in r.iter() { - for (_, observation) in vehicles.iter() { - for (_, data) in observation.iter() { - if let Some(snr) = data.snr { - if let Some((min, max)) = &mut ret { - if snr < *min { - *min = snr; - } else if snr > *max { - *max = snr; - } - } - } - } - } - } - } - ret - } - - /// Extracts signal strength as (min, max) duplet, - /// per vehicle. Only relevant on Observation RINEX - pub fn observation_ssi_sv_minmax(&self) -> HashMap { - let mut map: HashMap = HashMap::new(); - if let Some(r) = self.record.as_obs() { - for (_, (_, vehicles)) in r.iter() { - for (sv, observations) in vehicles.iter() { - let (mut min, mut max) = (observation::Snr::DbHz54, observation::Snr::DbHz0); - for (_, observation) in observations.iter() { - if let Some(ssi) = observation.snr { - min = std::cmp::min(min, ssi); - max = std::cmp::max(max, ssi); - } - } - map.insert(*sv, (min, max)); - } - } - } - map - } - - /* - /// Applies given elevation mask - pub fn elevation_mask_mut( - &mut self, - mask: navigation::ElevationMask, - ref_pos: Option<(f64, f64, f64)>, - ) { - let ref_pos = match ref_pos { - Some(ref_pos) => ref_pos, - _ => self.header.coords.expect( - "can't apply an elevation mask when ground/ref position is unknown. - Specify one yourself with `ref_pos`", - ), - }; - if let Some(r) = self.record.as_mut_nav() { - r.retain(|epoch, classes| { - classes.retain(|class, frames| { - if *class == navigation::FrameClass::Ephemeris { - frames.retain(|fr| { - let (_, _, ephemeris) = fr.as_eph().unwrap(); - if let Some((el, _)) = ephemeris.sat_elev_azim(*epoch, ref_pos) { - mask.fits(el) - } else { - false - } - }); - frames.len() > 0 - } else { - // not an EPH - true // keep it anyway - } - }); - classes.len() > 0 - }) - } - } - */ - /* - pub fn elevation_mask( - &self, - mask: navigation::ElevationMask, - ref_pos: Option<(f64, f64, f64)>, - ) -> Self { - let mut s = self.clone(); - s.elevation_mask_mut(mask, ref_pos); - s - } - */ /// Aligns Phase observations at origin pub fn observation_phase_align_origin_mut(&mut self) { let mut init_phases: HashMap> = HashMap::new(); @@ -1125,30 +991,6 @@ impl Rinex { results } */ - /* - /// Applies Hatch filter to all Pseudo Range observations. 
- /// When feasible dual frequency dual code method is prefered - /// for optimal, fully unbiased smoothed PR. - /// PR observations get modified in place - pub fn observation_pseudorange_smoothing_mut(&mut self) { - if let Some(r) = self.record.as_mut_obs() { - for ((epoch, _), (_, svs)) in r { - for (sv, observations) in svs { - for (code, observation) in observations { - - } - } - } - } - } - - pub fn observation_pseudorange_smoothing(&self) -> Self { - let mut s = self.clone(); - s.observation_pseudorange_smoothing(); - s - } - */ - /* /// Returns epochs where a so called "cycle slip" has been confirmed. /// We confirm a cycle slip by computing the double difference @@ -1268,7 +1110,7 @@ impl Rinex { /// ``` pub fn dominant_sample_rate(&self) -> Option { self.sampling_histogram() - .max_by(|(_, x_pop), (_, y_pop)| x_pop.cmp(y_pop)) + .max_by(|(_, pop_i), (_, pop_j)| pop_i.cmp(pop_j)) .map(|dominant| dominant.0) } /// Histogram analysis on Epoch interval. Although @@ -1394,6 +1236,7 @@ impl Rinex { * These methods are used to browse data easily and efficiently. * It includes Format dependent extraction methods : one per format. */ +use crate::navigation::NavFrame; use itertools::Itertools; // .unique() use observation::ObservationData; @@ -1408,7 +1251,7 @@ impl Rinex { } else if let Some(r) = self.record.as_clock() { Box::new(r.iter().map(|(k, _)| *k)) } else if let Some(r) = self.record.as_ionex() { - Box::new(r.iter().map(|(k, _)| *k)) + Box::new(r.iter().map(|((k, _), _)| *k)) } else { panic!( "cannot get an epoch iterator for \"{:?}\" RINEX", @@ -1468,7 +1311,7 @@ impl Rinex { // grab all vehicles through all epochs, // fold them into a unique list record - .into_iter() + .iter() .flat_map(|(_, frames)| { frames .iter() @@ -1758,10 +1601,41 @@ impl Rinex { .flat_map(|record| record.iter()), ) } + /// Returns Navigation Data interator (any type of message). + /// NAV records may contain several different types of frames. + /// You should prefer narrowed down methods, like [ephemeris] or + /// [ionosphere_models] but those require the "nav" feature. + /// ``` + /// use rinex::prelude::*; + /// use rinex::navigation::NavMsgType; + /// let rinex = Rinex::from_file("../test_resources/NAV/V2/amel0010.21g") + /// .unwrap(); + /// for (epoch, nav_frames) in rinex.navigation() { + /// for frame in nav_frames { + /// // this record only contains ephemeris frames + /// assert!(frame.as_eph().is_some()); + /// assert!(frame.as_ion().is_none()); + /// assert!(frame.as_eop().is_none()); + /// assert!(frame.as_sto().is_none()); + /// if let Some((msg, sv, data)) = frame.as_eph() { + /// // this record only contains legacy frames + /// assert_eq!(msg, NavMsgType::LNAV); + /// } + /// } + /// } + /// ``` + pub fn navigation(&self) -> Box)> + '_> { + Box::new( + self.record + .as_nav() + .into_iter() + .flat_map(|record| record.iter()), + ) + } } #[cfg(feature = "obs")] -use crate::observation::Snr; +use crate::observation::{LliFlags, Snr}; /* * OBS RINEX specific methods: only available on crate feature. @@ -1793,6 +1667,8 @@ impl Rinex { }) })) } + /// Returns a Unique Iterator over signal Codes, like "1C" or "1P" + /// for precision code. 
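`dominant_sample_rate()` above simply picks the histogram entry with the largest population. The same selection, sketched over a plain `HashMap` with sampling intervals reduced to integer seconds (a simplification for brevity, the crate uses `Duration` keys):

```rust
use std::collections::HashMap;

/// Picks the sampling interval (here: integer seconds) with the largest
/// population, mirroring the max_by() selection used above.
fn dominant_sample_rate(histogram: &HashMap<u64, usize>) -> Option<u64> {
    histogram
        .iter()
        .max_by(|(_, pop_i), (_, pop_j)| pop_i.cmp(pop_j))
        .map(|(interval, _)| *interval)
}

fn main() {
    let histogram = HashMap::from([(30, 120), (31, 2), (60, 5)]);
    assert_eq!(dominant_sample_rate(&histogram), Some(30));
}
```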
pub fn code(&self) -> Box + '_> { Box::new( self.observation() @@ -1965,6 +1841,26 @@ impl Rinex { }) })) } + /// Returns an Iterator over pseudo range observations in valid + /// Epochs, with valid LLI flags + pub fn pseudo_range_ok(&self) -> Box + '_> { + Box::new(self.observation().flat_map(|((e, flag), (_, vehicles))| { + vehicles.iter().flat_map(|(sv, observations)| { + observations.iter().filter_map(|(obs, obsdata)| { + if obs.is_pseudorange_observable() { + if flag.is_ok() { + Some((*e, *sv, obs, obsdata.obs)) + } else { + None + } + } else { + None + } + }) + }) + })) + } + /// Returns an Iterator over fractional pseudo range observations pub fn pseudo_range_fract( &self, @@ -2045,6 +1941,25 @@ impl Rinex { } /// Returns an Iterator over signal SNR indications. /// All observation that did not come with such indication are filtered out. + /// ``` + /// use rinex::*; + /// let rinex = + /// Rinex::from_file("../test_resources/OBS/V3/ALAC00ESP_R_20220090000_01D_30S_MO.rnx") + /// .unwrap(); + /// for ((e, flag), sv, observable, snr) in rinex.snr() { + /// // See RINEX specs or [Snr] documentation + /// if snr.weak() { + /// } else if snr.strong() { + /// } else if snr.excellent() { + /// } + /// // you can directly compare to dBHz + /// if snr < 29.0.into() { + /// // considered weak signal + /// } else if snr >= 30.0.into() { + /// // considered strong signal + /// } + /// } + /// ``` pub fn snr(&self) -> Box + '_> { Box::new(self.observation().flat_map(|(e, (_, vehicles))| { vehicles.iter().flat_map(|(sv, observations)| { @@ -2054,6 +1969,33 @@ impl Rinex { }) })) } + /// Returns an Iterator over LLI flags that might be associated to an Observation. + /// ``` + /// use rinex::*; + /// use rinex::observation::LliFlags; + /// let rinex = + /// Rinex::from_file("../test_resources/OBS/V3/ALAC00ESP_R_20220090000_01D_30S_MO.rnx") + /// .unwrap(); + /// let custom_mask + /// = LliFlags::OK_OR_UNKNOWN | LliFlags::UNDER_ANTI_SPOOFING; + /// for ((e, flag), sv, observable, lli) in rinex.lli() { + /// // See RINEX specs or [LliFlags] documentation + /// if lli.intersects(custom_mask) { + /// // sane observation but under AS + /// } + /// } + /// ``` + pub fn lli( + &self, + ) -> Box + '_> { + Box::new(self.observation().flat_map(|(e, (_, vehicles))| { + vehicles.iter().flat_map(|(sv, observations)| { + observations + .iter() + .filter_map(|(obs, obsdata)| obsdata.lli.map(|lli| (*e, *sv, obs, lli))) + }) + })) + } /// Returns an Iterator over "complete" Epochs. 
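A possible use of the `pseudo_range_ok()` iterator added above. The item layout `(Epoch, Sv, &Observable, f64)` is assumed from the snippet, and the method sits behind the crate's `obs` feature:

```rust
use rinex::prelude::*;

/// Averages all pseudo range observations taken under a valid epoch flag,
/// using the pseudo_range_ok() iterator introduced above.
fn mean_pseudo_range(rnx: &Rinex) -> Option<f64> {
    let (count, sum) = rnx
        .pseudo_range_ok()
        .fold((0_usize, 0.0_f64), |(n, s), (_t, _sv, _observable, pr)| {
            (n + 1, s + pr)
        });
    if count > 0 {
        Some(sum / count as f64)
    } else {
        None
    }
}
```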
/// "Complete" Epochs are Epochs were both Phase and Pseudo Range /// observations are present on two carriers, sane sampling conditions are met @@ -2064,46 +2006,50 @@ impl Rinex { ) -> Box)> + '_> { Box::new( self.observation() - .filter_map(|((e, flag), (_, vehicles))| { + .filter_map(move |((e, flag), (_, vehicles))| { if flag.is_ok() { let mut list: Vec<(Sv, Carrier)> = Vec::new(); for (sv, observables) in vehicles { let mut l1_pr_ph = (false, false); let mut lx_pr_ph: HashMap = HashMap::new(); - let mut criteria_met = true; for (observable, observation) in observables { if !observable.is_phase_observable() && !observable.is_pseudorange_observable() { continue; // not interesting here } - let carrier_code = &observable.to_string()[1..2]; + //let carrier_code = &observable.to_string()[1..2]; let carrier = Carrier::from_observable(sv.constellation, observable); if carrier.is_err() { // fail to identify this signal continue; } + if let Some(min_snr) = min_snr { + if let Some(snr) = observation.snr { + if snr < min_snr { + continue; + } + } else { + continue; // can't compare to criteria + } + } let carrier = carrier.unwrap(); if carrier == Carrier::L1 { l1_pr_ph.0 |= observable.is_pseudorange_observable(); l1_pr_ph.1 |= observable.is_phase_observable(); - } else { - if let Some((lx_pr, lx_ph)) = lx_pr_ph.get_mut(&carrier) { - *lx_pr |= observable.is_pseudorange_observable(); - *lx_ph |= observable.is_phase_observable(); - } else { - if observable.is_pseudorange_observable() { - lx_pr_ph.insert(carrier, (true, false)); - } else if observable.is_phase_observable() { - lx_pr_ph.insert(carrier, (false, true)); - } - } + } else if let Some((lx_pr, lx_ph)) = lx_pr_ph.get_mut(&carrier) { + *lx_pr |= observable.is_pseudorange_observable(); + *lx_ph |= observable.is_phase_observable(); + } else if observable.is_pseudorange_observable() { + lx_pr_ph.insert(carrier, (true, false)); + } else if observable.is_phase_observable() { + lx_pr_ph.insert(carrier, (false, true)); } } if l1_pr_ph == (true, true) { for (carrier, (pr, ph)) in lx_pr_ph { - if pr == true && ph == true { + if pr && ph { list.push((*sv, carrier)); } } @@ -2114,14 +2060,14 @@ impl Rinex { None } }) - .filter(|(sv, list)| !list.is_empty()), + .filter(|(_sv, list)| !list.is_empty()), ) } } #[cfg(feature = "nav")] use crate::navigation::{ - BdModel, EopMessage, Ephemeris, IonMessage, KbModel, NavFrame, NavMsgType, NgModel, StoMessage, + BdModel, EopMessage, Ephemeris, IonMessage, KbModel, NavMsgType, NgModel, StoMessage, }; //#[cfg(feature = "nav")] @@ -2138,35 +2084,6 @@ use map_3d::ecef2geodetic; #[cfg(feature = "nav")] #[cfg_attr(docrs, doc(cfg(feature = "nav")))] impl Rinex { - /// Returns NAV frames interator (any types). - /// NAV record may contain several different types of frames. 
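The "complete" Epoch criterion rewritten above boils down to simple boolean bookkeeping: both pseudo range and phase must be observed on L1 and on at least one other carrier. A dependency-free sketch of that bookkeeping; the `Carrier` enum below is a simplified stand-in for the crate's own type:

```rust
use std::collections::HashMap;

#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
enum Carrier {
    L1,
    L2,
    L5,
} // simplified stand-in for the crate's Carrier type

/// Returns the carriers (other than L1) on which both pseudo range and
/// phase were observed, provided L1 itself is complete: the same rule
/// as the complete-epoch filter above.
/// Input: (carrier, is_pseudo_range, is_phase) per observation.
fn complete_carriers(observations: &[(Carrier, bool, bool)]) -> Vec<Carrier> {
    let mut l1_pr_ph = (false, false);
    let mut lx_pr_ph: HashMap<Carrier, (bool, bool)> = HashMap::new();
    for (carrier, is_pr, is_ph) in observations {
        if *carrier == Carrier::L1 {
            l1_pr_ph.0 |= *is_pr;
            l1_pr_ph.1 |= *is_ph;
        } else {
            let entry = lx_pr_ph.entry(*carrier).or_insert((false, false));
            entry.0 |= *is_pr;
            entry.1 |= *is_ph;
        }
    }
    if l1_pr_ph != (true, true) {
        return Vec::new();
    }
    lx_pr_ph
        .into_iter()
        .filter_map(|(carrier, (pr, ph))| if pr && ph { Some(carrier) } else { None })
        .collect()
}

fn main() {
    let observed = [
        (Carrier::L1, true, true),
        (Carrier::L2, true, true),
        (Carrier::L5, true, false), // phase missing: not complete
    ];
    assert_eq!(complete_carriers(&observed), vec![Carrier::L2]);
}
```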
- /// ``` - /// use rinex::prelude::*; - /// use rinex::navigation::NavMsgType; - /// let rinex = Rinex::from_file("../test_resources/NAV/V2/amel0010.21g") - /// .unwrap(); - /// for (epoch, nav_frames) in rinex.navigation() { - /// for frame in nav_frames { - /// // this record only contains ephemeris frames - /// assert!(frame.as_eph().is_some()); - /// assert!(frame.as_ion().is_none()); - /// assert!(frame.as_eop().is_none()); - /// assert!(frame.as_sto().is_none()); - /// if let Some((msg, sv, data)) = frame.as_eph() { - /// // this record only contains legacy frames - /// assert_eq!(msg, NavMsgType::LNAV); - /// } - /// } - /// } - /// ``` - pub fn navigation(&self) -> Box)> + '_> { - Box::new( - self.record - .as_nav() - .into_iter() - .flat_map(|record| record.iter()), - ) - } /// Returns a Unique Iterator over [`NavMsgType`]s that were identified /// ``` /// use rinex::prelude::*; @@ -2185,7 +2102,7 @@ impl Rinex { self.navigation() .map(|(_, frames)| { frames - .into_iter() + .iter() .filter_map(|fr| { if let Some((msg, _, _)) = fr.as_eph() { Some(msg) @@ -2240,41 +2157,51 @@ impl Rinex { }) })) } - /// Select Ephemeris data for given Sv at desired Epoch "t" - /// using closest TOE in time - pub fn sv_ephemeris(&self, sv: Sv, t: Epoch) -> Option<(Epoch, Ephemeris)> { - /* ephemeris data for this sv */ - let ephemeris_toe = self.ephemeris().filter_map(|(toc, (_, svnn, eph))| { - if *svnn == sv { - let ts = sv.constellation.timescale()?; - if let Some(toe) = eph.toe(ts) { - Some((toc, eph, toe)) + /// Ephemeris selection method. Use this method to select Ephemeris + /// to be used in "sv" navigation at "t" instant. Returns (toe and ephemeris frame). + pub fn sv_ephemeris(&self, sv: Sv, t: Epoch) -> Option<(Epoch, &Ephemeris)> { + /* + * minimize self.ephemeris with closest toe to t + * with toe <= t + * and t < toe + max dtoe + * TODO + * = match msg { + NavMsgType::CNAV => { + /* in CNAV : specs says toc is toe actually */ + // TODO Some(toc.in_time_scale(ts)) + None + }, + _ => { + /* determine toe */ + eph.toe(ts) + }, + }; + //TODO : this fails at this point + // on both GLONASS and SBAS + // therfore, kills rtk with these two constellations + let toe = toe?; + let dt = t - toe; + let max_dtoe = Ephemeris::max_dtoe(svnn.constellation)?; + // = None; - - for (toc, ephemeris, toe) in ephemeris_toe { - let dt = t - toe; - if dt <= max_dtoe { - // allowed - if let Some((ref mut ttoc, ref mut ep, ref mut ddt)) = ret.as_mut() { - *ep = ephemeris.clone(); - *ddt = dt; - *ttoc = *toc; - } else { - ret = Some((*toc, ephemeris.clone(), dt)); - } - } - } - let (toc, ephemeris, _) = ret?; - Some((toc, ephemeris)) + }) + .min_by_key(|(toe_i, _)| (t - *toe_i).abs()) } /// Returns an Iterator over Sv (embedded) clock offset (s), drift (s.s⁻¹) and /// drift rate (s.s⁻²) @@ -2295,30 +2222,6 @@ impl Rinex { .map(|(e, (_, sv, data))| (*e, *sv, data.sv_clock())), ) } - /// Returns Sv clock bias, at desired t_tx that should be expressed - /// in correct timescale. 
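The new `sv_ephemeris()` above selects the frame whose time of ephemeris (ToE) is closest to the requested instant, inside the constellation's validity window. A sketch of that selection rule over plain (ToE, frame id) candidates, using only hifitime calls already relied on in this hunk; the real method walks the navigation record instead of a slice:

```rust
use hifitime::{Duration, Epoch, Unit};

/// Among candidate (toe, id) pairs, keep those whose ToE is not in the
/// future and not older than max_dtoe, then pick the one closest to "t":
/// the same rule as the ephemeris selection above.
fn select_ephemeris(
    candidates: &[(Epoch, u32)],
    t: Epoch,
    max_dtoe: Duration,
) -> Option<(Epoch, u32)> {
    candidates
        .iter()
        .filter(|(toe, _)| *toe <= t && t < *toe + max_dtoe)
        .min_by_key(|(toe, _)| (t - *toe).abs())
        .copied()
}

fn main() {
    let t = Epoch::from_gregorian_utc(2021, 1, 1, 12, 0, 0, 0);
    let toe_a = t - 30.0 * Unit::Minute;
    let toe_b = t - 90.0 * Unit::Minute;
    let best = select_ephemeris(&[(toe_b, 1), (toe_a, 2)], t, Duration::from_seconds(7200.0));
    assert_eq!(best, Some((toe_a, 2)));
}
```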
- pub fn sv_clock_bias(&self, sv: Sv, t_tx: Epoch) -> Option { - let (toc, ephemeris) = self.sv_ephemeris(sv, t_tx)?; - let (a0, a1, a2) = ephemeris.sv_clock(); - match sv.constellation { - Constellation::Glonass => { - //GLONASST not supported - //let ts = sv.constellation.timescale()?; - //let toe = ephemeris.toe()?; - //let dt = (t_tx - toe).to_seconds(); - //Some(Duration::from_seconds(-a0 + a1 * dt)) - None - }, - Constellation::Geo | Constellation::SBAS(_) => { - let dt = (t_tx - toc).to_seconds(); - Some(Duration::from_seconds(a0 + a1 * dt + a2 * dt.powi(2))) - }, - _ => { - let dt = (t_tx - toc).to_seconds(); - Some(Duration::from_seconds(a0 + a1 * dt + a2 * dt.powi(2))) - }, - } - } /// Returns an Iterator over Sv position vectors, /// expressed in km ECEF for all Epochs. /// ``` @@ -2899,12 +2802,10 @@ impl Rinex { /// ``` pub fn hail_detected(&self) -> bool { if let Some(r) = self.record.as_meteo() { - for (_, observables) in r { + for observables in r.values() { for (observ, value) in observables { - if *observ == Observable::HailIndicator { - if *value > 0.0 { - return true; - } + if *observ == Observable::HailIndicator && *value > 0.0 { + return true; } } } @@ -2925,34 +2826,14 @@ impl Merge for Rinex { /// Merges `rhs` into `Self` in place fn merge_mut(&mut self, rhs: &Self) -> Result<(), merge::Error> { self.header.merge_mut(&rhs.header)?; + if self.epoch().count() == 0 { + // lhs is empty : overwrite + self.record = rhs.record.clone(); + } else if rhs.epoch().count() != 0 { + // real merge + self.record.merge_mut(&rhs.record)?; + } Ok(()) - //TODO: needs to reapply - //if self.epoch().len() == 0 { - // // self is empty - // self.record = rhs.record.clone(); - // Ok(()) - //} else if rhs.epoch().len() == 0 { - // // nothing to merge - // Ok(()) - //} else { - // // add special marker, ts: YYYYDDMM HHMMSS UTC - // let now = hifitime::Epoch::now().expect("failed to retrieve system time"); - // let (y, m, d, hh, mm, ss, _) = now.to_gregorian_utc(); - // self.header.comments.push(format!( - // "rustrnx-{:<20} FILE MERGE {}{}{} {}{}{} {}", - // env!("CARGO_PKG_VERSION"), - // y + 1900, - // m, - // d, - // hh, - // mm, - // ss, - // now.time_scale - // )); - // // RINEX record merging - // self.record.merge_mut(&rhs.record)?; - // Ok(()) - //} } } @@ -3152,6 +3033,7 @@ impl Dcb for Rinex { use observation::Combine; #[cfg(feature = "obs")] +#[cfg_attr(docrs, doc(cfg(feature = "obs")))] impl Combine for Rinex { fn geo_free( &self, @@ -3195,6 +3077,7 @@ impl Combine for Rinex { use observation::IonoDelay; #[cfg(feature = "obs")] +#[cfg_attr(docrs, doc(cfg(feature = "obs")))] impl IonoDelay for Rinex { fn iono_delay( &self, @@ -3208,14 +3091,138 @@ impl IonoDelay for Rinex { } } +/* + * IONEX specific feature + */ +#[cfg(feature = "ionex")] +#[cfg_attr(docrs, doc(cfg(feature = "ionex")))] +impl Rinex { + /// Iterates over IONEX maps, per Epoch and altitude. + /// ``` + /// use rinex::prelude::*; + /// ``` + fn ionex(&self) -> Box + '_> { + Box::new( + self.record + .as_ionex() + .into_iter() + .flat_map(|record| record.iter()), + ) + } + /// Returns an iterator over TEC values exclusively. 
+ /// ``` + /// use rinex::prelude::*; + /// let rnx = Rinex::from_file("../test_resources/IONEX/V1/CKMG0020.22I.gz") + /// .unwrap(); + /// for (t, lat, lon, alt, tec) in rnx.tec() { + /// // t: Epoch + /// // lat: ddeg + /// // lon: ddeg + /// // alt: km + /// // tec: TECu (f64: properly scaled) + /// } + /// ``` + pub fn tec(&self) -> Box + '_> { + Box::new(self.ionex().flat_map(|((e, h), plane)| { + plane.iter().map(|((lat, lon), tec)| { + ( + *e, + *lat as f64 / 1000.0_f64, + *lon as f64 / 1000.0_f64, + *h as f64 / 100.0_f64, + tec.tec, + ) + }) + })) + } + /// Returns an iterator over TEC RMS exclusively + /// ``` + /// use rinex::prelude::*; + /// let rnx = Rinex::from_file("../test_resources/IONEX/V1/jplg0010.17i.gz") + /// .unwrap(); + /// for (t, lat, lon, alt, rms) in rnx.tec_rms() { + /// // t: Epoch + /// // lat: ddeg + /// // lon: ddeg + /// // alt: km + /// // rms|TECu| (f64) + /// } + /// ``` + pub fn tec_rms(&self) -> Box + '_> { + Box::new(self.ionex().flat_map(|((e, h), plane)| { + plane.iter().filter_map(|((lat, lon), tec)| { + tec.rms.map(|rms| { + ( + *e, + *lat as f64 / 1000.0_f64, + *lon as f64 / 1000.0_f64, + *h as f64 / 100.0_f64, + rms, + ) + }) + }) + })) + } + /// Returns 2D fixed altitude value, expressed in km, in case self is a 2D IONEX. + /// ``` + /// use rinex::prelude::*; + /// let rnx = Rinex::from_file("../test_resources/IONEX/V1/jplg0010.17i.gz") + /// .unwrap(); + /// assert_eq!(rnx.tec_fixed_altitude(), Some(450.0)); + /// + /// let rnx = Rinex::from_file("../test_resources/IONEX/V1/CKMG0020.22I.gz") + /// .unwrap(); + /// assert_eq!(rnx.tec_fixed_altitude(), Some(350.0)); + /// ``` + pub fn tec_fixed_altitude(&self) -> Option { + if self.is_ionex_2d() { + let header = self.header.ionex.as_ref()?; + Some(header.grid.height.start) + } else { + None + } + } + /// Returns altitude range of this 3D IONEX as {min, max} + /// both expressed in km. + pub fn tec_altitude_range(&self) -> Option<(f64, f64)> { + if self.is_ionex_3d() { + let header = self.header.ionex.as_ref()?; + Some((header.grid.height.start, header.grid.height.end)) + } else { + None + } + } + /// Returns 2D TEC plane at specified altitude and time. + /// Refer to the header.grid specification for its width and height. + pub fn tec_plane(&self, t: Epoch, h: f64) -> Option<&TECPlane> { + self.ionex() + .filter_map(|((e, alt), plane)| { + if t == *e && (*alt as f64) / 100.0 == h { + Some(plane) + } else { + None + } + }) + .reduce(|plane, _| plane) // is unique, in a normal IONEX + } + /// Returns IONEX map borders, expressed as North Eastern + /// and South Western (latitude; longitude) coordinates, + /// both expressed in ddeg. 
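A possible use of the `tec()` iterator documented above (behind the `ionex` feature); the item layout `(epoch, lat, lon, alt, tec)` is taken from the snippet:

```rust
use rinex::prelude::*;

/// Returns the strongest TEC value (TECu) found in the file,
/// using the tec() iterator introduced above.
fn tec_peak(rnx: &Rinex) -> Option<f64> {
    rnx.tec()
        .map(|(_t, _lat, _lon, _alt, tec)| tec)
        .reduce(f64::max)
}
```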
+ pub fn tec_map_borders(&self) -> Option<((f64, f64), (f64, f64))> { + let ionex = self.header.ionex.as_ref()?; + Some(( + (ionex.grid.latitude.start, ionex.grid.longitude.start), + (ionex.grid.latitude.end, ionex.grid.longitude.end), + )) + } +} + #[cfg(test)] mod test { use super::*; use std::str::FromStr; #[test] fn test_macros() { - assert_eq!(is_comment!("This is a comment COMMENT"), true); - assert_eq!(is_comment!("This is a comment"), false); let _ = sv!("G01"); let _ = sv!("R03"); let _ = gnss!("GPS"); @@ -3233,4 +3240,39 @@ mod test { assert_eq!(hourly_session!(5), "f"); assert_eq!(hourly_session!(23), "x"); } + use crate::{fmt_comment, is_rinex_comment}; + #[test] + fn fmt_comments_singleline() { + for desc in [ + "test", + "just a basic comment", + "just another lengthy comment blahblabblah", + ] { + let comment = fmt_comment(desc); + assert!( + comment.len() >= 60, + "comments should be at least 60 byte long" + ); + assert_eq!( + comment.find("COMMENT"), + Some(60), + "comment marker should located @ 60" + ); + assert!(is_rinex_comment(&comment), "should be valid comment"); + } + } + #[test] + fn fmt_wrapped_comments() { + for desc in ["just trying to form a very lengthy comment that will overflow since it does not fit in a single line", + "just trying to form a very very lengthy comment that will overflow since it does fit on three very meaningful lines. Imazdmazdpoakzdpoakzpdokpokddddddddddddddddddaaaaaaaaaaaaaaaaaaaaaaa"] { + let nb_lines = num_integer::div_ceil(desc.len(), 60); + let comments = fmt_comment(desc); + assert_eq!(comments.lines().count(), nb_lines); + for line in comments.lines() { + assert!(line.len() >= 60, "comment line should be at least 60 byte long"); + assert_eq!(line.find("COMMENT"), Some(60), "comment marker should located @ 60"); + assert!(is_rinex_comment(line), "should be valid comment"); + } + } + } } diff --git a/rinex/src/macros.rs b/rinex/src/macros.rs index 0f25d2e00..ea895c089 100644 --- a/rinex/src/macros.rs +++ b/rinex/src/macros.rs @@ -36,14 +36,6 @@ macro_rules! filter { }; } -/// Returns `true` if given `Rinex` line is a comment -#[macro_export] -macro_rules! is_comment { - ($line: expr) => { - $line.trim_end().ends_with("COMMENT") - }; -} - /// Builds a [crate::GroundPosition] in WGS84 #[macro_export] macro_rules! 
wgs84 { diff --git a/rinex/src/merge.rs b/rinex/src/merge.rs index 9f0217296..d2e49b20b 100644 --- a/rinex/src/merge.rs +++ b/rinex/src/merge.rs @@ -15,10 +15,12 @@ pub enum Error { IonexReferenceMismatch, #[error("cannot merge ionex with different grid definition")] IonexMapGridMismatch, - #[error("cannot merge ionex with different map dimensions")] + #[error("cannot merge ionex of different dimensions")] IonexMapDimensionsMismatch, #[error("cannot merge ionex where base radius differs")] IonexBaseRadiusMismatch, + #[error("failed to retrieve system time for merge ops date")] + HifitimeError(#[from] hifitime::Errors), } /* @@ -35,7 +37,7 @@ pub(crate) fn merge_mut_vec(lhs: &mut Vec, rhs: &Vec) { */ pub(crate) fn merge_mut_unique_vec(lhs: &mut Vec, rhs: &Vec) { for item in rhs { - if !lhs.contains(&item) { + if !lhs.contains(item) { lhs.push(item.clone()); } } @@ -49,9 +51,9 @@ pub(crate) fn merge_mut_unique_map2d>, ) { for (k, values) in rhs.iter() { - if let Some(vvalues) = lhs.get_mut(&k) { + if let Some(vvalues) = lhs.get_mut(k) { for value in values { - if !vvalues.contains(&value) { + if !vvalues.contains(value) { vvalues.push(value.clone()); } } diff --git a/rinex/src/meteo/record.rs b/rinex/src/meteo/record.rs index bdc884ad3..75e80f5bb 100644 --- a/rinex/src/meteo/record.rs +++ b/rinex/src/meteo/record.rs @@ -69,13 +69,13 @@ pub(crate) fn parse_epoch( let codes = &header.meteo.as_ref().unwrap().codes; let nb_codes = codes.len(); - let nb_lines: usize = num_integer::div_ceil(nb_codes, 8).into(); + let nb_lines: usize = num_integer::div_ceil(nb_codes, 8); let mut code_index: usize = 0; for i in 0..nb_lines { for _ in 0..8 { let code = &codes[code_index]; - let obs: Option = match f64::from_str(&line[offset..offset + 7].trim()) { + let obs: Option = match f64::from_str(line[offset..offset + 7].trim()) { Ok(f) => Some(f), Err(_) => None, }; @@ -126,13 +126,13 @@ pub(crate) fn fmt_epoch( if let Some(data) = data.get(obscode) { lines.push_str(&format!("{:7.1}", data)); } else { - lines.push_str(&format!(" ")); + lines.push_str(" "); } if (index % 8) == 0 { - lines.push_str("\n"); + lines.push('\n'); } } - lines.push_str("\n"); + lines.push('\n'); Ok(lines) } @@ -142,35 +142,35 @@ mod test { #[test] fn test_new_epoch() { let content = " 22 1 4 0 0 0 993.4 -6.8 52.9 1.6 337.0 0.0 0.0"; - assert_eq!( - is_new_epoch(content, version::Version { major: 2, minor: 0 }), - true - ); + assert!(is_new_epoch( + content, + version::Version { major: 2, minor: 0 } + )); let content = " 22 1 4 0 0 0 993.4 -6.8 52.9 1.6 337.0 0.0 0.0"; - assert_eq!( - is_new_epoch(content, version::Version { major: 2, minor: 0 }), - true - ); + assert!(is_new_epoch( + content, + version::Version { major: 2, minor: 0 } + )); let content = " 22 1 4 9 55 0 997.9 -6.4 54.2 2.9 342.0 0.0 0.0"; - assert_eq!( - is_new_epoch(content, version::Version { major: 2, minor: 0 }), - true - ); + assert!(is_new_epoch( + content, + version::Version { major: 2, minor: 0 } + )); let content = " 22 1 4 10 0 0 997.9 -6.3 55.4 3.4 337.0 0.0 0.0"; - assert_eq!( - is_new_epoch(content, version::Version { major: 2, minor: 0 }), - true - ); + assert!(is_new_epoch( + content, + version::Version { major: 2, minor: 0 } + )); let content = " 08 1 1 0 0 1 1018.0 25.1 75.9 1.4 95.0 0.0 0.0"; - assert_eq!( - is_new_epoch(content, version::Version { major: 2, minor: 0 }), - true - ); + assert!(is_new_epoch( + content, + version::Version { major: 2, minor: 0 } + )); let content = " 2021 1 7 0 0 0 993.3 23.0 90.0"; - assert_eq!( - is_new_epoch(content, 
version::Version { major: 4, minor: 0 }), - true - ); + assert!(is_new_epoch( + content, + version::Version { major: 4, minor: 0 } + )); } } @@ -204,7 +204,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k < &epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -214,7 +214,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k >= &epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -244,7 +244,7 @@ impl Mask for Record { TargetItem::ObservableItem(filter) => { self.retain(|_, data| { data.retain(|code, _| filter.contains(code)); - data.len() > 0 + !data.is_empty() }); }, _ => {}, @@ -254,7 +254,7 @@ impl Mask for Record { TargetItem::ObservableItem(filter) => { self.retain(|_, data| { data.retain(|code, _| !filter.contains(code)); - data.len() > 0 + !data.is_empty() }); }, _ => {}, @@ -338,7 +338,7 @@ impl Decimate for Record { } fn decimate_match(&self, rhs: &Self) -> Self { let mut s = self.clone(); - s.decimate_match_mut(&rhs); + s.decimate_match_mut(rhs); s } } diff --git a/rinex/src/meteo/sensor.rs b/rinex/src/meteo/sensor.rs index 4181abf9f..086ca4da3 100644 --- a/rinex/src/meteo/sensor.rs +++ b/rinex/src/meteo/sensor.rs @@ -37,14 +37,14 @@ impl std::str::FromStr for Sensor { let (observable, _) = rem.split_at(2); Ok(Self { model: { - if model.trim().len() > 0 { + if !model.trim().is_empty() { Some(model.trim().to_string()) } else { None } }, sensor_type: { - if s_type.trim().len() > 0 { + if !s_type.trim().is_empty() { Some(s_type.trim().to_string()) } else { None @@ -82,14 +82,14 @@ impl std::fmt::Display for Sensor { } else { write!(f, "{:11}", "")? } - write!(f, "{} SENSOR MOD/TYPE/ACC\n", self.observable)?; + writeln!(f, "{} SENSOR MOD/TYPE/ACC", self.observable)?; if let Some((x, y, z, h)) = self.position { write!(f, "{:14.4}", x)?; write!(f, "{:14.4}", y)?; write!(f, "{:14.4}", z)?; write!(f, "{:14.4}", h)?; - write!(f, " {} SENSOR POS XYZ/H\n", self.observable)? + writeln!(f, " {} SENSOR POS XYZ/H", self.observable)? 
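The meteo formatter touched above writes each observation as a 7.1 fixed-width float, eight values per line, with blanks for missing data. A rough sketch of that layout rule; the exact wrap position relative to the epoch prefix is an assumption, the real formatter also tracks header observation codes:

```rust
/// Formats meteo values as 7.1 fixed-width floats, eight per line,
/// blanks where an observation is missing (sketch of the layout used above).
fn fmt_meteo_values(values: &[Option<f64>]) -> String {
    let mut lines = String::new();
    for (index, value) in values.iter().enumerate() {
        match value {
            Some(v) => lines.push_str(&format!("{:7.1}", v)),
            None => lines.push_str("       "), // 7 blanks
        }
        if (index + 1) % 8 == 0 {
            lines.push('\n');
        }
    }
    lines.push('\n');
    lines
}

fn main() {
    let line = fmt_meteo_values(&[Some(993.4), Some(-6.8), None]);
    assert!(line.starts_with("  993.4   -6.8"));
}
```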
} Ok(()) } @@ -164,7 +164,7 @@ mod test { #[test] fn from_str() { let s = Sensor::from_str(" 0.0 PR "); - assert_eq!(s.is_ok(), true); + assert!(s.is_ok()); let s = s.unwrap(); assert_eq!(s.model, None); assert_eq!(s.sensor_type, None); @@ -174,10 +174,10 @@ mod test { let s = Sensor::from_str( "PAROSCIENTIFIC 740-16B 0.2 PR SENSOR MOD/TYPE/ACC", ); - assert_eq!(s.is_ok(), true); + assert!(s.is_ok()); let s = Sensor::from_str( " 0.0 PR SENSOR MOD/TYPE/ACC", ); - assert_eq!(s.is_ok(), true); + assert!(s.is_ok()); } } diff --git a/rinex/src/navigation/ephemeris.rs b/rinex/src/navigation/ephemeris.rs index 22743f157..3da9e320e 100644 --- a/rinex/src/navigation/ephemeris.rs +++ b/rinex/src/navigation/ephemeris.rs @@ -1,7 +1,7 @@ use super::{orbits::closest_nav_standards, NavMsgType, OrbitItem}; use crate::{epoch, prelude::*, sv, version::Version}; -use hifitime::GPST_REF_EPOCH; +use hifitime::Unit; use std::collections::HashMap; use std::str::FromStr; use thiserror::Error; @@ -127,16 +127,42 @@ impl Ephemeris { self.orbits.get("week").and_then(|field| field.as_u32()) } /* - * Retrieves toe, expressed as an Epoch, if Week + TOE are properly received + * Returns TGD field if such field is not empty + */ + pub fn tgd(&self) -> Option { + self.get_orbit_f64("tgd") + } + /* + * Helper to apply a clock correction to provided time (expressed as Epoch) + */ + pub fn sv_clock_corr(sv: Sv, clock_bias: (f64, f64, f64), t: Epoch, toe: Epoch) -> Duration { + let (a0, a1, a2) = clock_bias; + match sv.constellation { + Constellation::Glonass => { + todo!("sv_clock_corr not supported for glonass @ the moment"); + }, + _ => { + let dt = (t - toe).to_seconds(); + Duration::from_seconds(a0 + a1 * dt + a2 * dt.powi(2)) + }, + } + } + /* + * Retrieves and express TOE as an hifitime Epoch */ pub(crate) fn toe(&self, ts: TimeScale) -> Option { - let week = self.get_week()?; - let toe_f64 = self.get_orbit_f64("toe")?; - Some(Epoch::from_time_of_week( - week, - toe_f64.round() as u64 * 1_000_000_000, - ts, - )) + /* toe week counter */ + let mut week = self.get_week()?; + if ts == TimeScale::GST { + /* Galileo vehicles stream week counter referenced to GPST.. */ + week -= 1024; + } + + // "toe" field is seconds within current week to obtain toe + let secs_dur = self.get_orbit_f64("toe")?; + let week_dur = (week * 7) as f64 * Unit::Day; + + Some(Epoch::from_duration(week_dur + secs_dur * Unit::Second, ts)) } /* * Parses ephemeris from given line iterator @@ -155,49 +181,34 @@ impl Ephemeris { true => 3, false => 4, }; - let date_offset: usize = match version.major < 3 { - true => 19, - false => 19, - }; let (svnn, rem) = line.split_at(svnn_offset); - let (date, rem) = rem.split_at(date_offset); + let (date, rem) = rem.split_at(19); let (clk_bias, rem) = rem.split_at(19); let (clk_dr, clk_drr) = rem.split_at(19); - let mut sv = Sv::default(); - let mut epoch = Epoch::default(); - - match version.major { - 1 | 2 => { - match constellation { - Constellation::Mixed => { - // not sure that even exists - sv = Sv::from_str(svnn.trim())? - }, - _ => { - sv.constellation = constellation; - sv.prn = u8::from_str_radix(svnn.trim(), 10)?; - }, - } - }, - 3 => { - sv = Sv::from_str(svnn.trim())?; + //println!("SVNN \"{}\"", svnn); // DEBUG + let sv = match Sv::from_str(svnn.trim()) { + Ok(sv) => sv, + Err(_) => { + // parsing failed probably due to omitted constellation (old rev.) + let desc = format!("{:x}{:02}", constellation, svnn.trim()); + Sv::from_str(&desc)? 
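The `sv_clock_corr()` helper introduced above evaluates the broadcast clock polynomial at dt = t - toe (Glonass excluded for now). A standalone sketch of the non Glonass branch, with a small check of the a0 + a1·dt term:

```rust
use hifitime::{Duration, Epoch, Unit};

/// SV clock correction polynomial, as applied above for the non Glonass case:
/// dt = t - toe, correction = a0 + a1*dt + a2*dt².
fn sv_clock_corr(clock_bias: (f64, f64, f64), t: Epoch, toe: Epoch) -> Duration {
    let (a0, a1, a2) = clock_bias;
    let dt = (t - toe).to_seconds();
    Duration::from_seconds(a0 + a1 * dt + a2 * dt.powi(2))
}

fn main() {
    let toe = Epoch::from_gregorian_utc(2021, 1, 1, 0, 0, 0, 0);
    let t = toe + 2.0 * Unit::Hour;
    let corr = sv_clock_corr((1.0e-4, 1.0e-11, 0.0), t, toe);
    let expected = 1.0e-4 + 1.0e-11 * 7200.0;
    assert!((corr.to_seconds() - expected).abs() < 1e-9);
}
```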
}, - _ => unreachable!("V4 is treated in a dedicated method"), }; + //println!("\"{}\"={}", svnn, sv); // DEBUG let ts = sv .constellation - .to_timescale() + .timescale() .ok_or(Error::TimescaleIdentification(sv))?; //println!("V2/V3 CONTENT \"{}\" TIMESCALE {}", line, ts); //DEBUG let (epoch, _) = epoch::parse_in_timescale(date.trim(), ts)?; - let clock_bias = f64::from_str(clk_bias.replace("D", "E").trim())?; - let clock_drift = f64::from_str(clk_dr.replace("D", "E").trim())?; - let clock_drift_rate = f64::from_str(clk_drr.replace("D", "E").trim())?; + let clock_bias = f64::from_str(clk_bias.replace('D', "E").trim())?; + let clock_drift = f64::from_str(clk_dr.replace('D', "E").trim())?; + let clock_drift_rate = f64::from_str(clk_drr.replace('D', "E").trim())?; // parse orbits : // only Legacy Frames in V2 and V3 (old) RINEX let orbits = parse_orbits(version, NavMsgType::LNAV, sv.constellation, lines)?; @@ -233,9 +244,9 @@ impl Ephemeris { let (clk_bias, rem) = rem.split_at(19); let (clk_dr, clk_drr) = rem.split_at(19); - let clock_bias = f64::from_str(clk_bias.replace("D", "E").trim())?; - let clock_drift = f64::from_str(clk_dr.replace("D", "E").trim())?; - let clock_drift_rate = f64::from_str(clk_drr.replace("D", "E").trim())?; + let clock_bias = f64::from_str(clk_bias.replace('D', "E").trim())?; + let clock_drift = f64::from_str(clk_dr.replace('D', "E").trim())?; + let clock_drift_rate = f64::from_str(clk_drr.replace('D', "E").trim())?; let orbits = parse_orbits(Version { major: 4, minor: 0 }, msg, sv.constellation, lines)?; Ok(( epoch, @@ -318,40 +329,39 @@ impl Ephemeris { s } /* - * Manual calculations of satellite position vector, in km ECEF. - * `t_sv`: orbit epoch as parsed in RINEX. - * TODO: this is currently only verified in GPST - need to verify GST/BDT/IRNSST support - * See [Bibliography::AsceAppendix3] and [Bibliography::JLe19] - */ - pub(crate) fn kepler2ecef(&self, sv: &Sv, epoch: Epoch) -> Option<(f64, f64, f64)> { - // To form t_sv : we need to convert UTC time to GNSS time. - // Hifitime v4, once released, will help here - let mut t_sv = epoch.clone(); + * Kepler equation solver at desired instant "t" for given "sv" + * based off Self. Self must be correctly selected in navigation + * record. 
+ * "t" does not have to expressed in correct timescale prior this calculation + * See [Bibliography::AsceAppendix3] and [Bibliography::JLe19] + */ + pub(crate) fn kepler2ecef(&self, sv: &Sv, t: Epoch) -> Option<(f64, f64, f64)> { + let mut t = t; + + /* + * if "t" is not expressed in the correct constellation, + * take that into account + */ + t.time_scale = sv.timescale()?; match sv.constellation { Constellation::GPS | Constellation::QZSS => { - t_sv.time_scale = TimeScale::GPST; - t_sv -= Duration::from_seconds(18.0); // GPST(t=0) number of leap seconds @ the time + t -= Duration::from_seconds(18.0); // GPST(t=0) number of leap seconds @ the time }, Constellation::Galileo => { - t_sv.time_scale = TimeScale::GST; - t_sv -= Duration::from_seconds(31.0); // GST(t=0) number of leap seconds @ the time + t -= Duration::from_seconds(31.0); // GST(t=0) number of leap seconds @ the time }, Constellation::BeiDou => { - t_sv.time_scale = TimeScale::BDT; - t_sv -= Duration::from_seconds(32.0); // BDT(t=0) number of leap seconds @ the time + t -= Duration::from_seconds(32.0); // BDT(t=0) number of leap seconds @ the time }, _ => {}, // either not needed, or most probably not truly supported } + let toe = self.toe(t.time_scale)?; let kepler = self.kepler()?; let perturbations = self.perturbations()?; - let weeks = self.get_week()?; - let t0 = GPST_REF_EPOCH + Duration::from_days((weeks * 7).into()); - let toe = t0 + Duration::from_seconds(kepler.toe as f64); - let t_k = (t_sv - toe).to_seconds(); + let t_k = (t - toe).to_seconds(); let n0 = (Kepler::EARTH_GM_CONSTANT / kepler.a.powf(3.0)).sqrt(); let n = n0 + perturbations.dn; @@ -385,12 +395,10 @@ impl Ephemeris { Some((x_k / 1000.0, y_k / 1000.0, z_k / 1000.0)) } - /* - * Returns Sv position in km ECEF, based off Self Ephemeris data, - * and for given Satellite Vehicle at given Epoch. - * Either by solving Kepler equations, or directly if such data is available. - */ - pub(crate) fn sv_position(&self, sv: &Sv, epoch: Epoch) -> Option<(f64, f64, f64)> { + /// Returns Sv position in km ECEF, based off Self Ephemeris data, + /// and for given Satellite Vehicle at given Epoch. + /// Either by solving Kepler equations, or directly if such data is available. + pub fn sv_position(&self, sv: &Sv, epoch: Epoch) -> Option<(f64, f64, f64)> { let (x_km, y_km, z_km) = ( self.get_orbit_f64("satPosX"), self.get_orbit_f64("satPosY"), @@ -407,27 +415,21 @@ impl Ephemeris { _ => self.kepler2ecef(sv, epoch), } } - /* - * Computes elev, azim angles both in degrees - */ - pub(crate) fn sv_elev_azim( - &self, - sv: &Sv, - epoch: Epoch, - reference: GroundPosition, - ) -> Option<(f64, f64)> { - let (sv_x, sv_y, sv_z) = self.sv_position(sv, epoch)?; - let (ref_x, ref_y, ref_z) = reference.to_ecef_wgs84(); + /// Helper method to calculate elevation and azimuth angles, both in degrees, + /// between a reference position (in meter ECEF WGS84) and a resolved + /// SV position in the sky, expressed in meter ECEF WFS84. 
+ pub fn elevation_azimuth( + sv_position: (f64, f64, f64), + reference_position: (f64, f64, f64), + ) -> (f64, f64) { + let (sv_x, sv_y, sv_z) = sv_position; // convert ref position to radians(lat, lon) + let (ref_x, ref_y, ref_z) = reference_position; let (ref_lat, ref_lon, _) = map_3d::ecef2geodetic(ref_x, ref_y, ref_z, map_3d::Ellipsoid::WGS84); // ||sv - ref_pos|| pseudo range - let a_i = ( - sv_x * 1000.0 - ref_x, - sv_y * 1000.0 - ref_y, - sv_z * 1000.0 - ref_z, - ); + let a_i = (sv_x - ref_x, sv_y - ref_y, sv_z - ref_z); let norm = (a_i.0.powf(2.0) + a_i.1.powf(2.0) + a_i.2.powf(2.0)).sqrt(); let a_i = (a_i.0 / norm, a_i.1 / norm, a_i.2 / norm); @@ -456,7 +458,22 @@ impl Ephemeris { if az < 0.0 { az += 360.0; } - Some((el, az)) + (el, az) + } + /* + * Resolves a position and computes elev, azim angles both in degrees + */ + pub(crate) fn sv_elev_azim( + &self, + sv: &Sv, + epoch: Epoch, + reference: GroundPosition, + ) -> Option<(f64, f64)> { + let (sv_x, sv_y, sv_z) = self.sv_position(sv, epoch)?; + Some(Self::elevation_azimuth( + (sv_x * 1.0E3, sv_y * 1.0E3, sv_z * 1.0E3), + reference.to_ecef_wgs84(), + )) } /* * Returns max time difference between an Epoch and @@ -464,15 +481,19 @@ impl Ephemeris { */ pub(crate) fn max_dtoe(c: Constellation) -> Option { match c { - Constellation::GPS | Constellation::QZSS | Constellation::Geo => { - Some(Duration::from_seconds(7200.0)) - }, + Constellation::GPS | Constellation::QZSS => Some(Duration::from_seconds(7200.0)), Constellation::Galileo => Some(Duration::from_seconds(10800.0)), Constellation::BeiDou => Some(Duration::from_seconds(21600.0)), - Constellation::SBAS(_) => Some(Duration::from_seconds(360.0)), Constellation::IRNSS => Some(Duration::from_seconds(86400.0)), Constellation::Glonass => Some(Duration::from_seconds(1800.0)), - _ => None, + c => { + if c.is_sbas() { + //TODO: verify this please + Some(Duration::from_seconds(7200.0)) + } else { + None + } + }, } } } @@ -488,6 +509,11 @@ fn parse_orbits( constell: Constellation, lines: std::str::Lines<'_>, ) -> Result, Error> { + // convert SBAS constell to compatible "sbas" (undetermined/general constell) + let constell = match constell.is_sbas() { + true => Constellation::SBAS, + false => constell, + }; // Determine closest standards from DB // <=> data fields to parse let nav_standards = match closest_nav_standards(constell, version, msg) { @@ -514,38 +540,41 @@ fn parse_orbits( //println!("LINE \"{}\" | NB MISSING {}", line, nb_missing); //DEBUG loop { - if line.len() == 0 { - key_index += nb_missing as usize; + if line.is_empty() { + key_index += nb_missing; break; } let (content, rem) = line.split_at(std::cmp::min(word_size, line.len())); + let content = content.trim(); - if content.trim().len() == 0 { + if content.is_empty() { // omitted field key_index += 1; - if nb_missing > 0 { - nb_missing -= 1; - } + nb_missing = nb_missing.saturating_sub(1); line = rem.clone(); continue; } - - if let Some((key, token)) = fields.get(key_index) { - //println!( - // "Key \"{}\"(index: {}) | Token \"{}\" | Content \"{}\"", - // key, - // key_index, - // token, - // content.trim() - //); //DEBUG - if !key.contains(&"spare") { - if let Ok(item) = OrbitItem::new(token, content.trim(), constell) { - map.insert(key.to_string(), item); + /* + * In NAV RINEX, unresolved data fields are either + * omitted (handled previously) or put a zeros + */ + if !content.contains(".000000000000E+00") { + if let Some((key, token)) = fields.get(key_index) { + //println!( + // "Key \"{}\"(index: {}) | Token \"{}\" | 
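A possible use of the `elevation_azimuth()` helper made public above (behind the `nav` feature). Both inputs are meters ECEF WGS84, whereas `sv_position()` returns kilometers, hence the scaling; treat this as a usage sketch, not the crate's documented example:

```rust
use rinex::navigation::Ephemeris;

/// Computes (elevation, azimuth) in degrees from an SV position expressed
/// in km ECEF (as returned by sv_position()) and a reference position in
/// meters ECEF WGS84, using the elevation_azimuth() helper above.
fn sv_elevation_azimuth(sv_pos_km: (f64, f64, f64), ref_pos_m: (f64, f64, f64)) -> (f64, f64) {
    let sv_pos_m = (
        sv_pos_km.0 * 1.0E3,
        sv_pos_km.1 * 1.0E3,
        sv_pos_km.2 * 1.0E3,
    );
    Ephemeris::elevation_azimuth(sv_pos_m, ref_pos_m)
}
```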
Content \"{}\"", + // key, + // key_index, + // token, + // content.trim() + //); //DEBUG + if !key.contains("spare") { + if let Ok(item) = OrbitItem::new(token, content, constell) { + map.insert(key.to_string(), item); + } } } } - key_index += 1; line = rem.clone(); } diff --git a/rinex/src/navigation/orbits.rs b/rinex/src/navigation/orbits.rs index 6d3ed7ce4..980c1326b 100644 --- a/rinex/src/navigation/orbits.rs +++ b/rinex/src/navigation/orbits.rs @@ -86,30 +86,30 @@ impl OrbitItem { match type_desc { "u8" => { // float->unsigned conversion - let float = f64::from_str(&content.replace("D", "e"))?; + let float = f64::from_str(&content.replace('D', "e"))?; Ok(OrbitItem::U8(float as u8)) }, "i8" => { // float->signed conversion - let float = f64::from_str(&content.replace("D", "e"))?; + let float = f64::from_str(&content.replace('D', "e"))?; Ok(OrbitItem::I8(float as i8)) }, "u32" => { // float->signed conversion - let float = f64::from_str(&content.replace("D", "e"))?; + let float = f64::from_str(&content.replace('D', "e"))?; Ok(OrbitItem::U32(float as u32)) }, - "f64" => Ok(OrbitItem::F64(f64::from_str(&content.replace("D", "e"))?)), + "f64" => Ok(OrbitItem::F64(f64::from_str(&content.replace('D', "e"))?)), "gloStatus" => { // float->unsigned conversion - let float = f64::from_str(&content.replace("D", "e"))?; + let float = f64::from_str(&content.replace('D', "e"))?; let unsigned = float as u32; let status = GloStatus::from_bits(unsigned).unwrap_or(GloStatus::empty()); Ok(OrbitItem::GloStatus(status)) }, "health" => { // float->unsigned conversion - let float = f64::from_str(&content.replace("D", "e"))?; + let float = f64::from_str(&content.replace('D', "e"))?; let unsigned = float as u32; match constellation { Constellation::GPS | Constellation::QZSS => { @@ -127,19 +127,22 @@ impl OrbitItem { .unwrap_or(health::GalHealth::empty()); Ok(OrbitItem::GalHealth(flags)) }, - Constellation::SBAS(_) | Constellation::Geo => { - let flag: health::GeoHealth = num::FromPrimitive::from_u32(unsigned) - .unwrap_or(health::GeoHealth::default()); - Ok(OrbitItem::GeoHealth(flag)) - }, Constellation::IRNSS => { let flag: health::IrnssHealth = num::FromPrimitive::from_u32(unsigned) .unwrap_or(health::IrnssHealth::default()); Ok(OrbitItem::IrnssHealth(flag)) }, - _ => unreachable!(), // MIXED is not feasible here - // as we use the current vehicle's constellation, - // which is always defined + c => { + if c.is_sbas() { + let flag: health::GeoHealth = num::FromPrimitive::from_u32(unsigned) + .unwrap_or(health::GeoHealth::default()); + Ok(OrbitItem::GeoHealth(flag)) + } else { + // Constellation::Mixed will not happen here, + // it's always defined in the database + unreachable!("unhandled case!"); + } + }, } }, // "health" _ => Err(OrbitItemError::UnknownTypeDescriptor(type_desc.to_string())), @@ -164,7 +167,7 @@ impl OrbitItem { /// Unwraps OrbitItem as f64 pub fn as_f64(&self) -> Option { match self { - OrbitItem::F64(f) => Some(f.clone()), + OrbitItem::F64(f) => Some(*f), _ => None, } } @@ -178,14 +181,14 @@ impl OrbitItem { /// Unwraps OrbitItem as u8 pub fn as_u8(&self) -> Option { match self { - OrbitItem::U8(u) => Some(u.clone()), + OrbitItem::U8(u) => Some(*u), _ => None, } } /// Unwraps OrbitItem as i8 pub fn as_i8(&self) -> Option { match self { - OrbitItem::I8(i) => Some(i.clone()), + OrbitItem::I8(i) => Some(*i), _ => None, } } @@ -213,7 +216,7 @@ impl OrbitItem { /// Unwraps Self as GAL orbit Health indication pub fn as_gal_health(&self) -> Option { match self { - OrbitItem::GalHealth(h) => 
Some(h.clone()), + OrbitItem::GalHealth(h) => Some(*h), _ => None, } } @@ -257,7 +260,7 @@ pub(crate) fn closest_nav_standards( }) .collect(); - if items.len() == 0 { + if items.is_empty() { if minor == 0 { // we're done with this major // -> downgrade to previous major @@ -301,7 +304,7 @@ mod test { // Like we use it when parsing.. let e = OrbitItem::new(value, &content, constellation); assert!( - e.is_ok() == true, + e.is_ok(), "failed to build Orbit Item from (\"{}\", \"{}\", \"{}\")", key, value, @@ -321,7 +324,7 @@ mod test { ); // Test existing (exact match) entries - for (constellation, rev, msg) in vec![ + for (constellation, rev, msg) in [ (Constellation::GPS, Version::new(1, 0), NavMsgType::LNAV), (Constellation::GPS, Version::new(2, 0), NavMsgType::LNAV), (Constellation::GPS, Version::new(4, 0), NavMsgType::LNAV), @@ -342,7 +345,7 @@ mod test { (Constellation::BeiDou, Version::new(4, 0), NavMsgType::CNV1), (Constellation::BeiDou, Version::new(4, 0), NavMsgType::CNV2), (Constellation::BeiDou, Version::new(4, 0), NavMsgType::CNV3), - (Constellation::Geo, Version::new(4, 0), NavMsgType::SBAS), + (Constellation::SBAS, Version::new(4, 0), NavMsgType::SBAS), ] { let found = closest_nav_standards(constellation, rev, msg); assert!( @@ -369,7 +372,7 @@ mod test { } // Test cases where the nearest revision is used, not that exact revision - for (constellation, desired, expected, msg) in vec![ + for (constellation, desired, expected, msg) in [ ( Constellation::GPS, Version::new(5, 0), @@ -419,21 +422,21 @@ mod test { #[test] fn test_db_item() { let e = OrbitItem::U8(10); - assert_eq!(e.as_u8().is_some(), true); - assert_eq!(e.as_u32().is_some(), false); + assert!(e.as_u8().is_some()); + assert!(!e.as_u32().is_some()); let u = e.as_u8().unwrap(); assert_eq!(u, 10); let e = OrbitItem::F64(10.0); - assert_eq!(e.as_u8().is_some(), false); - assert_eq!(e.as_u32().is_some(), false); - assert_eq!(e.as_f64().is_some(), true); + assert!(!e.as_u8().is_some()); + assert!(!e.as_u32().is_some()); + assert!(e.as_f64().is_some()); let u = e.as_f64().unwrap(); assert_eq!(u, 10.0_f64); let e = OrbitItem::U32(1); - assert_eq!(e.as_u32().is_some(), true); - assert_eq!(e.as_f64().is_some(), false); + assert!(e.as_u32().is_some()); + assert!(!e.as_f64().is_some()); let u = e.as_u32().unwrap(); assert_eq!(u, 1_u32); } diff --git a/rinex/src/navigation/record.rs b/rinex/src/navigation/record.rs index 6732475d5..026678d0d 100644 --- a/rinex/src/navigation/record.rs +++ b/rinex/src/navigation/record.rs @@ -16,7 +16,7 @@ use crate::Bibliography; fn double_exponent_digits(content: &str) -> String { // replace "eN " with "E+0N" let re = Regex::new(r"e\d{1} ").unwrap(); - let lines = re.replace_all(&content, |caps: &Captures| format!("E+0{}", &caps[0][1..])); + let lines = re.replace_all(content, |caps: &Captures| format!("E+0{}", &caps[0][1..])); // replace "eN" with "E+0N" let re = Regex::new(r"e\d{1}").unwrap(); @@ -225,12 +225,12 @@ pub(crate) fn is_new_epoch(line: &str, v: Version) -> bool { } // rest matches a valid epoch descriptor let datestr = &line[3..22]; - epoch::parse_utc(&datestr).is_ok() + epoch::parse_utc(datestr).is_ok() } else if v.major == 3 { // RINEX V3 if line.len() < 24 { return false; // not enough bytes - // to describe an SVN and an Epoch + // to describe an SV and an Epoch } // 1st entry matches a valid SV description let (sv, _) = line.split_at(4); @@ -239,10 +239,10 @@ pub(crate) fn is_new_epoch(line: &str, v: Version) -> bool { } // rest matches a valid epoch descriptor let datestr = 
&line[4..23]; - epoch::parse_utc(&datestr).is_ok() + epoch::parse_utc(datestr).is_ok() } else { // Modern --> easy - if let Some(c) = line.chars().nth(0) { + if let Some(c) = line.chars().next() { c == '>' // new epoch marker } else { false @@ -256,7 +256,7 @@ pub(crate) fn parse_epoch( constell: Constellation, content: &str, ) -> Result<(Epoch, NavFrame), Error> { - if content.starts_with(">") { + if content.starts_with('>') { parse_v4_record_entry(content) } else { parse_v2_v3_record_entry(version, constell, content) @@ -282,7 +282,7 @@ fn parse_v4_record_entry(content: &str) -> Result<(Epoch, NavFrame), Error> { let ts = sv .constellation - .to_timescale() + .timescale() .ok_or(Error::TimescaleIdentification(sv))?; let (epoch, fr): (Epoch, NavFrame) = match frame_class { @@ -425,20 +425,20 @@ fn fmt_epoch_v2v3(epoch: &Epoch, data: &Vec, header: &Header) -> Resul if let Some(data) = ephemeris.orbits.get(*key) { lines.push_str(&format!("{} ", data.to_string())); } else { - lines.push_str(&format!(" ")); + lines.push_str(" "); } } - lines.push_str(&format!("\n ")); + lines.push_str("\n "); } else { // last row for (key, _) in chunk { if let Some(data) = ephemeris.orbits.get(*key) { - lines.push_str(&format!("{}", data.to_string())); + lines.push_str(&data.to_string()); } else { - lines.push_str(&format!(" ")); + lines.push_str(" "); } } - lines.push_str("\n"); + lines.push('\n'); } } } @@ -458,7 +458,7 @@ fn fmt_epoch_v4(epoch: &Epoch, data: &Vec, header: &Header) -> Result< // Mixed constellation context // we need to fully describe the vehicle lines.push_str(&sv.to_string()); - lines.push_str(" "); + lines.push(' '); }, Some(_) => { // Unique constellation context: @@ -546,43 +546,43 @@ mod test { // NAV V<3 let line = " 1 20 12 31 23 45 0.0 7.282570004460D-05 0.000000000000D+00 7.380000000000D+04"; - assert_eq!(is_new_epoch(line, Version::new(1, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(2, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(3, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(4, 0)), false); + assert!(is_new_epoch(line, Version::new(1, 0))); + assert!(is_new_epoch(line, Version::new(2, 0))); + assert!(!is_new_epoch(line, Version::new(3, 0))); + assert!(!is_new_epoch(line, Version::new(4, 0))); // NAV V<3 let line = " 2 21 1 1 11 45 0.0 4.610531032090D-04 1.818989403550D-12 4.245000000000D+04"; - assert_eq!(is_new_epoch(line, Version::new(1, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(2, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(3, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(4, 0)), false); + assert!(is_new_epoch(line, Version::new(1, 0))); + assert!(is_new_epoch(line, Version::new(2, 0))); + assert!(!is_new_epoch(line, Version::new(3, 0))); + assert!(!is_new_epoch(line, Version::new(4, 0))); // GPS NAV V<3 let line = " 3 17 1 13 23 59 44.0-1.057861372828D-04-9.094947017729D-13 0.000000000000D+00"; - assert_eq!(is_new_epoch(line, Version::new(1, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(2, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(3, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(4, 0)), false); + assert!(is_new_epoch(line, Version::new(1, 0))); + assert!(is_new_epoch(line, Version::new(2, 0))); + assert!(!is_new_epoch(line, Version::new(3, 0))); + assert!(!is_new_epoch(line, Version::new(4, 0))); // NAV V3 let line = "C05 2021 01 01 00 00 00-4.263372393325e-04-7.525180478751e-11 0.000000000000e+00"; - assert_eq!(is_new_epoch(line, Version::new(1, 0)), false); - 
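// A minimal standalone sketch (not part of the patch above): the NAV test
// fixtures carry Fortran-style 'D' exponents such as "7.282570004460D-05",
// which the orbits.rs hunks normalize with `content.replace('D', "e")`
// before handing the field to `f64::from_str`. The helper name
// `parse_rinex_float` is illustrative only, not a crate API.
fn parse_rinex_float(field: &str) -> Result<f64, std::num::ParseFloatError> {
    field.trim().replace('D', "e").parse::<f64>()
}

fn main() {
    assert_eq!(parse_rinex_float(" 7.282570004460D-05").unwrap(), 7.28257000446e-5);
    assert_eq!(parse_rinex_float("1.818989403550D-12").unwrap(), 1.81898940355e-12);
}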
assert_eq!(is_new_epoch(line, Version::new(2, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(3, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(4, 0)), false); + assert!(!is_new_epoch(line, Version::new(1, 0))); + assert!(!is_new_epoch(line, Version::new(2, 0))); + assert!(is_new_epoch(line, Version::new(3, 0))); + assert!(!is_new_epoch(line, Version::new(4, 0))); // NAV V3 let line = "R21 2022 01 01 09 15 00-2.666609361768E-04-2.728484105319E-12 5.508000000000E+05"; - assert_eq!(is_new_epoch(line, Version::new(1, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(2, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(3, 0)), true); - assert_eq!(is_new_epoch(line, Version::new(4, 0)), false); + assert!(!is_new_epoch(line, Version::new(1, 0))); + assert!(!is_new_epoch(line, Version::new(2, 0))); + assert!(is_new_epoch(line, Version::new(3, 0))); + assert!(!is_new_epoch(line, Version::new(4, 0))); // NAV V4 let line = "> EPH G02 LNAV"; - assert_eq!(is_new_epoch(line, Version::new(2, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(3, 0)), false); - assert_eq!(is_new_epoch(line, Version::new(4, 0)), true); + assert!(!is_new_epoch(line, Version::new(2, 0))); + assert!(!is_new_epoch(line, Version::new(3, 0))); + assert!(is_new_epoch(line, Version::new(4, 0))); } #[test] fn parse_glonass_v2() { @@ -592,8 +592,10 @@ mod test { 1.292880712890D+04-2.049269676210D+00 0.000000000000D+00 1.000000000000D+00 2.193169775390D+04 1.059645652770D+00-9.313225746150D-10 0.000000000000D+00"; let version = Version::new(2, 0); + assert!(is_new_epoch(content, version)); + let entry = parse_epoch(version, Constellation::Glonass, content); - assert_eq!(entry.is_ok(), true); + assert!(entry.is_ok(), "failed to parse epoch {:?}", entry.err()); let (epoch, frame) = entry.unwrap(); assert_eq!( @@ -602,7 +604,7 @@ mod test { ); let fr = frame.as_eph(); - assert_eq!(fr.is_some(), true); + assert!(fr.is_some()); let (msg_type, sv, ephemeris) = fr.unwrap(); assert_eq!(msg_type, NavMsgType::LNAV); @@ -621,60 +623,60 @@ mod test { for (k, v) in orbits.iter() { if k.eq("satPosX") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -1.488799804690E+03); } else if k.eq("velX") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -2.196182250980E+00); } else if k.eq("accelX") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 3.725290298460E-09); } else if k.eq("health") { let v = v.as_glo_health(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); } else if k.eq("satPosY") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 1.292880712890E+04); } else if k.eq("velY") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -2.049269676210E+00); } else if k.eq("accelY") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.0); } else if k.eq("channel") { let v = v.as_i8(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 1); } else if k.eq("satPosZ") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 2.193169775390E+04); } else if k.eq("velZ") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let 
v = v.unwrap(); assert_eq!(v, 1.059645652770E+00); } else if k.eq("accelZ") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -9.313225746150E-10); } else if k.eq("ageOp") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.0); } else { @@ -695,13 +697,13 @@ mod test { .432000000000e+06 .000000000000e+00 0.000000000000e+00 0.000000000000e+00"; let version = Version::new(3, 0); let entry = parse_epoch(version, Constellation::Mixed, content); - assert_eq!(entry.is_ok(), true); + assert!(entry.is_ok()); let (epoch, frame) = entry.unwrap(); assert_eq!(epoch, Epoch::from_str("2021-01-01T00:00:00 BDT").unwrap()); let fr = frame.as_eph(); - assert_eq!(fr.is_some(), true); + assert!(fr.is_some()); let (msg_type, sv, ephemeris) = fr.unwrap(); assert_eq!(msg_type, NavMsgType::LNAV); @@ -720,124 +722,124 @@ mod test { for (k, v) in orbits.iter() { if k.eq("aode") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.100000000000e+01); } else if k.eq("crs") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.118906250000e+02); } else if k.eq("deltaN") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.105325815814e-08); } else if k.eq("m0") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.255139531119e+01); } else if k.eq("cuc") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.169500708580e-06); } else if k.eq("e") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.401772442274e-03); } else if k.eq("cus") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.292365439236e-04); } else if k.eq("sqrta") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.649346986580e+04); } else if k.eq("toe") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.432000000000e+06); } else if k.eq("cic") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.105705112219e-06); } else if k.eq("omega0") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.277512444499e+01); } else if k.eq("cis") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.211410224438e-06); } else if k.eq("i0") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.607169709798e-01); } else if k.eq("crc") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.897671875000e+03); } else if k.eq("omega") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.154887266488e+00); } else if k.eq("omegaDot") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.871464871438e-10); } else if k.eq("idot") { let v = v.as_f64(); - assert_eq!(v.is_some(), 
true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.940753471872e-09); // SPARE } else if k.eq("week") { let v = v.as_u32(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 782); //SPARE } else if k.eq("svAccuracy") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.200000000000e+01); } else if k.eq("satH1") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.000000000000e+00); } else if k.eq("tgd1b1b3") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.599999994133e-09); } else if k.eq("tgd2b2b3") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.900000000000e-08); } else if k.eq("t_tm") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.432000000000e+06); } else if k.eq("aodc") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.000000000000e+00); } else { @@ -858,13 +860,13 @@ mod test { .469330000000e+06 0.000000000000e+00 0.000000000000e+00 0.000000000000e+00"; let version = Version::new(3, 0); let entry = parse_epoch(version, Constellation::Mixed, content); - assert_eq!(entry.is_ok(), true); + assert!(entry.is_ok()); let (epoch, frame) = entry.unwrap(); assert_eq!(epoch, Epoch::from_str("2021-01-01T10:10:00 GST").unwrap(),); let fr = frame.as_eph(); - assert_eq!(fr.is_some(), true); + assert!(fr.is_some()); let (msg_type, sv, ephemeris) = fr.unwrap(); assert_eq!(msg_type, NavMsgType::LNAV); @@ -883,121 +885,121 @@ mod test { for (k, v) in orbits.iter() { if k.eq("iodnav") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.130000000000e+02); } else if k.eq("crs") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.435937500000e+02); } else if k.eq("deltaN") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.261510892978e-08); } else if k.eq("m0") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.142304064404e+00); } else if k.eq("cuc") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.201165676117e-05); } else if k.eq("e") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.226471573114e-03); } else if k.eq("cus") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.109840184450e-04); } else if k.eq("sqrta") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.544061822701e+04); } else if k.eq("toe") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.468600000000e+06); } else if k.eq("cic") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.111758708954e-07); } else if k.eq("omega0") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.313008275208e+01); } else 
if k.eq("cis") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.409781932831e-07); } else if k.eq("i0") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.980287270202e+00); } else if k.eq("crc") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.113593750000e+03); } else if k.eq("omega") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.276495796017e+00); } else if k.eq("omegaDot") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.518200156545e-08); } else if k.eq("idot") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.595381942905e-09); } else if k.eq("dataSrc") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.258000000000e+03); } else if k.eq("week") { let v = v.as_u32(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 2138); //SPARE } else if k.eq("sisa") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.312000000000e+01); } else if k.eq("health") { let v = v.as_gal_health(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); } else if k.eq("bgdE5aE1") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.232830643654e-09); } else if k.eq("bgdE5bE1") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.000000000000e+00); } else if k.eq("t_tm") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.469330000000e+06); } else { @@ -1014,7 +1016,7 @@ mod test { .214479208984e+05 -.131077289581e+01 -.279396772385e-08 .000000000000e+00"; let version = Version::new(3, 0); let entry = parse_epoch(version, Constellation::Mixed, content); - assert_eq!(entry.is_ok(), true); + assert!(entry.is_ok()); let (epoch, frame) = entry.unwrap(); assert_eq!( epoch, @@ -1022,7 +1024,7 @@ mod test { ); let fr = frame.as_eph(); - assert_eq!(fr.is_some(), true); + assert!(fr.is_some()); let (msg_type, sv, ephemeris) = fr.unwrap(); assert_eq!(msg_type, NavMsgType::LNAV); assert_eq!( @@ -1040,60 +1042,60 @@ mod test { for (k, v) in orbits.iter() { if k.eq("satPosX") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.124900639648e+05); } else if k.eq("velX") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.912527084351e+00); } else if k.eq("accelX") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.000000000000e+00); } else if k.eq("health") { let v = v.as_glo_health(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); } else if k.eq("satPosY") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.595546582031e+04); } else if k.eq("velY") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.278496932983e+01); } else if k.eq("accelY") { let v = 
v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.000000000000e+00); } else if k.eq("channel") { let v = v.as_i8(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 5); } else if k.eq("satPosZ") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.214479208984e+05); } else if k.eq("velZ") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.131077289581e+01); } else if k.eq("accelZ") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, -0.279396772385e-08); } else if k.eq("ageOp") { let v = v.as_f64(); - assert_eq!(v.is_some(), true); + assert!(v.is_some()); let v = v.unwrap(); assert_eq!(v, 0.000000000000e+00); } else { @@ -1125,10 +1127,10 @@ impl Merge for Record { /// Merges `rhs` into `Self` fn merge_mut(&mut self, rhs: &Self) -> Result<(), merge::Error> { for (rhs_epoch, rhs_frames) in rhs { - if let Some(frames) = self.get_mut(&rhs_epoch) { + if let Some(frames) = self.get_mut(rhs_epoch) { // this epoch already exists for fr in rhs_frames { - if !frames.contains(&fr) { + if !frames.contains(fr) { frames.push(fr.clone()); // insert new NavFrame } } @@ -1147,7 +1149,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k < &epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -1157,7 +1159,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k >= &epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -1184,38 +1186,58 @@ fn mask_mut_equal(rec: &mut Record, target: TargetItem) { rec.retain(|_, frames| { frames.retain(|fr| { if let Some((_, sv, _)) = fr.as_eph() { - filter.contains(&sv) + filter.contains(sv) } else if let Some((_, sv, _)) = fr.as_ion() { - filter.contains(&sv) + filter.contains(sv) } else if let Some((_, sv, _)) = fr.as_eop() { - filter.contains(&sv) + filter.contains(sv) } else if let Some((_, sv, _)) = fr.as_sto() { - filter.contains(&sv) + filter.contains(sv) } else { // non existing false } }); - frames.len() > 0 + !frames.is_empty() }); }, TargetItem::ConstellationItem(filter) => { + let mut broad_sbas_filter = false; + for c in &filter { + broad_sbas_filter |= *c == Constellation::SBAS; + } rec.retain(|_, frames| { frames.retain(|fr| { if let Some((_, sv, _)) = fr.as_eph() { - filter.contains(&sv.constellation) + if broad_sbas_filter { + sv.constellation.is_sbas() || filter.contains(&sv.constellation) + } else { + filter.contains(&sv.constellation) + } } else if let Some((_, sv, _)) = fr.as_ion() { - filter.contains(&sv.constellation) + if broad_sbas_filter { + sv.constellation.is_sbas() || filter.contains(&sv.constellation) + } else { + filter.contains(&sv.constellation) + } } else if let Some((_, sv, _)) = fr.as_eop() { - filter.contains(&sv.constellation) + if broad_sbas_filter { + sv.constellation.is_sbas() || filter.contains(&sv.constellation) + } else { + filter.contains(&sv.constellation) + } } else if let Some((_, sv, _)) = fr.as_sto() { - filter.contains(&sv.constellation) + if broad_sbas_filter { + sv.constellation.is_sbas() || filter.contains(&sv.constellation) + } else { + filter.contains(&sv.constellation) + } } else { // non existing false } }); - frames.len() > 0 + !frames.is_empty() }); }, TargetItem::OrbitItem(_filter) => { @@ -1236,19 +1258,19 @@ fn mask_mut_equal(rec: &mut Record, target: 
TargetItem) { TargetItem::NavFrameItem(filter) => { rec.retain(|_, frames| { frames.retain(|fr| { - if let Some(_) = fr.as_eph() { + if fr.as_eph().is_some() { filter.contains(&FrameClass::Ephemeris) - } else if let Some(_) = fr.as_eop() { + } else if fr.as_eop().is_some() { filter.contains(&FrameClass::EarthOrientation) - } else if let Some(_) = fr.as_ion() { + } else if fr.as_ion().is_some() { filter.contains(&FrameClass::IonosphericModel) - } else if let Some(_) = fr.as_sto() { + } else if fr.as_sto().is_some() { filter.contains(&FrameClass::SystemTimeOffset) } else { false } }); - frames.len() > 0 + !frames.is_empty() }); }, TargetItem::NavMsgItem(filter) => { @@ -1266,7 +1288,7 @@ fn mask_mut_equal(rec: &mut Record, target: TargetItem) { false } }); - frames.len() > 0 + !frames.is_empty() }); }, _ => {}, // Other items: either not supported, or do not apply @@ -1281,19 +1303,19 @@ fn mask_mut_ineq(rec: &mut Record, target: TargetItem) { rec.retain(|_, frames| { frames.retain(|fr| { if let Some((_, sv, _)) = fr.as_eph() { - !filter.contains(&sv) + !filter.contains(sv) } else if let Some((_, sv, _)) = fr.as_ion() { - !filter.contains(&sv) + !filter.contains(sv) } else if let Some((_, sv, _)) = fr.as_eop() { - !filter.contains(&sv) + !filter.contains(sv) } else if let Some((_, sv, _)) = fr.as_sto() { - !filter.contains(&sv) + !filter.contains(sv) } else { // non existing false } }); - frames.len() > 0 + !frames.is_empty() }); }, TargetItem::ConstellationItem(filter) => { @@ -1312,7 +1334,7 @@ fn mask_mut_ineq(rec: &mut Record, target: TargetItem) { false } }); - frames.len() > 0 + !frames.is_empty() }); }, TargetItem::OrbitItem(_filter) => { @@ -1333,13 +1355,13 @@ fn mask_mut_ineq(rec: &mut Record, target: TargetItem) { TargetItem::NavFrameItem(filter) => { rec.retain(|_, frames| { frames.retain(|fr| { - if let Some(_) = fr.as_eph() { + if fr.as_eph().is_some() { !filter.contains(&FrameClass::Ephemeris) - } else if let Some(_) = fr.as_eop() { + } else if fr.as_eop().is_some() { !filter.contains(&FrameClass::EarthOrientation) - } else if let Some(_) = fr.as_ion() { + } else if fr.as_ion().is_some() { !filter.contains(&FrameClass::IonosphericModel) - } else if let Some(_) = fr.as_sto() { + } else if fr.as_sto().is_some() { !filter.contains(&FrameClass::SystemTimeOffset) } else { false @@ -1525,7 +1547,7 @@ fn mask_mut_gt(rec: &mut Record, target: TargetItem) { false } }); - frames.len() > 0 + !frames.is_empty() }); }, _ => {}, // Other items: either not supported, or do not apply @@ -1579,7 +1601,7 @@ fn mask_mut_geq(rec: &mut Record, target: TargetItem) { false } }); - frames.len() > 0 + !frames.is_empty() }); }, _ => {}, // Other items: either not supported, or do not apply @@ -1788,7 +1810,7 @@ impl Decimate for Record { } fn decimate_match(&self, rhs: &Self) -> Self { let mut s = self.clone(); - s.decimate_match_mut(&rhs); + s.decimate_match_mut(rhs); s } } diff --git a/rinex/src/observable.rs b/rinex/src/observable.rs index 00bfa1801..13890a6e7 100644 --- a/rinex/src/observable.rs +++ b/rinex/src/observable.rs @@ -54,28 +54,16 @@ impl Default for Observable { impl Observable { pub fn is_phase_observable(&self) -> bool { - match self { - Self::Phase(_) => true, - _ => false, - } + matches!(self, Self::Phase(_)) } pub fn is_pseudorange_observable(&self) -> bool { - match self { - Self::PseudoRange(_) => true, - _ => false, - } + matches!(self, Self::PseudoRange(_)) } pub fn is_doppler_observable(&self) -> bool { - match self { - Self::Doppler(_) => true, - _ => false, - } + 
matches!(self, Self::Doppler(_)) } pub fn is_ssi_observable(&self) -> bool { - match self { - Self::SSI(_) => true, - _ => false, - } + matches!(self, Self::SSI(_)) } pub fn code(&self) -> Option { match self { @@ -228,7 +216,7 @@ impl Observable { _ => None, // invalid: not a pseudo range } }, - Constellation::Geo | Constellation::SBAS(_) => { + Constellation::SBAS => { match self { Self::PseudoRange(code) => { match code.as_ref() { @@ -317,13 +305,13 @@ impl std::str::FromStr for Observable { _ => { let len = content.len(); if len > 1 && len < 4 { - if content.starts_with("L") { + if content.starts_with('L') { Ok(Self::Phase(content.to_string())) - } else if content.starts_with("C") || content.starts_with("P") { + } else if content.starts_with('C') || content.starts_with('P') { Ok(Self::PseudoRange(content.to_string())) - } else if content.starts_with("S") { + } else if content.starts_with('S') { Ok(Self::SSI(content.to_string())) - } else if content.starts_with("D") { + } else if content.starts_with('D') { Ok(Self::Doppler(content.to_string())) } else { Err(ParsingError::UnknownObservable(content.to_string())) diff --git a/rinex/src/observation/mod.rs b/rinex/src/observation/mod.rs index c4f82d7d0..0556e01f4 100644 --- a/rinex/src/observation/mod.rs +++ b/rinex/src/observation/mod.rs @@ -81,9 +81,9 @@ impl std::fmt::Display for Crinex { let version = self.version.to_string(); write!(f, "{: Self { let mut s = self.clone(); - s.time_of_first_obs = Some(epoch); + s.time_of_last_obs = Some(epoch); s } /// Insert a data scaling diff --git a/rinex/src/observation/record.rs b/rinex/src/observation/record.rs index 115bead35..b3f6265c8 100644 --- a/rinex/src/observation/record.rs +++ b/rinex/src/observation/record.rs @@ -50,7 +50,7 @@ bitflags! { } } -#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)] +#[derive(Default, Copy, Clone, Debug, PartialEq, PartialOrd)] #[cfg_attr(feature = "serde", derive(Serialize))] pub struct ObservationData { /// physical measurement @@ -146,7 +146,7 @@ pub(crate) fn is_new_epoch(line: &str, v: Version) -> bool { } else { // Modern RINEX // OBS::V3 behaves like all::V4 - match line.chars().nth(0) { + match line.chars().next() { Some(c) => { c == '>' // epochs always delimited // by this new identifier @@ -156,8 +156,7 @@ pub(crate) fn is_new_epoch(line: &str, v: Version) -> bool { } } -/// Builds `Record` entry for `ObservationData` -/// from given epoch content +/// Builds `Record` entry for `ObservationData` from given epoch content pub(crate) fn parse_epoch( header: &Header, content: &str, @@ -190,13 +189,13 @@ pub(crate) fn parse_epoch( } // V > 2 might start with a ">" marker - if line.starts_with(">") { + if line.starts_with('>') { line = line.split_at(1).1.clone(); } let (date, rem) = line.split_at(offset + 3); let (n_sat, rem) = rem.split_at(3); - let n_sat = u16::from_str_radix(n_sat.trim(), 10)?; + let n_sat = n_sat.trim().parse::()?; let epoch = epoch::parse_in_timescale(date, ts)?; // previously identified observables (that we expect) @@ -259,7 +258,7 @@ pub(crate) fn parse_epoch( return Err(Error::MissingData); } } - parse_v2(&header, &systems, observables, lines) + parse_v2(header, &systems, observables, lines) }, _ => parse_v3(observables, lines), }; @@ -284,9 +283,10 @@ fn parse_v2( let mut obs_ptr = 0; // observable pointer let mut data: BTreeMap> = BTreeMap::new(); let mut inner: HashMap = HashMap::with_capacity(5); - let mut sv: Sv; + let mut sv = Sv::default(); let mut observables: &Vec; - //println!("SYSTEMS \"{}\"", systems); // DEBUG + 
//println!("{:?}", header_observables); // DEBUG + //println!("\"{}\"", systems); // DEBUG // parse first system we're dealing with if systems.len() < svnn_size { @@ -298,32 +298,51 @@ fn parse_v2( /* * identify 1st system */ - let max = std::cmp::min(svnn_size, systems.len()); // covers epoch with a unique vehicle + let max = std::cmp::min(svnn_size, systems.len()); // for epochs with a single vehicle let system = &systems[0..max]; if let Ok(ssv) = Sv::from_str(system) { sv = ssv; } else { - // mono constellation context - if let Ok(prn) = u8::from_str_radix(system.trim(), 10) { - if let Some(constellation) = header.constellation { - sv = Sv { prn, constellation } - } else { - panic!("faulty RINEX2 constellation /sv definition"); - } - } else { - // can't parse 1st vehicle - return data; + // may fail on omitted X in "XYY", + // mainly on OLD RINEX with mono constellation + match header.constellation { + Some(Constellation::Mixed) => panic!("bad gnss definition"), + Some(c) => { + if let Ok(prn) = system.trim().parse::() { + if let Ok(s) = Sv::from_str(&format!("{}{:02}", c, prn)) { + sv = s; + } else { + return data; + } + } + }, + None => return data, } } sv_ptr += svnn_size; // increment pointer - // grab observables for this vehicle - if let Some(o) = header_observables.get(&sv.constellation) { - observables = &o; - } else { - // failed to identify observations for this vehicle - return data; - } + //println!("\"{}\"={}", system, sv); // DEBUG + + // grab observables for this vehicle + observables = match sv.constellation.is_sbas() { + true => { + if let Some(observables) = header_observables.get(&Constellation::SBAS) { + observables + } else { + // failed to identify observations for this vehicle + return data; + } + }, + false => { + if let Some(observables) = header_observables.get(&sv.constellation) { + observables + } else { + // failed to identify observations for this vehicle + return data; + } + }, + }; + //println!("{:?}", observables); // DEBUG for line in lines { // browse all lines provided @@ -364,11 +383,11 @@ fn parse_v2( //println!("OBS \"{}\"", obs); //DEBUG let mut lli: Option = None; let mut snr: Option = None; - if let Ok(obs) = f64::from_str(obs.trim()) { + if let Ok(obs) = obs.trim().parse::() { // parse obs if slice.len() > 14 { let lli_str = &slice[14..15]; - if let Ok(u) = u8::from_str_radix(lli_str, 10) { + if let Ok(u) = lli_str.parse::() { lli = LliFlags::from_bits(u); } if slice.len() > 15 { @@ -406,30 +425,47 @@ fn parse_v2( let start = sv_ptr; let end = std::cmp::min(sv_ptr + svnn_size, systems.len()); // trimed epoch description let system = &systems[start..end]; - //println!("NEW SYSTEM \"{}\"\n", system); //DEBUG - if let Ok(ssv) = Sv::from_str(system) { - sv = ssv; + if let Ok(s) = Sv::from_str(system) { + sv = s; } else { - // mono constellation context - if let Ok(prn) = u8::from_str_radix(system.trim(), 10) { - if let Some(constellation) = header.constellation { - sv = Sv { prn, constellation } - } else { - panic!("faulty RINEX2 constellation /sv definition"); - } - } else { - // can't parse vehicle - return data; + // may fail on omitted X in "XYY", + // mainly on OLD RINEX with mono constellation + match header.constellation { + Some(c) => { + if let Ok(prn) = system.trim().parse::() { + if let Ok(s) = Sv::from_str(&format!("{}{:02}", c, prn)) { + sv = s; + } else { + return data; + } + } + }, + _ => unreachable!(), } } + //println!("\"{}\"={}", system, sv); //DEBUG sv_ptr += svnn_size; // increment pointer - // grab observables for this vehicle 
- if let Some(o) = header_observables.get(&sv.constellation) { - observables = &o; - } else { - // failed to identify observations for this vehicle - return data; - } + + // grab observables for this vehicle + observables = match sv.constellation.is_sbas() { + true => { + if let Some(observables) = header_observables.get(&Constellation::SBAS) { + observables + } else { + // failed to identify observations for this vehicle + return data; + } + }, + false => { + if let Some(observables) = header_observables.get(&sv.constellation) { + observables + } else { + // failed to identify observations for this vehicle + return data; + } + }, + }; + //println!("{:?}", observables); // DEBUG } } // for all lines provided data @@ -452,11 +488,14 @@ fn parse_v3( //println!("parse_v3: \"{}\"", line); //DEBUG let (sv, line) = line.split_at(svnn_size); if let Ok(sv) = Sv::from_str(sv) { - //println!("SV: \"{}\"", sv); //DEBUG - if let Some(obscodes) = observables.get(&sv.constellation) { + let obscodes = match sv.constellation.is_sbas() { + true => observables.get(&Constellation::SBAS), + false => observables.get(&sv.constellation), + }; + //println!("SV: {} OBSERVABLES: {:?}", sv, obscodes); // DEBUG + if let Some(obscodes) = obscodes { let nb_obs = line.len() / observable_width; inner.clear(); - //println!("NB OBS: {}", nb_obs); //DEBUG let mut rem = line; for i in 0..nb_obs { if i == obscodes.len() { @@ -492,7 +531,27 @@ fn parse_v3( inner.insert(obscodes[i].clone(), ObservationData { obs, lli, snr }); } } - if inner.len() > 0 { + if rem.len() >= observable_width - 2 { + let mut snr: Option = None; + let mut lli: Option = None; + let obs = &rem[0..observable_width - 2]; + if let Ok(obs) = obs.trim().parse::() { + if rem.len() > observable_width - 2 { + let lli_str = &rem[observable_width - 2..observable_width - 1]; + if let Ok(u) = lli_str.parse::() { + lli = LliFlags::from_bits(u); + if rem.len() > observable_width - 1 { + let snr_str = &rem[observable_width - 1..]; + if let Ok(s) = Snr::from_str(snr_str) { + snr = Some(s); + } + } + } + } + inner.insert(obscodes[nb_obs].clone(), ObservationData { obs, lli, snr }); + } + } + if !inner.is_empty() { data.insert(sv, inner.clone()); } } //got some observables to work with @@ -536,29 +595,33 @@ fn fmt_epoch_v3( lines.push_str(&format!("{:13.4}", data)); } - lines.push_str("\n"); + lines.push('\n'); for (sv, data) in data.iter() { - lines.push_str(&format!("{}", sv.to_string())); - if let Some(observables) = observables.get(&sv.constellation) { + lines.push_str(&format!("{:x}", sv)); + let observables = match sv.constellation.is_sbas() { + true => observables.get(&Constellation::SBAS), + false => observables.get(&sv.constellation), + }; + if let Some(observables) = observables { for observable in observables { if let Some(observation) = data.get(observable) { lines.push_str(&format!("{:14.3}", observation.obs)); if let Some(flag) = observation.lli { lines.push_str(&format!("{}", flag.bits())); } else { - lines.push_str(" "); + lines.push(' '); } if let Some(flag) = observation.snr { lines.push_str(&format!("{:x}", flag)); } else { - lines.push_str(" "); + lines.push(' '); } } else { - lines.push_str(&format!(" ")); + lines.push_str(" "); } } } - lines.push_str("\n"); + lines.push('\n'); } lines } @@ -590,9 +653,9 @@ fn fmt_epoch_v2( lines.push_str(&format!(" {:9.1}", data)); } } - lines.push_str(&format!("\n ")); + lines.push_str("\n "); } - lines.push_str(&sv.to_string()); + lines.push_str(&format!("{:x}", sv)); index += 1; } let obs_per_line = 5; @@ -600,10 
+663,14 @@ fn fmt_epoch_v2( for (sv, observations) in data.iter() { // follow list of observables, as described in header section // for given constellation - if let Some(observables) = observables.get(&sv.constellation) { + let observables = match sv.constellation.is_sbas() { + true => observables.get(&Constellation::SBAS), + false => observables.get(&sv.constellation), + }; + if let Some(observables) = observables { for (obs_index, observable) in observables.iter().enumerate() { if obs_index % obs_per_line == 0 { - lines.push_str("\n"); + lines.push('\n'); } if let Some(observation) = observations.get(observable) { let formatted_obs = format!("{:14.3}", observation.obs); @@ -626,7 +693,7 @@ fn fmt_epoch_v2( } } } - lines.push_str("\n"); + lines.push('\n'); lines } @@ -653,7 +720,7 @@ impl Merge for Record { *data = *rhs_data; // overwrite } else { // new observation: insert it - observations.insert(rhs_observable.clone(), rhs_data.clone()); + observations.insert(rhs_observable.clone(), *rhs_data); } } } else { @@ -676,7 +743,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k.0 < epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -686,7 +753,7 @@ impl Split for Record { .iter() .flat_map(|(k, v)| { if k.0 >= epoch { - Some((k.clone(), v.clone())) + Some((*k, v.clone())) } else { None } @@ -762,8 +829,8 @@ impl Smooth for Record { let phase_data = ph_data.unwrap(); - if let Some(data) = buffer.get_mut(&sv) { - if let Some((n, prev_result, prev_phase)) = data.get_mut(&pr_observable) { + if let Some(data) = buffer.get_mut(sv) { + if let Some((n, prev_result, prev_phase)) = data.get_mut(pr_observable) { let delta_phase = phase_data - *prev_phase; // implement corrector equation pr_observation.obs = 1.0 / *n * pr_observation.obs @@ -821,24 +888,34 @@ impl Mask for Record { self.retain(|_, (clk, _)| clk.is_some()); }, TargetItem::ConstellationItem(constells) => { + let mut broad_sbas_filter = false; + for c in &constells { + broad_sbas_filter |= *c == Constellation::SBAS; + } self.retain(|_, (_, svs)| { - svs.retain(|sv, _| constells.contains(&sv.constellation)); - svs.len() > 0 + svs.retain(|sv, _| { + if broad_sbas_filter { + sv.constellation.is_sbas() || constells.contains(&sv.constellation) + } else { + constells.contains(&sv.constellation) + } + }); + !svs.is_empty() }); }, TargetItem::SvItem(items) => { self.retain(|_, (_, svs)| { - svs.retain(|sv, _| items.contains(&sv)); - svs.len() > 0 + svs.retain(|sv, _| items.contains(sv)); + !svs.is_empty() }); }, TargetItem::ObservableItem(filter) => { self.retain(|_, (_, svs)| { svs.retain(|_, obs| { - obs.retain(|code, _| filter.contains(&code)); - obs.len() > 0 + obs.retain(|code, _| filter.contains(code)); + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, TargetItem::SnrItem(filter) => { @@ -852,9 +929,9 @@ impl Mask for Record { false // no snr: drop out } }); - obs.len() > 0 + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, _ => {}, @@ -868,22 +945,22 @@ impl Mask for Record { TargetItem::ConstellationItem(constells) => { self.retain(|_, (_, svs)| { svs.retain(|sv, _| !constells.contains(&sv.constellation)); - svs.len() > 0 + !svs.is_empty() }); }, TargetItem::SvItem(items) => { self.retain(|_, (_, svs)| { - svs.retain(|sv, _| !items.contains(&sv)); - svs.len() > 0 + svs.retain(|sv, _| !items.contains(sv)); + !svs.is_empty() }); }, TargetItem::ObservableItem(filter) => { self.retain(|_, (_, svs)| { svs.retain(|_, obs| { - obs.retain(|code, _| !filter.contains(&code)); - 
obs.len() > 0 + obs.retain(|code, _| !filter.contains(code)); + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, _ => {}, @@ -903,7 +980,7 @@ impl Mask for Record { } retain }); - svs.len() > 0 + !svs.is_empty() }); }, TargetItem::SnrItem(filter) => { @@ -917,9 +994,9 @@ impl Mask for Record { false // no snr: drop out } }); - obs.len() > 0 + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, _ => {}, @@ -939,7 +1016,7 @@ impl Mask for Record { } retain }); - svs.len() > 0 + !svs.is_empty() }); }, TargetItem::SnrItem(filter) => { @@ -953,9 +1030,9 @@ impl Mask for Record { false // no snr: drop out } }); - obs.len() > 0 + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, _ => {}, @@ -975,7 +1052,7 @@ impl Mask for Record { } retain }); - svs.len() > 0 + !svs.is_empty() }); }, TargetItem::SnrItem(filter) => { @@ -989,9 +1066,9 @@ impl Mask for Record { false // no snr: drop out } }); - obs.len() > 0 + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, _ => {}, @@ -1011,7 +1088,7 @@ impl Mask for Record { } retain }); - svs.len() > 0 + !svs.is_empty() }); }, TargetItem::SnrItem(filter) => { @@ -1025,9 +1102,9 @@ impl Mask for Record { false // no snr: drop out } }); - obs.len() > 0 + !obs.is_empty() }); - svs.len() > 0 + !svs.is_empty() }); }, _ => {}, @@ -1254,7 +1331,7 @@ impl Decimate for Record { } fn decimate_match(&self, rhs: &Self) -> Self { let mut s = self.clone(); - s.decimate_match_mut(&rhs); + s.decimate_match_mut(rhs); s } } @@ -1318,7 +1395,7 @@ impl Combine for Record { if let Some(data) = ret.get_mut(&(lhs_observable.clone(), ref_observable.clone())) { - if let Some(data) = data.get_mut(&sv) { + if let Some(data) = data.get_mut(sv) { data.insert(*epoch, gf); } else { let mut bmap: BTreeMap<(Epoch, EpochFlag), f64> = BTreeMap::new(); @@ -1328,7 +1405,7 @@ impl Combine for Record { } else { // new combination let mut inject = true; // insert only if not already combined to some other signal - for ((lhs, rhs), _) in &ret { + for (lhs, rhs) in ret.keys() { if lhs == lhs_observable { inject = false; break; @@ -1408,7 +1485,7 @@ impl Combine for Record { if let Some(data) = ret.get_mut(&(lhs_observable.clone(), ref_observable.clone())) { - if let Some(data) = data.get_mut(&sv) { + if let Some(data) = data.get_mut(sv) { data.insert(*epoch, gf); } else { let mut bmap: BTreeMap<(Epoch, EpochFlag), f64> = BTreeMap::new(); @@ -1418,7 +1495,7 @@ impl Combine for Record { } else { // new combination let mut inject = true; // insert only if not already combined to some other signal - for ((lhs, rhs), _) in &ret { + for (lhs, rhs) in ret.keys() { if lhs == lhs_observable { inject = false; break; @@ -1504,7 +1581,7 @@ impl Combine for Record { if let Some(data) = ret.get_mut(&(lhs_observable.clone(), ref_observable.clone())) { - if let Some(data) = data.get_mut(&sv) { + if let Some(data) = data.get_mut(sv) { data.insert(*epoch, yp); } else { let mut bmap: BTreeMap<(Epoch, EpochFlag), f64> = BTreeMap::new(); @@ -1514,7 +1591,7 @@ impl Combine for Record { } else { // new combination let mut inject = true; // insert only if not already combined to some other signal - for ((lhs, rhs), _) in &ret { + for (lhs, rhs) in ret.keys() { if lhs == lhs_observable { inject = false; break; @@ -1610,7 +1687,7 @@ impl Combine for Record { if let Some(data) = ret.get_mut(&(lhs_observable.clone(), ref_observable.clone())) { - if let Some(data) = data.get_mut(&sv) { + if let Some(data) = data.get_mut(sv) { data.insert(*epoch, gf); } else { let mut bmap: 
BTreeMap<(Epoch, EpochFlag), f64> = BTreeMap::new(); @@ -1620,7 +1697,7 @@ impl Combine for Record { } else { // new combination let mut inject = true; // insert only if not already combined to some other signal - for ((lhs, rhs), _) in &ret { + for (lhs, rhs) in ret.keys() { if lhs == lhs_observable { inject = false; break; @@ -1691,11 +1768,11 @@ impl Dcb for Record { // determine this code's role in the diff op // so it remains consistent - let items: Vec<&str> = op.split("-").collect(); + let items: Vec<&str> = op.split('-').collect(); if lhs_code == items[0] { // code is differenced - if let Some(data) = vehicles.get_mut(&sv) { + if let Some(data) = vehicles.get_mut(sv) { data.insert( *epoch, lhs_observation.obs - rhs_observation.obs, @@ -1713,7 +1790,7 @@ impl Dcb for Record { } } else { // code is refered to - if let Some(data) = vehicles.get_mut(&sv) { + if let Some(data) = vehicles.get_mut(sv) { data.insert( *epoch, rhs_observation.obs - lhs_observation.obs, @@ -1821,54 +1898,33 @@ mod test { use super::*; #[test] fn obs_record_is_new_epoch() { - assert_eq!( - is_new_epoch( - "95 01 01 00 00 00.0000000 0 7 06 17 21 22 23 28 31", - Version { major: 2, minor: 0 } - ), - true - ); - assert_eq!( - is_new_epoch( - "21700656.31447 16909599.97044 .00041 24479973.67844 24479975.23247", - Version { major: 2, minor: 0 } - ), - false - ); - assert_eq!( - is_new_epoch( - "95 01 01 11 00 00.0000000 0 8 04 16 18 19 22 24 27 29", - Version { major: 2, minor: 0 } - ), - true - ); - assert_eq!( - is_new_epoch( - "95 01 01 11 00 00.0000000 0 8 04 16 18 19 22 24 27 29", - Version { major: 3, minor: 0 } - ), - false - ); - assert_eq!( - is_new_epoch( - "> 2022 01 09 00 00 30.0000000 0 40", - Version { major: 3, minor: 0 } - ), - true - ); - assert_eq!( - is_new_epoch( - "> 2022 01 09 00 00 30.0000000 0 40", - Version { major: 2, minor: 0 } - ), - false - ); - assert_eq!( - is_new_epoch( - "G01 22331467.880 117352685.28208 48.950 22331469.28", - Version { major: 3, minor: 0 } - ), - false - ); + assert!(is_new_epoch( + "95 01 01 00 00 00.0000000 0 7 06 17 21 22 23 28 31", + Version { major: 2, minor: 0 } + )); + assert!(!is_new_epoch( + "21700656.31447 16909599.97044 .00041 24479973.67844 24479975.23247", + Version { major: 2, minor: 0 } + )); + assert!(is_new_epoch( + "95 01 01 11 00 00.0000000 0 8 04 16 18 19 22 24 27 29", + Version { major: 2, minor: 0 } + )); + assert!(!is_new_epoch( + "95 01 01 11 00 00.0000000 0 8 04 16 18 19 22 24 27 29", + Version { major: 3, minor: 0 } + )); + assert!(is_new_epoch( + "> 2022 01 09 00 00 30.0000000 0 40", + Version { major: 3, minor: 0 } + )); + assert!(!is_new_epoch( + "> 2022 01 09 00 00 30.0000000 0 40", + Version { major: 2, minor: 0 } + )); + assert!(!is_new_epoch( + "G01 22331467.880 117352685.28208 48.950 22331469.28", + Version { major: 3, minor: 0 } + )); } } diff --git a/rinex/src/observation/snr.rs b/rinex/src/observation/snr.rs index 8d95fee0c..b5f3b707c 100644 --- a/rinex/src/observation/snr.rs +++ b/rinex/src/observation/snr.rs @@ -1,6 +1,6 @@ use std::str::FromStr; -#[derive(Debug, Clone)] +#[derive(PartialEq, Debug, Clone)] pub enum Error { InvalidSnrCode, } @@ -83,14 +83,53 @@ impl FromStr for Snr { "7" => Ok(Snr::DbHz42_47), "8" => Ok(Snr::DbHz48_53), "9" => Ok(Snr::DbHz54), + "bad" => Ok(Snr::DbHz18_23), + "weak" => Ok(Snr::DbHz24_29), + "strong" => Ok(Snr::DbHz30_35), + "excellent" => Ok(Snr::DbHz48_53), _ => Err(Error::InvalidSnrCode), } } } impl From for Snr { - fn from(f: f64) -> Self { - Self::from(f as u8) + fn from(f_db: f64) -> Self 
{ + if f_db < 12.0 { + Self::DbHz12 + } else if f_db <= 17.0 { + Self::DbHz12_17 + } else if f_db <= 23.0 { + Self::DbHz18_23 + } else if f_db <= 29.0 { + Self::DbHz24_29 + } else if f_db <= 35.0 { + Self::DbHz30_35 + } else if f_db <= 41.0 { + Self::DbHz36_41 + } else if f_db <= 47.0 { + Self::DbHz42_47 + } else if f_db <= 53.0 { + Self::DbHz48_53 + } else { + Self::DbHz54 + } + } +} + +impl From for f64 { + fn from(val: Snr) -> Self { + match val { + Snr::DbHz0 => 0.0_f64, + Snr::DbHz12 => 12.0_f64, + Snr::DbHz12_17 => 17.0_f64, + Snr::DbHz18_23 => 23.0_f64, + Snr::DbHz24_29 => 29.0_f64, + Snr::DbHz30_35 => 35.0_f64, + Snr::DbHz36_41 => 41.0_f64, + Snr::DbHz42_47 => 47.0_f64, + Snr::DbHz48_53 => 53.0_f64, + Snr::DbHz54 => 54.0_f64, + } } } @@ -121,14 +160,6 @@ impl From for Snr { } impl Snr { - pub fn new(quality: &str) -> Self { - match quality.trim() { - "excellent" => Self::DbHz42_47, - "strong" => Self::DbHz30_35, - "weak" => Self::DbHz24_29, - _ => Self::DbHz18_23, - } - } /// Returns true if self describes a bad signal level pub fn bad(self) -> bool { self <= Snr::DbHz18_23 @@ -150,6 +181,7 @@ impl Snr { #[cfg(test)] mod test { use super::*; + use std::str::FromStr; #[test] fn observation_snr() { let snr = Snr::from_str("0").unwrap(); @@ -176,9 +208,14 @@ mod test { assert_eq!(snr, Snr::DbHz12); assert!(snr.bad()); - assert_eq!(Snr::new("excellent"), Snr::DbHz42_47); - assert_eq!(Snr::new("strong"), Snr::DbHz30_35); - assert_eq!(Snr::new("weak"), Snr::DbHz24_29); - assert_eq!(Snr::new("bad"), Snr::DbHz18_23); + assert_eq!(Snr::from_str("excellent"), Ok(Snr::DbHz48_53)); + assert_eq!(Snr::from_str("strong"), Ok(Snr::DbHz30_35)); + assert_eq!(Snr::from_str("weak"), Ok(Snr::DbHz24_29)); + assert_eq!(Snr::from_str("bad"), Ok(Snr::DbHz18_23)); + + assert!(Snr::from_str("bad").unwrap().bad()); + assert!(Snr::from_str("weak").unwrap().weak()); + assert!(Snr::from_str("strong").unwrap().strong()); + assert!(Snr::from_str("excellent").unwrap().excellent()); } } diff --git a/rinex/src/record.rs b/rinex/src/record.rs index 3034ad4f2..c7f52876a 100644 --- a/rinex/src/record.rs +++ b/rinex/src/record.rs @@ -7,8 +7,9 @@ use serde::Serialize; use super::{ antex, clocks, + clocks::{ClockData, ClockDataType}, hatanaka::{Compressor, Decompressor}, - header, ionex, is_comment, merge, + header, ionex, is_rinex_comment, merge, merge::Merge, meteo, navigation, observation, reader::BufferedReader, @@ -145,11 +146,11 @@ impl Record { Type::ObservationData => { let record = self.as_obs().unwrap(); let obs_fields = &header.obs.as_ref().unwrap(); - let mut compressor = Compressor::new(); + let mut compressor = Compressor::default(); for ((epoch, flag), (clock_offset, data)) in record.iter() { let epoch = observation::record::fmt_epoch(*epoch, *flag, clock_offset, data, header); - if let Some(_) = &obs_fields.crinex { + if obs_fields.crinex.is_some() { let major = header.version.major; let constell = &header.constellation.as_ref().unwrap(); for line in epoch.lines() { @@ -184,38 +185,38 @@ impl Record { } }, Type::IonosphereMaps => { - if let Some(r) = self.as_ionex() { - for (index, (epoch, (_map, _, _))) in r.iter().enumerate() { - let _ = write!(writer, "{:6} START OF TEC MAP", index); - let _ = write!( - writer, - "{} EPOCH OF CURRENT MAP", - epoch::format(*epoch, None, Type::IonosphereMaps, 1) - ); - let _ = write!(writer, "{:6} END OF TEC MAP", index); - } - /* - * not efficient browsing, but matches provided examples and common formatting. - * RMS and Height maps are passed after TEC maps. 
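// A usage sketch for the Snr conversions introduced in the snr.rs hunk above;
// it assumes `Snr` stays re-exported as `rinex::observation::Snr` (adjust the
// import if the crate exposes it elsewhere). Values follow the bucket bounds
// defined in the patch.
use rinex::observation::Snr;
use std::str::FromStr;

fn main() {
    // f64 (dB) -> category: 43 dB falls into the 42..=47 dB bucket
    assert_eq!(Snr::from(43.0_f64), Snr::DbHz42_47);
    // category -> f64: the upper bound of the bucket is reported
    assert_eq!(f64::from(Snr::DbHz42_47), 47.0_f64);
    // textual aliases now go through FromStr instead of the removed Snr::new()
    assert_eq!(Snr::from_str("excellent"), Ok(Snr::DbHz48_53));
    assert!(Snr::from_str("weak").unwrap().weak());
}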
- */ - for (index, (epoch, (_, _map, _))) in r.iter().enumerate() { - let _ = write!(writer, "{:6} START OF RMS MAP", index); - let _ = write!( - writer, - "{} EPOCH OF CURRENT MAP", - epoch::format(*epoch, None, Type::IonosphereMaps, 1) - ); - let _ = write!(writer, "{:6} END OF RMS MAP", index); - } - for (index, (epoch, (_, _, _map))) in r.iter().enumerate() { - let _ = write!(writer, "{:6} START OF HEIGHT MAP", index); - let _ = write!( - writer, - "{} EPOCH OF CURRENT MAP", - epoch::format(*epoch, None, Type::IonosphereMaps, 1) - ); - let _ = write!(writer, "{:6} END OF HEIGHT MAP", index); - } + if let Some(_r) = self.as_ionex() { + //for (index, (epoch, (_map, _, _))) in r.iter().enumerate() { + // let _ = write!(writer, "{:6} START OF TEC MAP", index); + // let _ = write!( + // writer, + // "{} EPOCH OF CURRENT MAP", + // epoch::format(*epoch, None, Type::IonosphereMaps, 1) + // ); + // let _ = write!(writer, "{:6} END OF TEC MAP", index); + //} + // /* + // * not efficient browsing, but matches provided examples and common formatting. + // * RMS and Height maps are passed after TEC maps. + // */ + //for (index, (epoch, (_, _map, _))) in r.iter().enumerate() { + // let _ = write!(writer, "{:6} START OF RMS MAP", index); + // let _ = write!( + // writer, + // "{} EPOCH OF CURRENT MAP", + // epoch::format(*epoch, None, Type::IonosphereMaps, 1) + // ); + // let _ = write!(writer, "{:6} END OF RMS MAP", index); + //} + //for (index, (epoch, (_, _, _map))) in r.iter().enumerate() { + // let _ = write!(writer, "{:6} START OF HEIGHT MAP", index); + // let _ = write!( + // writer, + // "{} EPOCH OF CURRENT MAP", + // epoch::format(*epoch, None, Type::IonosphereMaps, 1) + // ); + // let _ = write!(writer, "{:6} END OF HEIGHT MAP", index); + //} } }, _ => panic!("record type not supported yet"), @@ -249,13 +250,15 @@ pub enum Error { /// Returns true if given line matches the start /// of a new epoch, inside a RINEX record. 
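// The record parser that follows accumulates lines into `epoch_content` and
// flushes that buffer each time `is_new_epoch()` fires. A minimal standalone
// sketch of the same accumulate-and-flush pattern, with the epoch test
// abstracted as a closure (names are illustrative, not crate APIs):
fn split_into_epochs<F>(content: &str, is_new_epoch: F) -> Vec<String>
where
    F: Fn(&str) -> bool,
{
    let mut blocks = Vec::new();
    let mut current = String::new();
    for line in content.lines() {
        if is_new_epoch(line) && !current.is_empty() {
            blocks.push(std::mem::take(&mut current));
        }
        current.push_str(line);
        current.push('\n');
    }
    if !current.is_empty() {
        blocks.push(current);
    }
    blocks
}

fn main() {
    // RINEX V4 navigation epochs start with a '>' marker
    let content = "> EPH G02 LNAV\n  data\n> EPH G03 LNAV\n  data";
    let epochs = split_into_epochs(content, |line| line.starts_with('>'));
    assert_eq!(epochs.len(), 2);
}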
pub fn is_new_epoch(line: &str, header: &header::Header) -> bool { - if is_comment!(line) { + if is_rinex_comment(line) { return false; } match &header.rinex_type { Type::AntennaData => antex::record::is_new_epoch(line), Type::ClockData => clocks::record::is_new_epoch(line), - Type::IonosphereMaps => ionex::record::is_new_map(line), + Type::IonosphereMaps => { + ionex::record::is_new_tec_plane(line) || ionex::record::is_new_rms_plane(line) + }, Type::NavigationData => navigation::record::is_new_epoch(line, header.version), Type::ObservationData => observation::record::is_new_epoch(line, header.version), Type::MeteoData => meteo::record::is_new_epoch(line, header.version), @@ -300,7 +303,7 @@ pub fn parse_record( }, Some(constellation) => { obs_ts = constellation - .to_timescale() + .timescale() .ok_or(Error::ObservationDataTimescaleIdentification)?; }, } @@ -310,12 +313,8 @@ pub fn parse_record( // but others may exist: // in this case we used the previously identified Epoch // and attach other kinds of maps - let mut ionx_rms = false; - let mut ionx_height = false; let mut ionx_rec = ionex::Record::new(); - // we need to store encountered epochs, to relate RMS and H maps - // that might be provided in a separate sequence - let mut ionx_epochs: Vec = Vec::with_capacity(128); + let mut ionex_rms_plane = false; for l in reader.lines() { // iterates one line at a time @@ -323,7 +322,7 @@ pub fn parse_record( // COMMENTS special case // --> store // ---> append later with epoch.timestamp attached to it - if is_comment!(line) { + if is_rinex_comment(&line) { let comment = line.split_at(60).0.trim_end(); comment_content.push(comment.to_string()); continue; @@ -333,7 +332,7 @@ pub fn parse_record( if line.contains("EXPONENT") { if let Some(ionex) = header.ionex.as_mut() { let content = line.split_at(60).0; - if let Ok(e) = i8::from_str_radix(content.trim(), 10) { + if let Ok(e) = content.trim().parse::() { *ionex = ionex.with_exponent(e); // scaling update } } @@ -367,7 +366,7 @@ pub fn parse_record( /* * RINEX */ - if line.len() == 0 { + if line.is_empty() { // we might encounter empty lines // and the following parsers (.lines() iterator) // do not like it @@ -380,7 +379,7 @@ pub fn parse_record( /* * RINEX */ - if line.len() == 0 { + if line.is_empty() { // we might encounter empty lines // and the following parsers (.lines() iterator) // do not like it @@ -393,9 +392,8 @@ pub fn parse_record( for line in content.lines() { // in case of CRINEX -> RINEX < 3 being recovered, // we have more than 1 ligne to process - let new_epoch = is_new_epoch(line, &header); - ionx_rms |= ionex::record::is_new_rms_map(line); - ionx_height |= ionex::record::is_new_height_map(line); + let new_epoch = is_new_epoch(line, header); + ionex_rms_plane = ionex::record::is_new_rms_plane(line); if new_epoch && !first_epoch { match &header.rinex_type { @@ -410,21 +408,21 @@ pub fn parse_record( .entry(e) .and_modify(|frames| frames.push(fr.clone())) .or_insert_with(|| vec![fr.clone()]); - comment_ts = e.clone(); // for comments classification & management + comment_ts = e; // for comments classification & management } }, Type::ObservationData => { if let Ok((e, ck_offset, map)) = - observation::record::parse_epoch(&header, &epoch_content, obs_ts) + observation::record::parse_epoch(header, &epoch_content, obs_ts) { obs_rec.insert(e, (ck_offset, map)); - comment_ts = e.0.clone(); // for comments classification & management + comment_ts = e.0; // for comments classification & management } }, Type::MeteoData => { - if let 
Ok((e, map)) = meteo::record::parse_epoch(&header, &epoch_content) { + if let Ok((e, map)) = meteo::record::parse_epoch(header, &epoch_content) { met_rec.insert(e, map); - comment_ts = e.clone(); // for comments classification & management + comment_ts = e; // for comments classification & management } }, Type::ClockData => { @@ -436,24 +434,23 @@ pub fn parse_record( d.insert(system, data); } else { // --> new system entry for this `epoch` - let mut inner: HashMap = + let mut inner: HashMap = HashMap::new(); inner.insert(system, data); e.insert(dtype, inner); } } else { // --> new epoch entry - let mut inner: HashMap = - HashMap::new(); + let mut inner: HashMap = HashMap::new(); inner.insert(system, data); let mut map: HashMap< - clocks::DataType, - HashMap, + ClockDataType, + HashMap, > = HashMap::new(); map.insert(dtype, inner); clk_rec.insert(epoch, map); } - comment_ts = epoch.clone(); // for comments classification & management + comment_ts = epoch; // for comments classification & management } }, Type::AntennaData => { @@ -476,31 +473,31 @@ pub fn parse_record( } }, Type::IonosphereMaps => { - if let Ok((index, epoch, map)) = - ionex::record::parse_map(header, &epoch_content) + if let Ok((epoch, altitude, plane)) = + ionex::record::parse_plane(&epoch_content, header, ionex_rms_plane) { - if ionx_rms { - ionx_rms = false; - if let Some(e) = ionx_epochs.get(index) { - // relate - if let Some((_, rms, _)) = ionx_rec.get_mut(e) { - // locate - *rms = Some(map); // insert + if ionex_rms_plane { + if let Some(rec_plane) = ionx_rec.get_mut(&(epoch, altitude)) { + // provide RMS value for the entire plane + for ((_, rec_tec), (_, tec)) in + rec_plane.iter_mut().zip(plane.iter()) + { + rec_tec.rms = tec.rms; } + } else { + // insert RMS values + ionx_rec.insert((epoch, altitude), plane); } - } else if ionx_height { - ionx_height = false; - if let Some(e) = ionx_epochs.get(index) { - // relate - if let Some((_, _, h)) = ionx_rec.get_mut(e) { - // locate - *h = Some(map); // insert - } + } else if let Some(rec_plane) = ionx_rec.get_mut(&(epoch, altitude)) { + // provide TEC value for the entire plane + for ((_, rec_tec), (_, tec)) in + rec_plane.iter_mut().zip(plane.iter()) + { + rec_tec.tec = tec.tec; } } else { - // TEC map => insert epoch - ionx_epochs.push(epoch.clone()); - ionx_rec.insert(epoch, (map, None, None)); + // insert TEC values + ionx_rec.insert((epoch, altitude), plane); } } }, @@ -538,21 +535,21 @@ pub fn parse_record( .entry(e) .and_modify(|current| current.push(fr.clone())) .or_insert_with(|| vec![fr.clone()]); - comment_ts = e.clone(); // for comments classification & management + comment_ts = e; // for comments classification & management } }, Type::ObservationData => { if let Ok((e, ck_offset, map)) = - observation::record::parse_epoch(&header, &epoch_content, obs_ts) + observation::record::parse_epoch(header, &epoch_content, obs_ts) { obs_rec.insert(e, (ck_offset, map)); - comment_ts = e.0.clone(); // for comments classification + management + comment_ts = e.0; // for comments classification + management } }, Type::MeteoData => { - if let Ok((e, map)) = meteo::record::parse_epoch(&header, &epoch_content) { + if let Ok((e, map)) = meteo::record::parse_epoch(header, &epoch_content) { met_rec.insert(e, map); - comment_ts = e.clone(); // for comments classification + management + comment_ts = e; // for comments classification + management } }, Type::ClockData => { @@ -569,46 +566,47 @@ pub fn parse_record( } else { // --> new system entry for this `epoch` let mut map: 
HashMap< - clocks::DataType, - HashMap, + ClockDataType, + HashMap, > = HashMap::new(); - let mut inner: HashMap = HashMap::new(); + let mut inner: HashMap = HashMap::new(); inner.insert(system, data); map.insert(dtype, inner); } } else { // --> new epoch entry - let mut map: HashMap> = + let mut map: HashMap> = HashMap::new(); - let mut inner: HashMap = HashMap::new(); + let mut inner: HashMap = HashMap::new(); inner.insert(system, data); map.insert(dtype, inner); clk_rec.insert(e, map); } - comment_ts = e.clone(); // for comments classification & management + comment_ts = e; // for comments classification & management } }, Type::IonosphereMaps => { - if let Ok((index, epoch, map)) = ionex::record::parse_map(header, &epoch_content) { - if ionx_rms { - if let Some(e) = ionx_epochs.get(index) { - // relate - if let Some((_, rms, _)) = ionx_rec.get_mut(e) { - // locate - *rms = Some(map); // insert + if let Ok((epoch, altitude, plane)) = + ionex::record::parse_plane(&epoch_content, header, ionex_rms_plane) + { + if ionex_rms_plane { + if let Some(rec_plane) = ionx_rec.get_mut(&(epoch, altitude)) { + // provide RMS value for the entire plane + for ((_, rec_tec), (_, tec)) in rec_plane.iter_mut().zip(plane.iter()) { + rec_tec.rms = tec.rms; } + } else { + // insert RMS values + ionx_rec.insert((epoch, altitude), plane); } - } else if ionx_height { - if let Some(e) = ionx_epochs.get(index) { - // relate - if let Some((_, _, h)) = ionx_rec.get_mut(e) { - // locate - *h = Some(map); // insert - } + } else if let Some(rec_plane) = ionx_rec.get_mut(&(epoch, altitude)) { + // provide TEC value for the entire plane + for ((_, rec_tec), (_, tec)) in rec_plane.iter_mut().zip(plane.iter()) { + rec_tec.tec = tec.tec; } } else { - // introduce TEC+epoch - ionx_rec.insert(epoch, (map, None, None)); + // insert TEC values + ionx_rec.insert((epoch, altitude), plane); } } }, @@ -657,15 +655,15 @@ impl Merge for Record { fn merge_mut(&mut self, rhs: &Self) -> Result<(), merge::Error> { if let Some(lhs) = self.as_mut_nav() { if let Some(rhs) = rhs.as_nav() { - lhs.merge_mut(&rhs)?; + lhs.merge_mut(rhs)?; } } else if let Some(lhs) = self.as_mut_obs() { if let Some(rhs) = rhs.as_obs() { - lhs.merge_mut(&rhs)?; + lhs.merge_mut(rhs)?; } } else if let Some(lhs) = self.as_mut_meteo() { if let Some(rhs) = rhs.as_meteo() { - lhs.merge_mut(&rhs)?; + lhs.merge_mut(rhs)?; } /*} else if let Some(lhs) = self.as_mut_ionex() { if let Some(rhs) = rhs.as_ionex() { @@ -673,11 +671,11 @@ impl Merge for Record { }*/ } else if let Some(lhs) = self.as_mut_antex() { if let Some(rhs) = rhs.as_antex() { - lhs.merge_mut(&rhs)?; + lhs.merge_mut(rhs)?; } } else if let Some(lhs) = self.as_mut_clock() { if let Some(rhs) = rhs.as_clock() { - lhs.merge_mut(&rhs)?; + lhs.merge_mut(rhs)?; } } Ok(()) diff --git a/rinex/src/split.rs b/rinex/src/split.rs index a104d6c97..ffaaf9055 100644 --- a/rinex/src/split.rs +++ b/rinex/src/split.rs @@ -6,6 +6,8 @@ use thiserror::Error; pub enum Error { #[error("this record type is not indexed by epoch")] NoEpochIteration, + #[error("this record does not contained specified epoch")] + NonExistingEpoch, } pub trait Split { diff --git a/rinex/src/sv.rs b/rinex/src/sv.rs index be066e4e6..c5480e392 100644 --- a/rinex/src/sv.rs +++ b/rinex/src/sv.rs @@ -1,5 +1,7 @@ //! 
Satellite vehicle use super::{constellation, Constellation}; +use hifitime::Epoch; +use hifitime::TimeScale; use thiserror::Error; #[cfg(feature = "serde")] @@ -15,6 +17,11 @@ pub struct Sv { pub constellation: Constellation, } +/* + * Database, built by build.rs, for detailed SBAS vehicle identification + */ +include!(concat!(env!("OUT_DIR"), "/sbas.rs")); + /// ̀`Sv` parsing & identification related errors #[derive(Error, Debug, Clone, PartialEq)] pub enum ParsingError { @@ -29,30 +36,84 @@ impl Sv { pub fn new(constellation: Constellation, prn: u8) -> Self { Self { prn, constellation } } + /// Returns timescale associated to this SV + pub fn timescale(&self) -> Option { + self.constellation.timescale() + } + /* + * Tries to retrieve SBAS detailed definitions for self. + * For that, we use the PRN number (+100 for SBAS) + */ + pub(crate) fn sbas_definitions(&self) -> Option<&SBASHelper> { + let to_find = (self.prn as u16) + 100; + SBAS_VEHICLES + .iter() + .filter(|e| e.prn == to_find) + .reduce(|e, _| e) + } + /// Returns datetime at which Self was either launched or its serviced was deployed. + /// This only applies to SBAS vehicles. Datetime expressed as [Epoch] at midnight UTC. + pub fn launched_date(&self) -> Option { + let definition = self.sbas_definitions()?; + Some(Epoch::from_gregorian_utc_at_midnight( + definition.launched_year, + definition.launched_month, + definition.launched_day, + )) + } } impl std::str::FromStr for Sv { type Err = ParsingError; - /// Builds an `Sv` from XYY identification code. - /// code should strictly follow rinex conventions. - /// This method tolerates trailing whitespaces - fn from_str(s: &str) -> Result { - Ok(Sv { - constellation: Constellation::from_1_letter_code(&s[0..1])?, - prn: u8::from_str_radix(&s[1..].trim(), 10)?, - }) + /* + * Parse SV from "XYY" standardized format. + * On "sbas" crate feature, we have the ability to identify + * vehicles in detail. For example S23 is Eutelsat 5WB. 
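As a usage sketch of the behaviour described in this comment, mirroring the expectations encoded by the tests further down in this patch (the rinex::prelude re-exports and the assertion values are taken from those tests):

use rinex::prelude::*; // Sv / Constellation, as re-exported for the test suite
use std::str::FromStr;

fn main() {
    // Standard "XYY" parsing: one-letter constellation + zero-padded PRN
    let g08 = Sv::from_str("G08").unwrap();
    assert_eq!(g08, Sv::new(Constellation::GPS, 8));

    // With the built-in SBAS database, "S23" resolves to its actual system:
    // the tests below expect EGNOS, with "{:X}" printing the vehicle name.
    let s23 = Sv::from_str("S23").unwrap();
    assert_eq!(s23, Sv::new(Constellation::EGNOS, 23));
    assert_eq!(format!("{:x}", s23), "S23");        // standard code
    assert_eq!(format!("{:X}", s23), "ASTRA-5B");   // detailed identity
}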
+ */ + fn from_str(string: &str) -> Result { + let constellation = Constellation::from_str(&string[0..1])?; + let prn = string[1..].trim().parse::()?; + let mut ret = Sv::new(constellation, prn); + if constellation.is_sbas() { + // map the SXX to meaningful SBAS + if let Some(sbas) = ret.sbas_definitions() { + // this can't fail because the SBAS database only + // contains valid Constellations + ret.constellation = Constellation::from_str(sbas.constellation).unwrap(); + } + } + Ok(ret) + } +} + +impl std::fmt::UpperHex for Sv { + /* + * Possibly detailed identity for SBAS vehicles + */ + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + if let Some(sbas) = self.sbas_definitions() { + write!(f, "{}", sbas.id) + } else { + write!(f, "{:x}", self) + } + } +} + +impl std::fmt::LowerHex for Sv { + /* + * Prints self as XYY standard format + */ + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + write!(f, "{:x}{:02}", self.constellation, self.prn) } } impl std::fmt::Display for Sv { - /// Formats self as XYY RINEX three letter code - fn fmt(&self, fmt: &mut std::fmt::Formatter) -> std::fmt::Result { - write!( - fmt, - "{}{:02}", - self.constellation.to_1_letter_code(), - self.prn - ) + /* + * Prints self as XYY standard format + */ + fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { + write!(f, "{:x}{:02}", self.constellation, self.prn) } } @@ -75,11 +136,14 @@ mod test { ("R 9", Sv::new(Constellation::Glonass, 9)), ("I 3", Sv::new(Constellation::IRNSS, 3)), ("I16", Sv::new(Constellation::IRNSS, 16)), - ("S36", Sv::new(Constellation::Geo, 36)), - ("S 6", Sv::new(Constellation::Geo, 6)), ] { let sv = Sv::from_str(descriptor); - assert!(sv.is_ok(), "failed to parse sv from \"{}\"", descriptor); + assert!( + sv.is_ok(), + "failed to parse sv from \"{}\" - {:?}", + descriptor, + sv.err().unwrap() + ); let sv = sv.unwrap(); assert_eq!( sv, expected, @@ -88,4 +152,40 @@ mod test { ); } } + #[test] + fn from_str_with_sbas() { + for (desc, parsed, lowerhex, upperhex) in vec![ + ("S 3", Sv::new(Constellation::SBAS, 3), "S03", "S03"), + ( + "S22", + Sv::new(Constellation::AusNZ, 22), + "S22", + "INMARSAT-4F1", + ), + ("S23", Sv::new(Constellation::EGNOS, 23), "S23", "ASTRA-5B"), + ("S25", Sv::new(Constellation::SDCM, 25), "S25", "Luch-5A"), + ("S 5", Sv::new(Constellation::SBAS, 5), "S05", "S05"), + ("S48", Sv::new(Constellation::ASAL, 48), "S48", "ALCOMSAT-1"), + ] { + let sv = Sv::from_str(desc).unwrap(); + assert_eq!(sv, parsed, "failed to parse correct sv from \"{}\"", desc); + assert_eq!(format!("{:x}", sv), lowerhex); + assert_eq!(format!("{:X}", sv), upperhex); + } + } + #[test] + fn sbas_db_sanity() { + for sbas in SBAS_VEHICLES.iter() { + assert!( + Constellation::from_str(sbas.constellation).is_ok(), + "sbas database should only contain valid constellations: \"{}\"", + sbas.constellation, + ); + let _ = Epoch::from_gregorian_utc_at_midnight( + sbas.launched_year, + sbas.launched_month, + sbas.launched_day, + ); + } + } } diff --git a/rinex/src/tests/antex.rs b/rinex/src/tests/antex.rs index c6c53ff9a..ab0e6361c 100644 --- a/rinex/src/tests/antex.rs +++ b/rinex/src/tests/antex.rs @@ -8,17 +8,17 @@ mod test { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/ATX/V1/TROSAR25.R4__LEIT_2020_09_23.atx"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_antex(), true); + assert!(rinex.is_antex()); let header = 
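The SBAS_VEHICLES table itself is generated by rinex/build.rs from rinex/db/SBAS/sbas.json and is not visible in this patch; the sketch below only illustrates the shape implied by how it is consumed here (PRN matched against prn + 100, constellation fed to Constellation::from_str, launch date fed to Epoch::from_gregorian_utc_at_midnight, id printed verbatim). Field names, visibility and the sample entry are assumptions / placeholders, not the generated code.

// Assumed shape of the build-script output (illustrative only)
pub struct SBASHelper {
    pub constellation: &'static str, // e.g. "EGNOS"
    pub prn: u16,                    // SBAS PRN, i.e. 100 + S-code
    pub id: &'static str,            // vehicle name, e.g. "ASTRA-5B"
    pub launched_year: i32,
    pub launched_month: u8,
    pub launched_day: u8,
}

pub static SBAS_VEHICLES: &[SBASHelper] = &[SBASHelper {
    constellation: "EGNOS",
    prn: 123,          // placeholder values, not real launch data
    id: "ASTRA-5B",
    launched_year: 2014,
    launched_month: 1,
    launched_day: 1,
}];

fn main() {
    // mimics Sv::sbas_definitions(): S23 maps to PRN 123 in the table
    assert!(SBAS_VEHICLES.iter().any(|e| e.prn == 123));
}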
rinex.header; assert_eq!(header.version.major, 1); assert_eq!(header.version.minor, 4); - assert_eq!(header.antex.is_some(), true); + assert!(header.antex.is_some()); let atx_header = header.antex.as_ref().unwrap(); assert_eq!(atx_header.pcv, Pcv::Absolute); let record = rinex.record.as_antex(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); assert_eq!(record.len(), 1); // Only 1 antenna let (antenna, frequencies) = record.first().unwrap(); @@ -31,17 +31,17 @@ mod test { assert_eq!(antenna.dazi, 5.0); assert_eq!(antenna.zen, (0.0, 90.0)); assert_eq!(antenna.dzen, 5.0); - assert_eq!(antenna.valid_from.is_none(), true); - assert_eq!(antenna.valid_until.is_none(), true); + assert!(antenna.valid_from.is_none()); + assert!(antenna.valid_until.is_none()); for freq in frequencies.iter() { let first = freq.patterns.first(); - assert_eq!(first.is_some(), true); + assert!(first.is_some()); let first = first.unwrap(); - assert_eq!(first.is_azimuth_dependent(), false); + assert!(!first.is_azimuth_dependent()); let mut angle = 0.0_f64; for i in 1..freq.patterns.len() { let p = &freq.patterns[i]; - assert_eq!(p.is_azimuth_dependent(), true); + assert!(p.is_azimuth_dependent()); let (a, _) = p.azimuth_pattern().unwrap(); assert_eq!(angle, a); angle += antenna.dzen; diff --git a/rinex/src/tests/clocks.rs b/rinex/src/tests/clocks.rs index ad6239118..8abe8730f 100644 --- a/rinex/src/tests/clocks.rs +++ b/rinex/src/tests/clocks.rs @@ -1,25 +1,30 @@ #[cfg(test)] mod test { use crate::clocks; - use crate::clocks::record::{DataType, System}; + use crate::clocks::{ClockAnalysisAgency, ClockDataType, System}; use crate::prelude::*; #[test] fn v3_usno_example() { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/CLK/V3/USNO1.txt"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_clocks_rinex(), true); - assert_eq!(rinex.header.clocks.is_some(), true); + assert!(rinex.is_clocks_rinex()); + assert!(rinex.header.clocks.is_some()); let clocks = rinex.header.clocks.as_ref().unwrap(); assert_eq!( clocks.codes, - vec![DataType::AS, DataType::AR, DataType::CR, DataType::DR] + vec![ + ClockDataType::AS, + ClockDataType::AR, + ClockDataType::CR, + ClockDataType::DR + ] ); assert_eq!( clocks.agency, - Some(clocks::Agency { + Some(ClockAnalysisAgency { code: String::from("USN"), name: String::from("USNO USING GIPSY/OASIS-II"), }) @@ -33,21 +38,21 @@ mod test { ); assert_eq!(rinex.epoch().count(), 1); let record = rinex.record.as_clock(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); for (e, data_types) in record.iter() { assert_eq!(*e, Epoch::from_gregorian_utc(1994, 07, 14, 20, 59, 00, 00)); for (data_type, systems) in data_types.iter() { assert_eq!(systems.len(), 1); - if *data_type == DataType::AR { + if *data_type == ClockDataType::AR { for (system, data) in systems.iter() { assert_eq!(*system, System::Station("AREQ".to_string())); assert_eq!(data.bias, -0.123456789012); - assert_eq!(data.bias_sigma, Some(-1.23456789012E+0)); - assert_eq!(data.rate, Some(-12.3456789012)); - assert_eq!(data.rate_sigma, Some(-123.456789012)); + assert_eq!(data.bias_dev, Some(-1.23456789012E+0)); + assert_eq!(data.drift, Some(-12.3456789012)); + assert_eq!(data.drift_dev, Some(-123.456789012)); } - } else if *data_type == DataType::AS { + } else if *data_type == ClockDataType::AS { for (system, _) in 
systems.iter() { assert_eq!( *system, @@ -57,11 +62,11 @@ mod test { }) ); } - } else if *data_type == DataType::CR { + } else if *data_type == ClockDataType::CR { for (system, _) in systems.iter() { assert_eq!(*system, System::Station("USNO".to_string())); } - } else if *data_type == DataType::DR { + } else if *data_type == ClockDataType::DR { for (system, _) in systems.iter() { assert_eq!(*system, System::Station("USNO".to_string())); } @@ -76,27 +81,27 @@ mod test { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/CLK/V3/example1.txt"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_clocks_rinex(), true); - assert_eq!(rinex.header.clocks.is_some(), true); + assert!(rinex.is_clocks_rinex()); + assert!(rinex.header.clocks.is_some()); let clocks = rinex.header.clocks.as_ref().unwrap(); - assert_eq!(clocks.codes, vec![DataType::AS, DataType::AR]); + assert_eq!(clocks.codes, vec![ClockDataType::AS, ClockDataType::AR]); assert_eq!( clocks.agency, - Some(clocks::Agency { + Some(ClockAnalysisAgency { code: String::from("USN"), name: String::from("USNO USING GIPSY/OASIS-II"), }) ); assert_eq!(rinex.epoch().count(), 1); let record = rinex.record.as_clock(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); for (e, data_types) in record.iter() { assert_eq!(*e, Epoch::from_gregorian_utc(1994, 07, 14, 20, 59, 00, 00)); for (data_type, systems) in data_types.iter() { - if *data_type == DataType::AR { + if *data_type == ClockDataType::AR { assert_eq!(systems.len(), 4); for (system, data) in systems.iter() { let areq_usa = System::Station("AREQ00USA".to_string()); @@ -105,25 +110,25 @@ mod test { let hark = System::Station("HARK".to_string()); if *system == areq_usa { assert_eq!(data.bias, -0.123456789012); - assert_eq!(data.bias_sigma, Some(-0.123456789012E+01)); - assert_eq!(data.rate, Some(-0.123456789012E+02)); - assert_eq!(data.rate_sigma, Some(-0.123456789012E+03)); + assert_eq!(data.bias_dev, Some(-0.123456789012E+01)); + assert_eq!(data.drift, Some(-0.123456789012E+02)); + assert_eq!(data.drift_dev, Some(-0.123456789012E+03)); } else if *system == gold { assert_eq!(data.bias, -0.123456789012E-01); - assert_eq!(data.bias_sigma, Some(-0.123456789012E-02)); - assert_eq!(data.rate, Some(-0.123456789012E-03)); - assert_eq!(data.rate_sigma, Some(-0.123456789012E-04)); + assert_eq!(data.bias_dev, Some(-0.123456789012E-02)); + assert_eq!(data.drift, Some(-0.123456789012E-03)); + assert_eq!(data.drift_dev, Some(-0.123456789012E-04)); } else if *system == tidb { assert_eq!(data.bias, 0.123456789012E+00); - assert_eq!(data.bias_sigma, Some(0.123456789012E+00)); + assert_eq!(data.bias_dev, Some(0.123456789012E+00)); } else if *system == hark { assert_eq!(data.bias, 0.123456789012E+00); - assert_eq!(data.bias_sigma, Some(0.123456789012E+00)); + assert_eq!(data.bias_dev, Some(0.123456789012E+00)); } else { panic!("falsely identified system \"{}\"", *system); } } - } else if *data_type == DataType::AS { + } else if *data_type == ClockDataType::AS { assert_eq!(systems.len(), 1); for (system, data) in systems.iter() { assert_eq!( @@ -134,7 +139,7 @@ mod test { }) ); assert_eq!(data.bias, -0.123456789012E+00); - assert_eq!(data.bias_sigma, Some(-0.123456789012E-01)); + assert_eq!(data.bias_dev, Some(-0.123456789012E-01)); } } else { panic!("identified unexpected data type \"{}\"", data_type); @@ -147,29 +152,29 @@ mod test { let 
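A short reading sketch for the renamed clock fields (bias, bias_dev, drift, drift_dev), following the accessors used by these tests; the file path is one of the test resources above and error handling is kept minimal.

use rinex::prelude::*;

fn main() {
    let rinex = Rinex::from_file("../test_resources/CLK/V3/USNO1.txt").unwrap();
    if let Some(record) = rinex.record.as_clock() {
        for (epoch, data_types) in record.iter() {
            for (data_type, systems) in data_types.iter() {
                for (system, data) in systems.iter() {
                    // `bias` is mandatory; the deviation and drift terms are
                    // optional, under the new names introduced by this patch
                    println!(
                        "{} {} {}: bias={} bias_dev={:?} drift={:?} drift_dev={:?}",
                        epoch, data_type, system,
                        data.bias, data.bias_dev, data.drift, data.drift_dev
                    );
                }
            }
        }
    }
}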
test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/CLK/V3/example2.txt"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_clocks_rinex(), true); - assert_eq!(rinex.header.clocks.is_some(), true); + assert!(rinex.is_clocks_rinex()); + assert!(rinex.header.clocks.is_some()); let clocks = rinex.header.clocks.as_ref().unwrap(); - assert_eq!(clocks.codes, vec![DataType::AR, DataType::AS]); + assert_eq!(clocks.codes, vec![ClockDataType::AR, ClockDataType::AS]); assert_eq!( clocks.agency, - Some(clocks::Agency { + Some(ClockAnalysisAgency { code: String::from("IGS"), name: String::from("IGSACC @ GA and MIT"), }) ); assert_eq!(rinex.epoch().count(), 1); let record = rinex.record.as_clock(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); //let record = record.unwrap(); /*for (e, data_types) in record.iter() { assert_eq!(*e, Epoch::from_gregorian_utc(2017, 03, 11, 00, 00, 00, 00)); for (data_type, systems) in data_types.iter() { - if *data_type == DataType::AR { + if *data_type == ClockDataType::AR { assert_eq!(systems.len(), 4); - } else if *data_type == DataType::AS { + } else if *data_type == ClockDataType::AS { assert_eq!(systems.len(), 2); } else { panic!("identified unexpected data type \"{}\"", data_type); diff --git a/rinex/src/tests/compression.rs b/rinex/src/tests/compression.rs index be8a93ccd..8b40f6f39 100644 --- a/rinex/src/tests/compression.rs +++ b/rinex/src/tests/compression.rs @@ -1,16 +1,15 @@ #[cfg(test)] mod test { use crate::prelude::*; - use crate::tests::toolkit::*; + use crate::tests::toolkit::{random_name, test_against_model}; use std::path::PathBuf; #[test] #[ignore] fn crinex1() { let pool = vec![ - ("AJAC3550.21D", "AJAC3550.21O"), - ("aopr0010.17d", "aopr0010.17o"), - ("npaz3550.21d", "npaz3550.21o"), - ("pdel0010.21d", "pdel0010.21o"), + //("AJAC3550.21D", "AJAC3550.21O"), + //("aopr0010.17d", "aopr0010.17o"), + //("npaz3550.21d", "npaz3550.21o"), ("wsra0010.21d", "wsra0010.21o"), ("zegv0010.21d", "zegv0010.21o"), ]; @@ -19,18 +18,24 @@ mod test { let crnx_path = PathBuf::new() .join(env!("CARGO_MANIFEST_DIR")) - .join("CRNX/V1") + .join("../") + .join("test_resources") + .join("CRNX") + .join("V1") .join(crnx_name); let rnx_path = PathBuf::new() .join(env!("CARGO_MANIFEST_DIR")) - .join("OBS/V2") + .join("../") + .join("test_resources") + .join("OBS") + .join("V2") .join(rnx_name); let rnx = Rinex::from_file(&rnx_path.to_string_lossy()); assert!( rnx.is_ok(), - "failed to parse test file \"{}\"", + "failed to parse \"{}\"", rnx_path.to_string_lossy() ); let rnx = rnx.unwrap(); @@ -49,10 +54,12 @@ mod test { // compare to CRINEX1 model let model = model.unwrap(); - compare_with_panic( + let epsilon = 1.0E-3; // CRNX2RNX is not a lossless compression + test_against_model( &dut, &model, &format!("compression::crinx1::{}", rnx_path.to_string_lossy()), + epsilon, ); } } @@ -63,7 +70,6 @@ mod test { ("AJAC3550.21O"), ("aopr0010.17o"), ("npaz3550.21o"), - ("pdel0010.21o"), ("wsra0010.21o"), ("zegv0010.21o"), ]; @@ -99,6 +105,110 @@ mod test { testfile ); + // remove generated file + let _ = std::fs::remove_file(&tmp_path); + } + } + #[test] + #[ignore] + fn crinex3() { + let pool = vec![ + ( + "ACOR00ESP_R_20213550000_01D_30S_MO.crx", + "ACOR00ESP_R_20213550000_01D_30S_MO.rnx", + ), + ("DUTH0630.22D", "DUTH0630.22O"), + ("VLNS0010.22D", "VLNS0010.22O"), + ("VLNS0630.22D", "VLNS0630.22O"), + ("flrs0010.12d", "flrs0010.12o"), 
+ ("pdel0010.21d", "pdel0010.21o"), + ]; + for duplet in pool { + let (crnx_name, rnx_name) = duplet; + + let crnx_path = PathBuf::new() + .join(env!("CARGO_MANIFEST_DIR")) + .join("../") + .join("test_resources") + .join("CRNX") + .join("V3") + .join(crnx_name); + + let rnx_path = PathBuf::new() + .join(env!("CARGO_MANIFEST_DIR")) + .join("../") + .join("test_resources") + .join("OBS") + .join("V3") + .join(rnx_name); + + let rnx = Rinex::from_file(&rnx_path.to_string_lossy()); + assert!( + rnx.is_ok(), + "failed to parse \"{}\"", + rnx_path.to_string_lossy() + ); + let rnx = rnx.unwrap(); + + // convert to CRINEX3 + println!("compressing \"{}\"..", rnx_path.to_string_lossy()); + let dut = rnx.rnx2crnx1(); + + // parse model + let model = Rinex::from_file(&crnx_path.to_string_lossy()); + assert!( + model.is_ok(), + "failed to parse test file \"{}\"", + crnx_path.to_string_lossy() + ); + + // compare + let model = model.unwrap(); + let epsilon = 1.0E-3; // CRNX2RNX is not a lossless compression + test_against_model( + &dut, + &model, + &format!("compression::crinx3::{}", rnx_path.to_string_lossy()), + epsilon, + ); + } + } + #[test] + #[ignore] + fn crinex3_reciprocity() { + let pool = vec![("pdel0010.21o")]; + for testfile in pool { + let rnx_path = format!("../test_resources/OBS/V3/{}", testfile); + + let rnx = Rinex::from_file(&rnx_path); + assert!( + rnx.is_ok(), + "Failed to parse test pool file \"{}\"", + testfile + ); + + // compress + let rnx = rnx.unwrap(); + let compressed = rnx.rnx2crnx1(); + + let tmp_path = format!("test-{}.crx", random_name(8)); + + assert!( + compressed.to_file(&tmp_path).is_ok(), + "{}{}", + "failed to format compressed rinex", + testfile + ); + + // test reciprocity + let uncompressed = compressed.crnx2rnx(); + assert!( + rnx == uncompressed, + "{}{}", + "reciprocity test failed for \"{}\"", + testfile + ); + // remove generated file let _ = std::fs::remove_file(&tmp_path); } diff --git a/rinex/src/tests/decompression.rs b/rinex/src/tests/decompression.rs index 72c619712..eee0080d3 100644 --- a/rinex/src/tests/decompression.rs +++ b/rinex/src/tests/decompression.rs @@ -1,7 +1,11 @@ #[cfg(test)] mod test { use crate::hatanaka::Decompressor; + use crate::tests::toolkit::random_name; + use crate::tests::toolkit::test_observation_rinex; + use crate::{erratic_time_frame, evenly_spaced_time_frame, tests::toolkit::TestTimeFrame}; use crate::{observable, prelude::*}; + use itertools::Itertools; use std::collections::HashMap; use std::path::Path; use std::str::FromStr; @@ -10,7 +14,7 @@ mod test { let pool = vec![ ("zegv0010.21d", "zegv0010.21o"), ("AJAC3550.21D", "AJAC3550.21O"), - ("KOSG0010.95D", "KOSG0010.95O"), + //("KOSG0010.95D", "KOSG0010.95O"), //TODO@ fix tests/obs/v2_kosg first ("aopr0010.17d", "aopr0010.17o"), ("npaz3550.21d", "npaz3550.21o"), ("wsra0010.21d", "wsra0010.21o"), @@ -21,14 +25,13 @@ mod test { let path = format!("../test_resources/CRNX/V1/{}", crnx_name); let crnx = Rinex::from_file(&path); - assert_eq!(crnx.is_ok(), true); + assert!(crnx.is_ok()); let mut rnx = crnx.unwrap(); - assert_eq!(rnx.header.obs.is_some(), true); - let obs = rnx.header.obs.as_ref().unwrap(); + let header = rnx.header.obs.as_ref().unwrap(); - assert_eq!(obs.crinex.is_some(), true); - let infos = obs.crinex.as_ref().unwrap(); + assert!(header.crinex.is_some()); + let infos = header.crinex.as_ref().unwrap(); if crnx_name.eq("zegv0010.21d") { assert_eq!(infos.version.major, 1); @@ -38,6 +41,21 @@ mod test { infos.date, Epoch::from_gregorian_utc(2021, 01, 02, 00, 01, 
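The reciprocity check exercised above boils down to the pattern below (method names as introduced in this patch; the path is one of the listed test resources):

use rinex::prelude::*;

fn main() {
    let rnx = Rinex::from_file("../test_resources/OBS/V3/pdel0010.21o").unwrap();

    let compressed = rnx.rnx2crnx1();                 // plain OBS -> CRINEX
    assert!(compressed.to_file("example.crx").is_ok());

    let recovered = compressed.crnx2rnx();            // back to plain OBS
    assert!(rnx == recovered, "CRINEX reciprocity failed");

    let _ = std::fs::remove_file("example.crx");
}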
00, 00) ); + test_observation_rinex( + &rnx, + "2.11", + Some("MIXED"), + "GPS, GLO", + "G07, G08, G10, G13, G15, G16, G18, G20, G21, G23, G26, G27, G30, R01, R02, R03, R08, R09, R15, R16, R17, R18, R19, R24", + "C1, C2, C5, L1, L2, L5, P1, P2, S1, S2, S5", + Some("2021-01-01T00:00:00 GPST"), + Some("2021-01-01T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2021-01-01T00:00:00 GPST", + "2021-01-01T00:09:00 GPST", + "30 s" + ), + ); } else if crnx_name.eq("npaz3550.21d") { assert_eq!(infos.version.major, 1); assert_eq!(infos.version.minor, 0); @@ -46,30 +64,112 @@ mod test { infos.date, Epoch::from_gregorian_utc(2021, 12, 28, 00, 18, 00, 00) ); - } else if crnx_name.eq("pdel0010.21d") { - assert_eq!(infos.version.major, 1); - assert_eq!(infos.version.minor, 0); - assert_eq!(infos.prog, "RNX2CRX ver.4.0.7"); - assert_eq!( - infos.date, - Epoch::from_gregorian_utc(2021, 01, 09, 00, 24, 00, 00) + + test_observation_rinex( + &rnx, + "2.11", + Some("MIXED"), + "GPS, GLO", + "G08,G10,G15,G16,G18,G21,G23,G26,G32,R04,R05,R06,R10,R12,R19,R20,R21", + "C1, L1, L2, P2, S1, S2", + Some("2021-12-21T00:00:00 GPST"), + Some("2021-12-21T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2021-12-21T00:00:00 GPST", + "2021-12-21T01:04:00 GPST", + "30 s" + ), + ); + } else if crnx_name.eq("wsra0010.21d") { + test_observation_rinex( + &rnx, + "2.11", + Some("MIXED"), + "GPS, GLO", + "R09, R02, G07, G13, R17, R16, R01, G18, G26, G10, G30, G23, G27, G08, R18, G20, R15, G21, G15, R24, G16", + "L1, L2, C1, P2, P1, S1, S2", + Some("2021-01-01T00:00:00 GPST"), + None, + evenly_spaced_time_frame!( + "2021-01-01T00:00:00 GPST", + "2021-01-01T00:08:00 GPST", + "30 s" + ), + ); + } else if crnx_name.eq("aopr0010.17d") { + test_observation_rinex( + &rnx, + "2.10", + Some("GPS"), + "GPS", + "G31, G27, G03, G32, G16, G08, G14, G23, G22, G26", + "C1, L1, L2, P1, P2", + Some("2017-01-01T00:00:00 GPST"), + None, + erratic_time_frame!( + " + 2017-01-01T00:00:00 GPST, + 2017-01-01T03:33:40 GPST, + 2017-01-01T06:09:10 GPST + " + ), + ); + //} else if crnx_name.eq("KOSG0010.95D") { + // test_observation_rinex( + // &rnx, + // "2.0", + // Some("GPS"), + // "GPS", + // "G01, G04, G05, G06, G16, G17, G18, G19, G20, G21, G22, G23, G24, G25, G27, G29, G31", + // "C1, L1, L2, P2, S1", + // Some("1995-01-01T00:00:00 GPST"), + // Some("1995-01-01T23:59:30 GPST"), + // erratic_time_frame!(" + // 1995-01-01T00:00:00 GPST, + // 1995-01-01T11:00:00 GPST, + // 1995-01-01T20:44:30 GPST + // "), + // ); + } else if crnx_name.eq("AJAC3550.21D") { + test_observation_rinex( + &rnx, + "2.11", + Some("MIXED"), + "GPS, GLO, GAL, EGNOS", + "G07, G08, G10, G16, G18, G21, G23, G26, G32, R04, R05, R10, R12, R19, R20, R21, E04, E11, E12, E19, E24, E25, E31, E33, S23, S36", + "L1, L2, C1, C2, P1, P2, D1, D2, S1, S2, L5, C5, D5, S5, L7, C7, D7, S7, L8, C8, D8, S8", + Some("2021-12-21T00:00:00 GPST"), + None, + evenly_spaced_time_frame!( + "2021-12-21T00:00:00 GPST", + "2021-12-21T00:00:30 GPST", + "30 s"), ); } - - // convert to RINEX + // decompress and write to file rnx.crnx2rnx_mut(); - + let filename = format!("{}.rnx", random_name(10)); + assert!( + rnx.to_file(&filename).is_ok(), + "failed to dump \"{}\" after decompression", + crnx_name + ); + + // then run comparison with model let obs = rnx.header.obs.as_ref().unwrap(); - assert_eq!(obs.crinex.is_some(), false); + assert!(!obs.crinex.is_some()); - // parse Model for testbench + // parse plain RINEX and run reciprocity let path = format!("../test_resources/OBS/V2/{}", rnx_name); let model = 
Rinex::from_file(&path); assert!(model.is_ok(), "Failed to parse test model \"{}\"", path); //let model = model.unwrap(); // run testbench - // test_toolkit::compare_with_panic(&rnx, &model, &path); + // test_toolkit::test_against_model(&rnx, &model, &path); + + // remove copy + let _ = std::fs::remove_file(filename); } } #[test] @@ -94,11 +194,11 @@ mod test { let path = format!("../test_resources/CRNX/V3/{}", crnx_name); let crnx = Rinex::from_file(&path); - assert_eq!(crnx.is_ok(), true); + assert!(crnx.is_ok()); let mut rnx = crnx.unwrap(); - assert_eq!(rnx.header.obs.is_some(), true); + assert!(rnx.header.obs.is_some()); let obs = rnx.header.obs.as_ref().unwrap(); - assert_eq!(obs.crinex.is_some(), true); + assert!(obs.crinex.is_some()); let infos = obs.crinex.as_ref().unwrap(); if crnx_name.eq("ACOR00ESP_R_20213550000_01D_30S_MO.crx") { @@ -115,7 +215,7 @@ mod test { rnx.crnx2rnx_mut(); let obs = rnx.header.obs.as_ref().unwrap(); - assert_eq!(obs.crinex.is_some(), false); + assert!(!obs.crinex.is_some()); // parse Model for testbench let path = format!("../test_resources/OBS/V3/{}", rnx_name); @@ -123,7 +223,7 @@ mod test { assert!(model.is_ok(), "Failed to parse test model \"{}\"", path); // run testbench - // test_toolkit::compare_with_panic(&rnx, &model, &path); + // test_toolkit::test_against_model(&rnx, &model, &path); } } /* @@ -171,31 +271,26 @@ mod test { .join("V1") .join("zegv0010.21d"); let fullpath = path.to_string_lossy(); - let rnx = Rinex::from_file(&fullpath.to_string()); + let rnx = Rinex::from_file(fullpath.as_ref()); + assert!(rnx.is_ok(), "failed to parse CRNX/V1/zegv0010.21d"); let rnx = rnx.unwrap(); - let epochs = vec![ - Epoch::from_str("2021-01-01T00:00:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:00:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:01:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:01:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:02:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:02:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:03:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:03:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:04:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:04:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:05:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:05:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:06:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:06:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:07:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:07:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:08:00 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:08:30 GPST").unwrap(), - Epoch::from_str("2021-01-01T00:09:00 GPST").unwrap(), - ]; - assert!(rnx.epoch().eq(epochs), "Parsed wrong epoch content",); + + test_observation_rinex( + &rnx, + "2.11", + Some("MIXED"), + "GPS, GLO", + "G07, G08, G10, G13, G15, G16, G18, G20, G21, G23, G26, G27, G30, R01, R02, R03, R08, R09, R15, R16, R17, R18, R19, R24", + "C1, C2, C5, L1, L2, L5, P1, P2, S1, S2, S5", + Some("2021-01-01T00:00:00 GPST"), + Some("2021-01-01T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2021-01-01T00:00:00 GPST", + "2021-01-01T00:09:00 GPST", + "30 s" + ), + ); let record = rnx.record.as_obs().unwrap(); @@ -206,7 +301,7 @@ mod test { assert_eq!(vehicles.len(), 24); for (sv, observations) in vehicles { if *sv == Sv::new(Constellation::GPS, 07) { - let mut keys: Vec<_> = observations.keys().map(|k| k.clone()).collect(); + let mut keys: Vec<_> = 
observations.keys().cloned().collect(); keys.sort(); let mut expected: Vec = "C1 C2 L1 L2 P1 P2 S1 S2" .split_ascii_whitespace() @@ -246,8 +341,7 @@ mod test { .unwrap(); assert_eq!(s2.obs, 22.286); } else if *sv == Sv::new(Constellation::GPS, 08) { - let mut keys: Vec = - observations.keys().map(|k| k.clone()).collect(); + let mut keys: Vec = observations.keys().cloned().collect(); keys.sort(); let mut expected: Vec = "C1 C2 C5 L1 L2 L5 P1 P2 S1 S2 S5" .split_ascii_whitespace() @@ -299,8 +393,7 @@ mod test { .unwrap(); assert_eq!(s5.obs, 52.161); } else if *sv == Sv::new(Constellation::GPS, 13) { - let mut keys: Vec = - observations.keys().map(|k| k.clone()).collect(); + let mut keys: Vec = observations.keys().cloned().collect(); keys.sort(); let mut expected: Vec = "C1 L1 L2 P1 P2 S1 S2" .split_ascii_whitespace() @@ -355,13 +448,13 @@ mod test { .join("V3") .join("ACOR00ESP_R_20213550000_01D_30S_MO.crx"); let fullpath = path.to_string_lossy(); - let crnx = Rinex::from_file(&fullpath.to_string()); - assert_eq!(crnx.is_ok(), true); + let crnx = Rinex::from_file(fullpath.as_ref()); + assert!(crnx.is_ok()); let rnx = crnx.unwrap(); - assert_eq!(rnx.header.obs.is_some(), true); + assert!(rnx.header.obs.is_some()); let obs = rnx.header.obs.as_ref().unwrap(); - assert_eq!(obs.crinex.is_some(), true); + assert!(obs.crinex.is_some()); let infos = obs.crinex.as_ref().unwrap(); assert_eq!(infos.version.major, 3); @@ -372,44 +465,22 @@ mod test { Epoch::from_gregorian_utc(2021, 12, 28, 01, 01, 00, 00) ); - //assert!( - // rinex.sv_epoch() - // .sorted() - // .eq( - - // ) - // ), - // "sv_epoch() failed", - //); - - let epochs: Vec = vec![ - Epoch::from_str("2021-12-21T00:00:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:00:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:01:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:01:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:02:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:02:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:03:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:03:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:04:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:04:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:05:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:05:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:06:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:06:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:07:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:07:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:08:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:08:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:09:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:09:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:10:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:10:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:11:00 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:11:30 GPST").unwrap(), - Epoch::from_str("2021-12-21T00:12:00 GPST").unwrap(), - ]; - assert!(rnx.epoch().eq(epochs.clone()), "parsed wrong epoch content"); + test_observation_rinex( + &rnx, + "3.04", + Some("MIXED"), + "GPS, GLO, GAL, BDS", + "G01, G07, G08, G10, G16, G18, G21, G23, G26, G30, R04, R05, R10, R12, R20, R21, E02, E11, E12, E24, E25, E31, E33, E36, C05, C11, C14, C21, C22, C23, C25, C28, C34, C37, C42, C43, C44, C58", + "C1C, L1C, S1C, C2S, L2S, S2S, C2W, L2W, S2W, C5Q, L5Q, S5Q, C1C, L1C, S1C, C2P, L2P, S2P, C2C, L2C, S2C, C3Q, L3Q, S3Q, C1C, L1C, S1C, 
C5Q, L5Q, S5Q, C6C, L6C, S6C, C7Q, L7Q, S7Q, C8Q, L8Q, S8Q, C2I, L2I, S2I, C6I, L6I, S6I, C7I, L7I, S7I", + Some("2021-12-21T00:00:00 GPST"), + Some("2021-12-21T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2021-12-21T00:00:00 GPST", + "2021-12-21T00:12:00 GPST", + "30 s" + ), + ); + /* * record test */ @@ -419,16 +490,14 @@ mod test { assert!(clk_offset.is_none()); } - for e_index in 0..epochs.len() { - let e = epochs.get(e_index).unwrap(); - let flag = EpochFlag::Ok; - let (_, vehicles) = record.get(&(*e, flag)).unwrap(); + for (e_index, epoch) in rnx.epoch().enumerate() { + let (_, vehicles) = record.get(&(epoch, EpochFlag::Ok)).unwrap(); if e_index == 0 { /* * 1st epoch */ assert_eq!(vehicles.len(), 38); - let keys: Vec<_> = vehicles.keys().map(|sv| *sv).collect(); + let keys: Vec<_> = vehicles.keys().copied().collect(); let mut expected: Vec = vec![ Sv::new(Constellation::GPS, 01), Sv::new(Constellation::GPS, 07), @@ -471,12 +540,12 @@ mod test { ]; expected.sort(); assert_eq!(keys, expected); - } else if e_index == epochs.len() - 1 { + } else if e_index == rnx.epoch().count() - 1 { /* * last epoch */ assert_eq!(vehicles.len(), 38); - let keys: Vec<_> = vehicles.keys().map(|sv| *sv).collect(); + let keys: Vec<_> = vehicles.keys().copied().collect(); let mut expected: Vec = vec![ Sv::new(Constellation::GPS, 01), Sv::new(Constellation::GPS, 07), @@ -521,7 +590,7 @@ mod test { assert_eq!(keys, expected); let c58 = vehicles.get(&Sv::new(Constellation::BeiDou, 58)).unwrap(); - let mut keys: Vec = c58.keys().map(|k| k.clone()).collect(); + let mut keys: Vec = c58.keys().cloned().collect(); keys.sort(); let mut expected: Vec = "C2I L2I S2I" @@ -550,14 +619,13 @@ mod test { fn v3_mojn00dnk_sig_strength_regression() { let crnx = Rinex::from_file("../test_resources/CRNX/V3/MOJN00DNK_R_20201770000_01D_30S_MO.crx.gz"); - assert_eq!(crnx.is_ok(), true); + assert!(crnx.is_ok()); let rnx = crnx.unwrap(); - /* * Verify identified observables */ let obs = rnx.header.obs.unwrap().codes.clone(); - for constell in vec![Constellation::Glonass, Constellation::GPS] { + for constell in [Constellation::Glonass, Constellation::GPS] { let codes = obs.get(&constell); assert!(codes.is_some(), "MOJN00DNK_R_20201770000_01D_30S_MO: missing observable codes for constellation {:?}", constell); diff --git a/rinex/src/tests/ionex.rs b/rinex/src/tests/ionex.rs index 06dd6e18c..2b38dccfc 100644 --- a/rinex/src/tests/ionex.rs +++ b/rinex/src/tests/ionex.rs @@ -1,23 +1,114 @@ #[cfg(test)] mod test { use crate::prelude::*; + use std::path::Path; #[test] + #[cfg(feature = "flate2")] + fn v1_ckmg0090_12i() { + let path = Path::new(env!("CARGO_MANIFEST_DIR")) + .join("..") + .join("test_resources") + .join("IONEX") + .join("V1") + .join("CKMG0090.21I.gz"); + let fullpath = path.to_string_lossy(); + + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok(), "failed to parse IONEX/V1CKMG0090.21I.gz"); + + let rinex = rinex.unwrap(); + assert_eq!( + rinex.tec_fixed_altitude(), + Some(350.0), + "bad fixed altitude" + ); + assert_eq!( + rinex.tec_rms().count(), + 0, + "falsely identified some RMS maps" + ); + assert_eq!( + rinex.epoch().count(), + 25, + "wrong amount of epochs identified" + ); + assert_eq!( + rinex.first_epoch(), + Some(Epoch::from_gregorian_utc(2021, 1, 9, 0, 0, 0, 0)) + ); + assert_eq!( + rinex.last_epoch(), + Some(Epoch::from_gregorian_utc(2021, 1, 10, 0, 0, 0, 0)) + ); + assert_eq!( + rinex.dominant_sample_rate(), + Some(Duration::from_hours(1.0)), + "bad dominant sample rate identified" + ); 
+ } + #[test] + #[cfg(feature = "flate2")] + fn v1_jplg0010_17i() { + let path = Path::new(env!("CARGO_MANIFEST_DIR")) + .join("..") + .join("test_resources") + .join("IONEX") + .join("V1") + .join("jplg0010.17i.gz"); + let fullpath = path.to_string_lossy(); + + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok(), "failed to parse IONEX/jplg0010.17i.gz"); + + let rinex = rinex.unwrap(); + assert_eq!( + rinex.tec_fixed_altitude(), + Some(450.0), + "bad fixed altitude" + ); + assert!(rinex.tec_rms().count() > 0, "failed to identify RMS maps"); + assert!( + rinex.tec().count() > 0, + "failed to parse both RMS + TEC maps" + ); + assert_eq!( + rinex.tec().count(), + rinex.tec_rms().count(), + "this file contains one RMS map per TEC map" + ); + + assert_eq!( + rinex.dominant_sample_rate(), + Some(Duration::from_hours(2.0)), + "bad dominant sample rate identified" + ); + } + #[test] + #[cfg(feature = "flate2")] fn v1_ckmg0020_22i() { - let test_resource = - env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/IONEX/V1/CKMG0020.22I.gz"; - let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + let path = Path::new(env!("CARGO_MANIFEST_DIR")) + .join("..") + .join("test_resources") + .join("IONEX") + .join("V1") + .join("CKMG0020.22I.gz"); + let fullpath = path.to_string_lossy(); + + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok(), "failed to parse IONEX/V1/CKMG0020.22I.gz"); + let rinex = rinex.unwrap(); - assert_eq!(rinex.is_ionex(), true); + assert!(rinex.is_ionex()); let header = rinex.header.clone(); assert_eq!(header.version.major, 1); assert_eq!(header.version.minor, 0); - assert_eq!(header.ionex.is_some(), true); + assert!(header.ionex.is_some()); let header = header.ionex.as_ref().unwrap(); + let grid = header.grid.clone(); assert_eq!(grid.height.start, 350.0); assert_eq!(grid.height.end, 350.0); - assert_eq!(rinex.is_ionex_2d(), true); + assert!(rinex.is_ionex_2d()); assert_eq!(grid.latitude.start, 87.5); assert_eq!(grid.latitude.end, -87.5); assert_eq!(grid.latitude.spacing, -2.5); @@ -29,81 +120,86 @@ mod test { assert_eq!(header.elevation_cutoff, 0.0); assert_eq!(header.mapping, None); - let record = rinex.record.as_ionex(); - assert_eq!(record.is_some(), true); - let record = record.unwrap(); - assert_eq!(record.len(), 25); - - // test: this is a 2D IONEX - for (_, (_, rms, h)) in record { - assert_eq!(h.is_none(), true); - assert_eq!(rms.is_none(), true); - } + assert_eq!( + rinex.tec_fixed_altitude(), + Some(350.0), + "bad fixed altitude" + ); + assert_eq!( + rinex.tec_rms().count(), + 0, + "falsely identified some RMS maps" + ); + assert_eq!( + rinex.dominant_sample_rate(), + Some(Duration::from_hours(1.0)), + "bad dominant sample rate identified" + ); // epoch [1] - let e = Epoch::from_gregorian_utc(2022, 1, 2, 0, 0, 0, 0); - let data = record.get(&e); - let (tec, _, _) = data.unwrap(); - for p in tec { - assert_eq!(p.altitude, 350.0); - if p.latitude == 87.5 { - if p.longitude == -180.0 { - assert!((p.value - 9.2).abs() < 1E-3); - } - if p.longitude == -175.0 { - assert!((p.value - 9.2).abs() < 1E-3); - } - } - if p.latitude == 85.0 { - if p.longitude == -180.0 { - assert!((p.value - 9.2).abs() < 1E-3); - } - } - if p.latitude == 32.5 { - if p.longitude == -180.0 { - assert!((p.value - 17.7).abs() < 1E-3); - } - if p.longitude == -175.0 { - assert!((p.value - 16.7).abs() < 1E-3); - } - } - } - // epoch [N-2] - let e = Epoch::from_gregorian_utc(2022, 1, 2, 23, 0, 0, 0); - let data = record.get(&e); - 
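The high-level IONEX accessors exercised by these tests combine as in the sketch below (names as they appear in this patch; parsing the gzipped resource assumes the flate2 build feature):

use rinex::prelude::*;

fn main() {
    let rinex = Rinex::from_file("../test_resources/IONEX/V1/CKMG0090.21I.gz").unwrap();
    assert!(rinex.is_ionex());

    println!("fixed altitude : {:?}", rinex.tec_fixed_altitude());
    println!("epochs         : {}", rinex.epoch().count());
    println!("sample rate    : {:?}", rinex.dominant_sample_rate());
    println!("TEC maps       : {}", rinex.tec().count());
    println!("RMS maps       : {}", rinex.tec_rms().count());
}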
let (tec, _, _) = data.unwrap(); - for p in tec { - assert_eq!(p.altitude, 350.0); - if p.latitude == 87.5 { - if p.longitude == -180.0 { - assert!((p.value - 9.2).abs() < 1E-3); - } - if p.longitude == -175.0 { - assert!((p.value - 9.2).abs() < 1E-3); - } - } - if p.latitude == 27.5 { - if p.longitude == -180.0 { - assert!((p.value - 21.6).abs() < 1E-3); - } - if p.longitude == -175.0 { - assert!((p.value - 21.4).abs() < 1E-3); - } - } - if p.latitude == 25.0 { - if p.longitude == -180.0 { - assert!((p.value - 23.8).abs() < 1E-3); - } - if p.longitude == -175.0 { - assert!((p.value - 23.8).abs() < 1E-3); - } - if p.longitude == 170.0 { - assert!((p.value - 23.2).abs() < 1E-3); - } - if p.longitude == 160.0 { - assert!((p.value - 21.8).abs() < 1E-3); - } - } - } + // let e = Epoch::from_gregorian_utc(2022, 1, 2, 0, 0, 0, 0); + // let data = record.get(&e); + // let (tec, _, _) = data.unwrap(); + // for p in tec { + // assert_eq!(p.altitude, 350.0); + // if p.latitude == 87.5 { + // if p.longitude == -180.0 { + // assert!((p.value - 9.2).abs() < 1E-3); + // } + // if p.longitude == -175.0 { + // assert!((p.value - 9.2).abs() < 1E-3); + // } + // } + // if p.latitude == 85.0 { + // if p.longitude == -180.0 { + // assert!((p.value - 9.2).abs() < 1E-3); + // } + // } + // if p.latitude == 32.5 { + // if p.longitude == -180.0 { + // assert!((p.value - 17.7).abs() < 1E-3); + // } + // if p.longitude == -175.0 { + // assert!((p.value - 16.7).abs() < 1E-3); + // } + // } + // } + // // epoch [N-2] + // let e = Epoch::from_gregorian_utc(2022, 1, 2, 23, 0, 0, 0); + // let data = record.get(&e); + // let (tec, _, _) = data.unwrap(); + // for p in tec { + // assert_eq!(p.altitude, 350.0); + // if p.latitude == 87.5 { + // if p.longitude == -180.0 { + // assert!((p.value - 9.2).abs() < 1E-3); + // } + // if p.longitude == -175.0 { + // assert!((p.value - 9.2).abs() < 1E-3); + // } + // } + // if p.latitude == 27.5 { + // if p.longitude == -180.0 { + // assert!((p.value - 21.6).abs() < 1E-3); + // } + // if p.longitude == -175.0 { + // assert!((p.value - 21.4).abs() < 1E-3); + // } + // } + // if p.latitude == 25.0 { + // if p.longitude == -180.0 { + // assert!((p.value - 23.8).abs() < 1E-3); + // } + // if p.longitude == -175.0 { + // assert!((p.value - 23.8).abs() < 1E-3); + // } + // if p.longitude == 170.0 { + // assert!((p.value - 23.2).abs() < 1E-3); + // } + // if p.longitude == 160.0 { + // assert!((p.value - 21.8).abs() < 1E-3); + // } + // } + // } } } diff --git a/rinex/src/tests/masking.rs b/rinex/src/tests/masking.rs index ca0526eb3..c99b46875 100644 --- a/rinex/src/tests/masking.rs +++ b/rinex/src/tests/masking.rs @@ -1,46 +1,44 @@ #[cfg(test)] mod test { - use rinex::filter; - use rinex::prelude::*; - use rinex::preprocessing::*; + use crate::filter; + use crate::prelude::*; + use crate::preprocessing::*; use std::str::FromStr; #[test] - fn v3_duth0630_g01_g02_filter() { - let mut rnx = Rinex::from_file("../test_resources/OBS/V3/DUTH0630.22O").unwrap(); - rnx.filter_mut(filter!("G01,G03")); + fn sv_filter_v3_duth0630() { + let rnx = Rinex::from_file("../test_resources/OBS/V3/DUTH0630.22O").unwrap(); + let rnx = rnx.filter(filter!("G01,G03")); assert_eq!(rnx.sv().count(), 2); } #[test] - fn v3_duth0630_gps_filter() { + #[ignore] + fn gnss_filter_v3_duth0630() { let mut rnx = Rinex::from_file("../test_resources/OBS/V3/DUTH0630.22O").unwrap(); rnx.filter_mut(filter!("GPS")); assert_eq!(rnx.sv().count(), 12); } - //#[test] - //fn v3_duth0630_gps_prn_filter() { - // let mut rnx = 
Rinex::from_file("../test_resources/OBS/V3/DUTH0630.22O").unwrap(); - // rnx.filter_mut(filter!(">=G26")); - // assert_eq!(rnx.sv().len(), 2); - //} - //#[test] - //fn v2_cari0010_07m_phys_filter() { - // let mut rnx = Rinex::from_file("../test_resources/MET/V2/cari0010.07m").unwrap(); - // let rnx = rnx.filter(filter!("L1C")); - // assert_eq!(rnx.observables().len(), 3); - // let rnx = rnx.filter(filter!("TD")); - // assert_eq!(rnx.observables().len(), 1); - //} - //#[test] - //fn v2_clar0020_00m_phys_filter() { - // let mut rnx = Rinex::from_file("../test_resources/MET/V2/clar0020.00m").unwrap(); - // rnx.filter_mut(filter!("L1C")); - // assert_eq!(rnx.observables().len(), 3); - // rnx.filter_mut(filter!("PR")); - // assert_eq!(rnx.observables().len(), 1); - //} - //#[test] - //fn v2_cari0010_07m_time_filter() { - // let mut rnx = Rinex::from_file("../test_resources/MET/V2/cari0010.07m").unwrap(); - // rnx.filter_mut(filter!(">= 2000-01-02T22:00:00UTC")); - //} + #[test] + #[ignore] + fn v2_cari0010_07m_phys_filter() { + let rnx = Rinex::from_file("../test_resources/MET/V2/cari0010.07m").unwrap(); + let rnx = rnx.filter(filter!("L1C")); + assert_eq!(rnx.observable().count(), 3); + let rnx = rnx.filter(filter!("TD")); + assert_eq!(rnx.observable().count(), 1); + } + #[test] + #[ignore] + fn v2_clar0020_00m_phys_filter() { + let mut rnx = Rinex::from_file("../test_resources/MET/V2/clar0020.00m").unwrap(); + rnx.filter_mut(filter!("L1C")); + assert_eq!(rnx.observable().count(), 3); + rnx.filter_mut(filter!("PR")); + assert_eq!(rnx.observable().count(), 1); + } + #[test] + #[ignore] + fn v2_cari0010_07m_time_filter() { + let mut rnx = Rinex::from_file("../test_resources/MET/V2/cari0010.07m").unwrap(); + rnx.filter_mut(filter!(">= 2000-01-02T22:00:00UTC")); + } } diff --git a/rinex/src/tests/merge.rs b/rinex/src/tests/merge.rs index ed0761830..592602eea 100644 --- a/rinex/src/tests/merge.rs +++ b/rinex/src/tests/merge.rs @@ -1,8 +1,16 @@ #[cfg(test)] mod test { use crate::prelude::*; + use crate::tests::toolkit::test_observation_rinex; use crate::Merge; + use crate::{ + //erratic_time_frame, + evenly_spaced_time_frame, + tests::toolkit::TestTimeFrame, + }; + //use itertools::Itertools; use std::path::PathBuf; + use std::str::FromStr; #[test] fn fail_on_type_mismatch() { let test_resources = PathBuf::new() @@ -21,10 +29,10 @@ mod test { .join("LARM0630.22O"); let mut r1 = Rinex::from_file(&path1.to_string_lossy()).unwrap(); let r2 = Rinex::from_file(&path2.to_string_lossy()).unwrap(); - assert_eq!(r1.merge_mut(&r2).is_err(), true) + assert!(r1.merge_mut(&r2).is_err()) } #[test] - fn merge() { + fn merge_nav() { let test_resources = PathBuf::new() .join(env!("CARGO_MANIFEST_DIR")) .join("..") @@ -53,7 +61,10 @@ mod test { let rnx_a = rnx_a.unwrap(); let rnx_b = rnx_b.unwrap(); let merged = rnx_a.merge(&rnx_b); - assert!(merged.is_ok(), "failed to merge NAV/V3/CBW100NLD_R_20210010000_01D_MN.rnx into NAV/V3/AMEL00NLD_R_20210010000_01D_MN.rnx"); + assert!( + merged.is_ok(), + "failed to merge NAV/V3/CBW100NLD_R_20210010000_01D_MN.rnx into NAV/V3/AMEL00NLD_R_20210010000_01D_MN.rnx" + ); // dump let merged = merged.unwrap(); @@ -61,11 +72,21 @@ mod test { merged.to_file("merge.txt").is_ok(), "failed to generate file previously merged" ); + assert!( + merged.is_merged(), + "is_merged() should be true after merging!" 
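For reference, the masking workflow mirrored outside the test harness looks like this; the filter! macro, the Preprocessing methods and the expected vehicle count are used exactly as in the tests above (FromStr is kept in scope because the test modules import it alongside the macro).

use rinex::filter;
use rinex::prelude::*;
use rinex::preprocessing::*;
use std::str::FromStr;

fn main() {
    // Retain two GPS vehicles only (returns a filtered copy)
    let rnx = Rinex::from_file("../test_resources/OBS/V3/DUTH0630.22O").unwrap();
    let rnx = rnx.filter(filter!("G01,G03"));
    assert_eq!(rnx.sv().count(), 2);

    // In-place variant, as used by the GNSS filter test
    let mut rnx = Rinex::from_file("../test_resources/OBS/V3/DUTH0630.22O").unwrap();
    rnx.filter_mut(filter!("GPS"));
    println!("{} GPS vehicles retained", rnx.sv().count());
}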
+ ); // parse back let rnx = Rinex::from_file("merge.txt"); assert!(rnx.is_ok(), "Failed to parsed back previously merged file"); + let rnx = rnx.unwrap(); + assert!( + rnx.is_merged(), + "failed to identify a merged file correctly" + ); + /* * Unlock reciprocity test in near future * NAV file production does not work correctly at the moment, @@ -73,6 +94,78 @@ mod test { */ // assert_eq!(rnx, merged, "Merge::ops reciprocity"); + // remove file we just generated + let _ = std::fs::remove_file("merge.txt"); + } + #[test] + #[ignore] + fn merge_obs() { + let test_resources = PathBuf::new() + .join(env!("CARGO_MANIFEST_DIR")) + .join("..") + .join("test_resources"); + let path1 = test_resources + .clone() + .join("OBS") + .join("V2") + .join("AJAC3550.21O"); + let rnx_a = Rinex::from_file(&path1.to_string_lossy()); + assert!(rnx_a.is_ok(), "failed to parse OBS/V2/AJAC3550.21O"); + let path2 = test_resources + .clone() + .join("OBS") + .join("V2") + .join("npaz3550.21o"); + let rnx_b = Rinex::from_file(&path2.to_string_lossy()); + assert!(rnx_b.is_ok(), "failed to parse OBS/V2/npaz3550.21o"); + + let rnx_a = rnx_a.unwrap(); + let rnx_b = rnx_b.unwrap(); + let merged = rnx_a.merge(&rnx_b); + assert!( + merged.is_ok(), + "failed to merge OBS/V2/npaz3550.21o into OBS/V2/AJAC3550.21O" + ); + + let merged = merged.unwrap(); + + test_observation_rinex( + &merged, + "2.11", + Some("MIXED"), + "GPS, GLO, GAL, EGNOS", + "G07, G08, G10, G15, G16, G18, G21, G23, G26, G32, R04, R05, R06, R10, R12, R19, R20, R21, E04, E11, E12, E19, E24, E25, E31, E33, S23, S36", + "L1, L2, C1, C2, P1, P2, D1, D2, S1, S2, L5, C5, D5, S5, L7, C7, D7, S7, L8, C8, D8, S8", + Some("2021-21-12T00:00:00 GPST"), + Some("2021-12-21T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2021-12-21T00:00:00 GPST", + "2021-12-21T01:04:00 GPST", + "30 s") + ); + + // dump + assert!( + merged.to_file("merge.txt").is_ok(), + "failed to generate file previously merged" + ); + assert!( + merged.is_merged(), + "is_merged() should be true after merging!" 
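Outside the test harness, the merge workflow exercised here reduces to the sketch below (trait, methods and file paths as used in this patch):

use rinex::prelude::*;
use rinex::Merge; // provides merge() / merge_mut()

fn main() {
    let rnx_a =
        Rinex::from_file("../test_resources/NAV/V3/AMEL00NLD_R_20210010000_01D_MN.rnx").unwrap();
    let rnx_b =
        Rinex::from_file("../test_resources/NAV/V3/CBW100NLD_R_20210010000_01D_MN.rnx").unwrap();

    let merged = rnx_a.merge(&rnx_b).expect("merge failed");
    assert!(merged.is_merged());

    // dump, parse back, and verify the merged marker survives the round trip
    merged.to_file("merge.txt").expect("failed to dump merged file");
    let parsed_back = Rinex::from_file("merge.txt").expect("failed to parse merged file");
    assert!(parsed_back.is_merged());

    let _ = std::fs::remove_file("merge.txt");
}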
+ ); + + // parse back + let rnx = Rinex::from_file("merge.txt"); + assert!(rnx.is_ok(), "Failed to parsed back previously merged file"); + + let rnx = rnx.unwrap(); + assert!( + rnx.is_merged(), + "failed to identify a merged file correctly" + ); + + assert_eq!(rnx, merged, "merge() reciprocity"); + // remove file we just generated let _ = std::fs::remove_file("merge.txt"); } diff --git a/rinex/src/tests/meteo.rs b/rinex/src/tests/meteo.rs index 060e21ec7..1c842a330 100644 --- a/rinex/src/tests/meteo.rs +++ b/rinex/src/tests/meteo.rs @@ -1,35 +1,102 @@ #[cfg(test)] mod test { use crate::prelude::*; + use crate::tests::toolkit::test_meteo_rinex; + use crate::{erratic_time_frame, evenly_spaced_time_frame, tests::toolkit::TestTimeFrame}; + use itertools::Itertools; use std::str::FromStr; #[test] fn v2_abvi0010_15m() { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/MET/V2/abvi0010.15m"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_meteo_rinex(), true); - assert_eq!(rinex.header.obs.is_none(), true); - assert_eq!(rinex.header.meteo.is_some(), true); - - let mut observables: Vec<_> = rinex.observable().collect(); - observables.sort(); // for comparison - - let mut expected: Vec<&Observable> = vec![ - &Observable::Temperature, - &Observable::Pressure, - &Observable::RainIncrement, - &Observable::HumidityRate, - &Observable::WindSpeed, - &Observable::WindDirection, - &Observable::HailIndicator, - ]; - expected.sort(); // for comparison - - assert!(observables == expected, "parsed wrong observable content"); + test_meteo_rinex( + &rinex, + "2.11", + "PR, TD, HR, WS, WD, RI, HI", + erratic_time_frame!( + " + 2015-01-01T00:00:00 UTC, + 2015-01-01T00:01:00 UTC, + 2015-01-01T00:02:00 UTC, + 2015-01-01T00:03:00 UTC, + 2015-01-01T00:04:00 UTC, + 2015-01-01T00:05:00 UTC, + 2015-01-01T00:06:00 UTC, + 2015-01-01T00:07:00 UTC, + 2015-01-01T00:08:00 UTC, + 2015-01-01T00:09:00 UTC, + 2015-01-01T09:00:00 UTC, + 2015-01-01T09:01:00 UTC, + 2015-01-01T09:02:00 UTC, + 2015-01-01T09:03:00 UTC, + 2015-01-01T09:04:00 UTC, + 2015-01-01T19:25:00 UTC, + 2015-01-01T19:26:00 UTC, + 2015-01-01T19:27:00 UTC, + 2015-01-01T19:28:00 UTC, + 2015-01-01T19:29:00 UTC, + 2015-01-01T19:30:00 UTC, + 2015-01-01T19:31:00 UTC, + 2015-01-01T19:32:00 UTC, + 2015-01-01T19:33:00 UTC, + 2015-01-01T19:34:00 UTC, + 2015-01-01T19:35:00 UTC, + 2015-01-01T19:36:00 UTC, + 2015-01-01T19:37:00 UTC, + 2015-01-01T19:38:00 UTC, + 2015-01-01T19:39:00 UTC, + 2015-01-01T19:40:00 UTC, + 2015-01-01T19:41:00 UTC, + 2015-01-01T19:42:00 UTC, + 2015-01-01T19:43:00 UTC, + 2015-01-01T19:44:00 UTC, + 2015-01-01T19:45:00 UTC, + 2015-01-01T19:46:00 UTC, + 2015-01-01T19:47:00 UTC, + 2015-01-01T19:48:00 UTC, + 2015-01-01T19:49:00 UTC, + 2015-01-01T19:50:00 UTC, + 2015-01-01T19:51:00 UTC, + 2015-01-01T19:52:00 UTC, + 2015-01-01T19:53:00 UTC, + 2015-01-01T19:54:00 UTC, + 2015-01-01T22:55:00 UTC, + 2015-01-01T22:56:00 UTC, + 2015-01-01T22:57:00 UTC, + 2015-01-01T22:58:00 UTC, + 2015-01-01T22:59:00 UTC, + 2015-01-01T23:01:00 UTC, + 2015-01-01T23:01:00 UTC, + 2015-01-01T23:02:00 UTC, + 2015-01-01T23:09:00 UTC, + 2015-01-01T23:10:00 UTC, + 2015-01-01T23:11:00 UTC, + 2015-01-01T23:12:00 UTC, + 2015-01-01T23:13:00 UTC, + 2015-01-01T23:14:00 UTC, + 2015-01-01T23:15:00 UTC, + 2015-01-01T23:16:00 UTC, + 2015-01-01T23:17:00 UTC, + 2015-01-01T23:18:00 UTC, + 2015-01-01T23:19:00 UTC, + 2015-01-01T23:20:00 UTC, + 2015-01-01T23:21:00 UTC, + 
2015-01-01T23:52:00 UTC, + 2015-01-01T23:53:00 UTC, + 2015-01-01T23:54:00 UTC, + 2015-01-01T23:55:00 UTC, + 2015-01-01T23:56:00 UTC, + 2015-01-01T23:57:00 UTC, + 2015-01-01T23:58:00 UTC, + 2015-01-01T23:59:00 UTC + " + ), + ); - let labels = vec![ + let labels = [ "pressure", "temp", "moisture", @@ -108,7 +175,7 @@ mod test { let content = epochs.get(index as usize); assert!(content.is_some(), "missing epoch {}", epoch); - let content = content.unwrap(); + //let content = content.unwrap(); for (field_index, expected_value) in expected_values.iter().enumerate() { let label = labels[field_index]; let value = record_values[field_index].get(index as usize); @@ -151,24 +218,24 @@ mod test { 0.0, "Error: it did not rain on that day" ); - assert_eq!( - rinex.hail_detected(), - false, - "Error: it did not hail on that day" - ); + assert!(!rinex.hail_detected(), "Error: it did not hail on that day"); } #[test] fn v4_example1() { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/MET/V4/example1.txt"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_meteo_rinex(), true); - assert_eq!(rinex.header.obs.is_none(), true); - assert_eq!(rinex.header.meteo.is_some(), true); + test_meteo_rinex( + &rinex, + "4.00", + "PR, TD, HR", + evenly_spaced_time_frame!("2021-01-07T00:00:00 UTC", "2021-01-07T00:02:00 UTC", "30 s"), + ); + let record = rinex.record.as_meteo(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); assert_eq!(record.len(), 5); diff --git a/rinex/src/tests/mod.rs b/rinex/src/tests/mod.rs index dacb79326..c97b7ab9b 100644 --- a/rinex/src/tests/mod.rs +++ b/rinex/src/tests/mod.rs @@ -5,15 +5,17 @@ mod antex; mod clocks; mod compression; mod decompression; -mod ionex; +mod masking; mod merge; - -#[cfg(feature = "meteo")] -mod meteo; - mod nav; mod obs; mod parsing; mod production; mod sampling; mod smoothing; + +#[cfg(feature = "meteo")] +mod meteo; + +#[cfg(feature = "ionex")] +mod ionex; diff --git a/rinex/src/tests/nav.rs b/rinex/src/tests/nav.rs index 3ac205f93..24a6020f4 100644 --- a/rinex/src/tests/nav.rs +++ b/rinex/src/tests/nav.rs @@ -4,25 +4,25 @@ mod test { use crate::prelude::*; use crate::sv; use itertools::*; + use std::path::Path; use std::path::PathBuf; use std::str::FromStr; #[test] + #[cfg(feature = "nav")] fn v2_amel0010_21g() { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V2/amel0010.21g"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_navigation_rinex(), true); - assert_eq!(rinex.header.obs.is_none(), true); - assert_eq!(rinex.header.meteo.is_none(), true); let record = rinex.record.as_nav(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); assert_eq!(record.len(), 4); // Test: parsed correct amount of entries assert_eq!(rinex.navigation().count(), 4); + // Test: only Ephemeris in this record assert_eq!(rinex.ephemeris().count(), 6); // Test: only Legacy Ephemeris frames in this record @@ -245,54 +245,52 @@ mod test { assert_eq!(ephemeris.get_week(), Some(2138)); } - } else if *e == Epoch::from_str("2021-01-02T00:00:00").unwrap() { - if sv.prn == 30 { - assert_eq!( - ephemeris.sv_clock(), - (-3.621461801230E-04, -6.139089236970E-12, 0.000000000000), - "parsed wrong clock data" - ); + } else if *e == 
Epoch::from_str("2021-01-02T00:00:00").unwrap() && sv.prn == 30 { + assert_eq!( + ephemeris.sv_clock(), + (-3.621461801230E-04, -6.139089236970E-12, 0.000000000000), + "parsed wrong clock data" + ); - for (field, data) in vec![ - ("iode", Some(8.500000000000E1)), - ("crs", Some(-7.500000000000)), - ("deltaN", Some(5.476656696160E-9)), - ("m0", Some(-1.649762378650)), - ("cuc", Some(-6.072223186490E-7)), - ("e", Some(4.747916595080E-3)), - ("cus", Some(5.392357707020E-6)), - ("sqrta", Some(5.153756387710E+3)), - ("toe", Some(5.184000000000E+5)), - ("cic", Some(7.636845111850E-8)), - ("omega0", Some(2.352085289360E+00)), - ("cis", Some(-2.421438694000E-8)), - ("i0", Some(9.371909002540E-1)), - ("crc", Some(2.614687500000E+2)), - ("omega", Some(-2.846234079630)), - ("omegaDot", Some(-8.435351366240E-9)), - ("idot", Some(-7.000291590240E-11)), - ("l2Codes", Some(1.000000000000)), - ("l2pDataFlag", Some(0.0)), - ("svAccuracy", Some(0.0)), - ("tgd", Some(3.725290298460E-9)), - ("iodc", Some(8.500000000000E1)), - ("t_tm", Some(5.146680000000E5)), - ] { - let value = ephemeris.get_orbit_f64(field); - assert!(value.is_some(), "missing orbit filed \"{}\"", field); - assert_eq!( - value, data, - "parsed wrong \"{}\" value, expecting {:?} got {:?}", - field, data, value - ); - } - assert!( - ephemeris.get_orbit_f64("fitInt").is_none(), - "parsed fitInt unexpectedly" + for (field, data) in vec![ + ("iode", Some(8.500000000000E1)), + ("crs", Some(-7.500000000000)), + ("deltaN", Some(5.476656696160E-9)), + ("m0", Some(-1.649762378650)), + ("cuc", Some(-6.072223186490E-7)), + ("e", Some(4.747916595080E-3)), + ("cus", Some(5.392357707020E-6)), + ("sqrta", Some(5.153756387710E+3)), + ("toe", Some(5.184000000000E+5)), + ("cic", Some(7.636845111850E-8)), + ("omega0", Some(2.352085289360E+00)), + ("cis", Some(-2.421438694000E-8)), + ("i0", Some(9.371909002540E-1)), + ("crc", Some(2.614687500000E+2)), + ("omega", Some(-2.846234079630)), + ("omegaDot", Some(-8.435351366240E-9)), + ("idot", Some(-7.000291590240E-11)), + ("l2Codes", Some(1.000000000000)), + ("l2pDataFlag", Some(0.0)), + ("svAccuracy", Some(0.0)), + ("tgd", Some(3.725290298460E-9)), + ("iodc", Some(8.500000000000E1)), + ("t_tm", Some(5.146680000000E5)), + ] { + let value = ephemeris.get_orbit_f64(field); + assert!(value.is_some(), "missing orbit filed \"{}\"", field); + assert_eq!( + value, data, + "parsed wrong \"{}\" value, expecting {:?} got {:?}", + field, data, value ); - - assert_eq!(ephemeris.get_week(), Some(2138)); } + assert!( + ephemeris.get_orbit_f64("fitInt").is_none(), + "parsed fitInt unexpectedly" + ); + + assert_eq!(ephemeris.get_week(), Some(2138)); } } } @@ -302,15 +300,15 @@ mod test { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V3/AMEL00NLD_R_20210010000_01D_MN.rnx"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_navigation_rinex(), true); - assert_eq!(rinex.header.obs.is_none(), true); - assert_eq!(rinex.header.meteo.is_none(), true); + assert!(rinex.is_navigation_rinex()); + assert!(rinex.header.obs.is_none()); + assert!(rinex.header.meteo.is_none()); let record = rinex.record.as_nav(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); assert_eq!(record.len(), 6); @@ -472,7 +470,7 @@ mod test { }, _ => panic!("identified unexpected GAL vehicle \"{}\"", sv.prn), }, - _ => panic!("falsely identified \"{}\"", sv.to_string()), + _ => 
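A minimal sketch of ephemeris iteration with the accessors used in these assertions (requires the crate's nav feature, as the surrounding test attributes suggest; signatures as they appear in this patch):

use rinex::prelude::*;

fn main() {
    let rinex =
        Rinex::from_file("../test_resources/NAV/V3/AMEL00NLD_R_20210010000_01D_MN.rnx").unwrap();

    for (epoch, (_msg, sv, ephemeris)) in rinex.ephemeris() {
        // clock terms come as a (bias, drift, drift rate) triplet
        let (bias, drift, drift_rate) = ephemeris.sv_clock();
        println!("{} {}: clock {:e} {:e} {:e}", epoch, sv, bias, drift, drift_rate);

        // orbit fields are looked up by name through the NAV dictionary
        if let Some(crs) = ephemeris.get_orbit_f64("crs") {
            println!("  crs  = {:e}", crs);
        }
        if let Some(week) = ephemeris.get_week() {
            println!("  week = {}", week);
        }
    }
}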
panic!("falsely identified \"{}\"", sv), } } //match sv.constellation } @@ -483,14 +481,14 @@ mod test { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V4/KMS300DNK_R_20221591000_01H_MN.rnx.gz"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_navigation_rinex(), true); - assert_eq!(rinex.header.obs.is_none(), true); - assert_eq!(rinex.header.meteo.is_none(), true); + assert!(rinex.is_navigation_rinex()); + assert!(rinex.header.obs.is_none()); + assert!(rinex.header.meteo.is_none()); let record = rinex.record.as_nav(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); // test first epoch @@ -876,7 +874,7 @@ mod test { } else { panic!("got unexpected system time \"{}\"", sto.system) } - } else if let Some(fr) = fr.as_eop() { + } else if let Some(_fr) = fr.as_eop() { eop_count += 1; // EOP test //TODO // we do not have EOP frame examples at the moment @@ -945,16 +943,16 @@ mod test { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V3/BRDC00GOP_R_20210010000_01D_MN.rnx.gz"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.is_navigation_rinex(), true); - assert_eq!(rinex.header.obs.is_none(), true); - assert_eq!(rinex.header.meteo.is_none(), true); + assert!(rinex.is_navigation_rinex()); + assert!(rinex.header.obs.is_none()); + assert!(rinex.header.meteo.is_none()); assert_eq!(rinex.epoch().count(), 4); let record = rinex.record.as_nav(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); let mut epochs: Vec = vec![ @@ -985,24 +983,25 @@ mod test { for fr in frames { let fr = fr.as_eph(); assert!(fr.is_some(), "only ephemeris frames expected here"); - let (msg, sv, data) = fr.unwrap(); + let (msg, _sv, _data) = fr.unwrap(); assert!(msg == NavMsgType::LNAV, "only lnav frame expected here"); } } } #[test] + #[cfg(feature = "nav")] #[cfg(feature = "flate2")] fn v4_nav_messages() { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V4/KMS300DNK_R_20221591000_01H_MN.rnx.gz"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - for (epoch, (msg, sv, ephemeris)) in rinex.ephemeris() { + for (_epoch, (msg, sv, _ephemeris)) in rinex.ephemeris() { match sv.constellation { Constellation::GPS | Constellation::QZSS => { - let expected = vec![NavMsgType::LNAV, NavMsgType::CNAV, NavMsgType::CNV2]; + let expected = [NavMsgType::LNAV, NavMsgType::CNAV, NavMsgType::CNV2]; assert!( expected.contains(&msg), "parsed invalid GPS/QZSS V4 message \"{}\"", @@ -1010,7 +1009,7 @@ mod test { ); }, Constellation::Galileo => { - let expected = vec![NavMsgType::FNAV, NavMsgType::INAV]; + let expected = [NavMsgType::FNAV, NavMsgType::INAV]; assert!( expected.contains(&msg), "parsed invalid Galileo V4 message \"{}\"", @@ -1018,7 +1017,7 @@ mod test { ); }, Constellation::BeiDou => { - let expected = vec![ + let expected = [ NavMsgType::D1, NavMsgType::D2, NavMsgType::CNV1, @@ -1039,7 +1038,7 @@ mod test { msg ); }, - Constellation::Geo => { + Constellation::SBAS => { assert_eq!( msg, NavMsgType::SBAS, @@ -1052,6 +1051,7 @@ mod test { } } #[test] + #[cfg(feature = "nav")] #[cfg(feature = "flate2")] fn v4_brd400dlr_s2023() { let path = 
PathBuf::new() @@ -1226,31 +1226,31 @@ mod test { } } else if *sv == sv!("G21") { assert_eq!(msg, NavMsgType::CNVX); - } else if *sv == sv!("J04") { - if *epoch == Epoch::from_str("2023-03-12T02:01:54 UTC").unwrap() { - let kb = iondata.as_klobuchar(); - assert!(kb.is_some()); - let kb = kb.unwrap(); - assert_eq!( - kb.alpha, - ( - 3.259629011154e-08, - -1.490116119385e-08, - -4.172325134277e-07, - -1.788139343262e-07 - ) - ); - assert_eq!( - kb.beta, - ( - 1.269760000000e+05, - -1.474560000000e+05, - 1.310720000000e+05, - 2.490368000000e+06 - ) - ); - assert_eq!(kb.region, KbRegionCode::WideArea); - } + } else if *sv == sv!("J04") + && *epoch == Epoch::from_str("2023-03-12T02:01:54 UTC").unwrap() + { + let kb = iondata.as_klobuchar(); + assert!(kb.is_some()); + let kb = kb.unwrap(); + assert_eq!( + kb.alpha, + ( + 3.259629011154e-08, + -1.490116119385e-08, + -4.172325134277e-07, + -1.788139343262e-07 + ) + ); + assert_eq!( + kb.beta, + ( + 1.269760000000e+05, + -1.474560000000e+05, + 1.310720000000e+05, + 2.490368000000e+06 + ) + ); + assert_eq!(kb.region, KbRegionCode::WideArea); } } for (epoch, (msg, sv, eop)) in rinex.earth_orientation() { @@ -1292,8 +1292,8 @@ mod test { } } #[test] - #[cfg(feature = "flate2")] #[cfg(feature = "nav")] + #[cfg(feature = "flate2")] #[ignore] fn sv_interp() { let path = PathBuf::new() @@ -1315,7 +1315,7 @@ mod test { let dt = rinex.dominant_sample_rate().unwrap(); let total_epochs = rinex.epoch().count(); - for (order, max_error) in vec![(7, 1E-1_f64), (9, 1.0E-2_64), (11, 0.5E-3_f64)] { + for (order, max_error) in [(7, 1E-1_f64), (9, 1.0E-2_64), (11, 0.5E-3_f64)] { let tmin = first_epoch + (order / 2) * dt; let tmax = last_epoch - (order / 2) * dt; println!("running Interp({}) testbench..", order); @@ -1324,20 +1324,23 @@ mod test { let interpolated = rinex.sv_position_interpolate(sv, epoch, order as usize); let achieved = interpolated.is_some(); //DEBUG - // println!("tmin: {} | tmax: {} | epoch: {} | feasible : {} | achieved: {}", tmin, tmax, epoch, feasible, achieved); - //if feasible { - // assert!( - // achieved == feasible, - // "interpolation should have been feasible @ epoch {}", - // epoch, - // ); - //} else { - // assert!( - // achieved == feasible, - // "interpolation should not have been feasible @ epoch {}", - // epoch, - // ); - //} + println!( + "tmin: {} | tmax: {} | epoch: {} | feasible : {} | achieved: {}", + tmin, tmax, epoch, feasible, achieved + ); + if feasible { + assert!( + achieved == feasible, + "interpolation should have been feasible @ epoch {}", + epoch, + ); + } else { + assert!( + achieved == feasible, + "interpolation should not have been feasible @ epoch {}", + epoch, + ); + } if !feasible { continue; } @@ -1384,4 +1387,57 @@ mod test { } } } + #[test] + #[cfg(feature = "nav")] + fn sv_toe_ephemeris() { + let path = Path::new(env!("CARGO_MANIFEST_DIR")) + .join("..") + .join("test_resources") + .join("NAV") + .join("V3") + .join("AMEL00NLD_R_20210010000_01D_MN.rnx"); + let rinex = Rinex::from_file(&path.to_string_lossy().to_string()); + assert!(rinex.is_ok()); + let rinex = rinex.unwrap(); + for (toc, (_, sv, ephemeris)) in rinex.ephemeris() { + let e0 = Epoch::from_str("2021-01-01T00:00:00 BDT").unwrap(); + let e1 = Epoch::from_str("2021-01-01T05:00:00 BDT").unwrap(); + let e2 = Epoch::from_str("2021-01-01T10:10:00 GST").unwrap(); + let e3 = Epoch::from_str("2021-01-01T15:40:00 GST").unwrap(); + + let ts = sv.timescale(); + assert!(ts.is_some(), "timescale should be determined"); + let ts = ts.unwrap(); + + if let 
Some(toe) = ephemeris.toe(ts) { + let mut expected_sv = Sv::default(); + let mut expected_toe = Epoch::default(); + if *toc == e0 { + expected_toe = Epoch::from_str("2021-01-01T00:00:33 BDT").unwrap(); + expected_sv = sv!("C05"); + } else if *toc == e1 { + expected_toe = Epoch::from_str("2021-01-01T05:00:33 BDT").unwrap(); + expected_sv = sv!("C21"); + } else if *toc == e2 { + expected_toe = Epoch::from_str("2021-01-01T10:10:19 GST").unwrap(); + expected_sv = sv!("E01"); + } else if *toc == e3 { + expected_toe = Epoch::from_str("2021-01-01T15:40:19 GST").unwrap(); + expected_sv = sv!("E03"); + } else { + panic!("unhandled toc {}", toc); + } + assert_eq!(*sv, expected_sv, "wrong sv"); + assert_eq!(toe, expected_toe, "wrong toe evaluated"); + /* + * Rinex.sv_ephemeris(@ toe) should return exact ephemeris + */ + assert_eq!( + rinex.sv_ephemeris(expected_sv, toe), + Some((expected_toe, ephemeris)), + "sv_ephemeris(sv,t) @ toe should strictly identical ephemeris" + ); + } + } + } } diff --git a/rinex/src/tests/obs.rs b/rinex/src/tests/obs.rs index 3377d277e..e37760e7e 100644 --- a/rinex/src/tests/obs.rs +++ b/rinex/src/tests/obs.rs @@ -2,83 +2,12 @@ mod test { use crate::observable; use crate::sv; + use crate::tests::toolkit::test_observation_rinex; + use crate::{erratic_time_frame, evenly_spaced_time_frame, tests::toolkit::TestTimeFrame}; use crate::{header::*, observation::*, prelude::*}; + use itertools::Itertools; use std::path::Path; use std::str::FromStr; - /* - * Helper: to create a list of observable - */ - fn create_observ_list(descriptors: Vec<&str>) -> Vec { - let mut r: Vec = vec![]; - for desc in descriptors { - if desc.starts_with("L") { - let obs = Observable::Phase(String::from(desc)); - r.push(obs.clone()); - } else if desc.starts_with("P") { - let obs = Observable::PseudoRange(String::from(desc)); - r.push(obs.clone()); - } else if desc.starts_with("C") { - let obs = Observable::PseudoRange(String::from(desc)); - r.push(obs.clone()); - } else if desc.starts_with("S") { - let obs = Observable::SSI(String::from(desc)); - r.push(obs.clone()); - } - } - r.sort(); // for comparison purposes - r - } - /* - * General testbench - * shared accross all Observation files - */ - fn testbench( - rnx: &Rinex, - _major: u8, - _minor: u8, - c: Constellation, - epochs: Vec, - observables: Vec, - ) { - // must have dedicated fields - assert!(rnx.header.obs.is_some()); - /* - * Test epoch parsing and identification - */ - assert!(rnx.epoch().eq(epochs), "parsed wrong epoch content"); - - let mut parsed_observables: Vec = rnx.observable().cloned().collect(); - parsed_observables.sort(); - - assert!( - observables == parsed_observables, - "parsed wrong observable content,expecting\n{:?}\ngot\n{:?}", - observables, - parsed_observables - ); - - /* - * Test Record content - */ - let record = rnx.record.as_obs(); - assert!(record.is_some()); - let record = record.unwrap(); - assert!(record.len() > 0); - for ((_, _), (clk_offset, vehicles)) in record { - /* - * We don't have any files with clock offsets as of today - */ - assert!(clk_offset.is_none()); - /* - * test GNSS identification - */ - if c != Constellation::Mixed { - for (sv, _) in vehicles { - assert_eq!(sv.constellation, c); - } - } - } - } #[test] fn v2_aopr0010_17o() { let path = Path::new(env!("CARGO_MANIFEST_DIR")) @@ -88,19 +17,27 @@ mod test { .join("V2") .join("aopr0010.17o"); let fullpath = path.to_string_lossy(); - let rinex = Rinex::from_file(&fullpath.to_string()); - assert_eq!(rinex.is_ok(), true); + let rinex = 
Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - let epochs: Vec = vec![ - Epoch::from_str("2017-01-01T00:00:00 GPST").unwrap(), - Epoch::from_str("2017-01-01T03:33:40 GPST").unwrap(), - Epoch::from_str("2017-01-01T06:09:10 GPST").unwrap(), - ]; - - let observables = create_observ_list(vec!["L1", "L2", "P1", "P2", "C1"]); + test_observation_rinex( + &rinex, + "2.10", + Some("GPS"), + "GPS", + "G31,G27,G03,G32,G16,G14,G08,G23,G22,G07, G30, G11, G19, G07", + "C1, L1, L2, P2, P1", + Some("2017-01-01T00:00:00 GPST"), + None, + erratic_time_frame!( + "2017-01-01T00:00:00 GPST, + 2017-01-01T03:33:40 GPST, + 2017-01-01T06:09:10 GPST" + ), + ); - testbench(&rinex, 2, 11, Constellation::GPS, epochs, observables); + //testbench(&rinex, 2, 11, Constellation::GPS, epochs, observables); let record = rinex.record.as_obs().unwrap(); for (index, (_e, (_, vehicles))) in record.iter().enumerate() { @@ -205,55 +142,60 @@ mod test { .join("V2") .join("npaz3550.21o"); let fullpath = path.to_string_lossy(); - let rinex = Rinex::from_file(&fullpath.to_string()); - assert_eq!(rinex.is_ok(), true); + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - //testbench(&rinex, 2, 11, Constellation::Mixed, epochs); - - let obs_hd = rinex.header.obs.as_ref().unwrap(); - let record = rinex.record.as_obs(); - assert_eq!(record.is_some(), true); - let record = record.unwrap(); - ////////////////////////////// - // This file is GPS + GLONASS - ////////////////////////////// - let obscodes = obs_hd.codes.get(&Constellation::GPS); - assert_eq!(obscodes.is_some(), true); - let obscodes = obscodes.unwrap(); - assert_eq!( - obscodes, - &vec![ - Observable::from_str("C1").unwrap(), - Observable::from_str("L1").unwrap(), - Observable::from_str("L2").unwrap(), - Observable::from_str("P2").unwrap(), - Observable::from_str("S1").unwrap(), - Observable::from_str("S2").unwrap(), - ] - ); - let obscodes = obs_hd.codes.get(&Constellation::Glonass); - assert_eq!(obscodes.is_some(), true); - let obscodes = obscodes.unwrap(); - assert_eq!( - obscodes, - &vec![ - Observable::from_str("C1").unwrap(), - Observable::from_str("L1").unwrap(), - Observable::from_str("L2").unwrap(), - Observable::from_str("P2").unwrap(), - Observable::from_str("S1").unwrap(), - Observable::from_str("S2").unwrap(), - ] + test_observation_rinex( + &rinex, + "2.11", + Some("MIXED"), + "GPS, GLO", + "G08,G10,G15,G16,G18,G21,G23,G26,G32,R04,R05,R06,R10,R12,R19,R20,R21", + "C1, L1, L2, P2, S1, S2", + Some("2021-12-21T00:00:00 GPST"), + Some("2021-12-21T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2021-12-21T00:00:00 GPST", + "2021-12-21T01:04:00 GPST", + "30 s" + ), ); + //let obscodes = obs_hd.codes.get(&Constellation::GPS); + //assert_eq!( + // obscodes, + // &vec![ + // Observable::from_str("C1").unwrap(), + // Observable::from_str("L1").unwrap(), + // Observable::from_str("L2").unwrap(), + // Observable::from_str("P2").unwrap(), + // Observable::from_str("S1").unwrap(), + // Observable::from_str("S2").unwrap(), + // ] + //); + //let obscodes = obs_hd.codes.get(&Constellation::Glonass); + //assert_eq!( + // obscodes, + // &vec![ + // Observable::from_str("C1").unwrap(), + // Observable::from_str("L1").unwrap(), + // Observable::from_str("L2").unwrap(), + // Observable::from_str("P2").unwrap(), + // Observable::from_str("S1").unwrap(), + // Observable::from_str("S2").unwrap(), + // ] + //); + + let record = rinex.record.as_obs().unwrap(); + // test epoch [1] let epoch = 
Epoch::from_str("2021-12-21T00:00:00 GPST").unwrap(); let flag = EpochFlag::Ok; let epoch = record.get(&(epoch, flag)); - assert_eq!(epoch.is_some(), true); + assert!(epoch.is_some()); let (clk_offset, epoch) = epoch.unwrap(); - assert_eq!(clk_offset.is_none(), true); + assert!(clk_offset.is_none()); assert_eq!(epoch.len(), 17); // G08 @@ -262,47 +204,47 @@ mod test { prn: 08, }; let observations = epoch.get(&sv); - assert_eq!(observations.is_some(), true); + assert!(observations.is_some()); let observations = observations.unwrap(); // C1 let observed = observations.get(&Observable::from_str("C1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 22288985.512); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); // L1 let observed = observations.get(&Observable::from_str("L1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 117129399.048); assert_eq!(observed.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(observed.snr, Some(Snr::DbHz36_41)); // L2 let observed = observations.get(&Observable::from_str("L2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 91269672.416); assert_eq!(observed.lli, Some(LliFlags::UNDER_ANTI_SPOOFING)); assert_eq!(observed.snr, Some(Snr::DbHz36_41)); // P2 let observed = observations.get(&Observable::from_str("P2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 22288987.972); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); // S1 let observed = observations.get(&Observable::from_str("S1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 44.000); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); // S2 let observed = observations.get(&Observable::from_str("S2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 27.000); assert_eq!(observed.lli, None); @@ -314,39 +256,39 @@ mod test { prn: 19, }; let observations = epoch.get(&sv); - assert_eq!(observations.is_some(), true); + assert!(observations.is_some()); let observations = observations.unwrap(); // C1 let observed = observations.get(&Observable::from_str("C1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 23250776.648); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); // L1 let observed = observations.get(&Observable::from_str("L1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 124375967.254); assert_eq!(observed.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(observed.snr, Some(Snr::DbHz0)); // L2 let observed = observations.get(&Observable::from_str("L2").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); // P2 let observed = observations.get(&Observable::from_str("P2").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); // S1 let observed = observations.get(&Observable::from_str("S1").unwrap()); - assert_eq!(observed.is_some(), true); + 
assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 32.000); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); // S2 let observed = observations.get(&Observable::from_str("S2").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); } #[test] fn v2_rovn0010_21o() { @@ -357,16 +299,69 @@ mod test { .join("V2") .join("rovn0010.21o"); let fullpath = path.to_string_lossy(); - let rinex = Rinex::from_file(&fullpath.to_string()); - assert_eq!(rinex.is_ok(), true); + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); + + test_observation_rinex(&rinex, "2.11", Some("MIXED"), "GPS, GLO", + "G07, G08, G10, G13, G15, G16, G18, G21, G23, G26, G27, G30, R01, R02, R03, R08, R09, R15, R16, R17, R18, R19, R24", "C1, C2, C5, L1, L2, L5, P1, P2, S1, S2, S5", Some("2021-01-01T00:00:00 GPST"), Some("2021-01-01T23:59:30 GPST"), + erratic_time_frame!(" + 2021-01-01T00:00:00 GPST, + 2021-01-01T00:00:30 GPST, + 2021-01-01T01:10:00 GPST, + 2021-01-01T02:25:00 GPST, + 2021-01-01T02:25:30 GPST, + 2021-01-01T02:26:00 GPST + ") + ); + + ////////////////////////////// + // This file is GPS + GLONASS + ////////////////////////////// + //let obscodes = obs_hd.codes.get(&Constellation::GPS); + //assert_eq!(obscodes.is_some(), true); + //let obscodes = obscodes.unwrap(); + //assert_eq!( + // obscodes, + // &vec![ + // Observable::from_str("C1").unwrap(), + // Observable::from_str("C2").unwrap(), + // Observable::from_str("C5").unwrap(), + // Observable::from_str("L1").unwrap(), + // Observable::from_str("L2").unwrap(), + // Observable::from_str("L5").unwrap(), + // Observable::from_str("P1").unwrap(), + // Observable::from_str("P2").unwrap(), + // Observable::from_str("S1").unwrap(), + // Observable::from_str("S2").unwrap(), + // Observable::from_str("S5").unwrap(), + // ] + //); + + //let obscodes = obs_hd.codes.get(&Constellation::Glonass); + //assert_eq!(obscodes.is_some(), true); + //let obscodes = obscodes.unwrap(); + //assert_eq!( + // obscodes, + // &vec![ + // Observable::from_str("C1").unwrap(), + // Observable::from_str("C2").unwrap(), + // Observable::from_str("C5").unwrap(), + // Observable::from_str("L1").unwrap(), + // Observable::from_str("L2").unwrap(), + // Observable::from_str("L5").unwrap(), + // Observable::from_str("P1").unwrap(), + // Observable::from_str("P2").unwrap(), + // Observable::from_str("S1").unwrap(), + // Observable::from_str("S2").unwrap(), + // Observable::from_str("S5").unwrap(), + // ] + //); + /* * Header tb */ let header = &rinex.header; - assert!(rinex.is_observation_rinex()); - assert!(header.obs.is_some()); - assert!(header.meteo.is_none()); assert_eq!( header.ground_position, Some(GroundPosition::from_ecef_wgs84(( @@ -379,62 +374,16 @@ mod test { assert_eq!(header.observer, "Hans van der Marel"); assert_eq!(header.agency, "TU Delft for Deltares"); - let obs_hd = header.obs.as_ref(); - assert!(obs_hd.is_some()); - let obs_hd = obs_hd.unwrap(); - let record = rinex.record.as_obs(); assert!(record.is_some()); let record = record.unwrap(); - ////////////////////////////// - // This file is GPS + GLONASS - ////////////////////////////// - let obscodes = obs_hd.codes.get(&Constellation::GPS); - assert_eq!(obscodes.is_some(), true); - let obscodes = obscodes.unwrap(); - assert_eq!( - obscodes, - &vec![ - Observable::from_str("C1").unwrap(), - Observable::from_str("C2").unwrap(), - Observable::from_str("C5").unwrap(), - Observable::from_str("L1").unwrap(), - 
Observable::from_str("L2").unwrap(), - Observable::from_str("L5").unwrap(), - Observable::from_str("P1").unwrap(), - Observable::from_str("P2").unwrap(), - Observable::from_str("S1").unwrap(), - Observable::from_str("S2").unwrap(), - Observable::from_str("S5").unwrap(), - ] - ); - - let obscodes = obs_hd.codes.get(&Constellation::Glonass); - assert_eq!(obscodes.is_some(), true); - let obscodes = obscodes.unwrap(); - assert_eq!( - obscodes, - &vec![ - Observable::from_str("C1").unwrap(), - Observable::from_str("C2").unwrap(), - Observable::from_str("C5").unwrap(), - Observable::from_str("L1").unwrap(), - Observable::from_str("L2").unwrap(), - Observable::from_str("L5").unwrap(), - Observable::from_str("P1").unwrap(), - Observable::from_str("P2").unwrap(), - Observable::from_str("S1").unwrap(), - Observable::from_str("S2").unwrap(), - Observable::from_str("S5").unwrap(), - ] - ); // test epoch [1] let epoch = Epoch::from_str("2021-01-01T00:00:00 GPST").unwrap(); let epoch = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(epoch.is_some(), true); + assert!(epoch.is_some()); let (clk_offset, epoch) = epoch.unwrap(); - assert_eq!(clk_offset.is_none(), true); + assert!(clk_offset.is_none()); assert_eq!(epoch.len(), 24); // G07 @@ -443,74 +392,74 @@ mod test { prn: 07, }; let observations = epoch.get(&sv); - assert_eq!(observations.is_some(), true); + assert!(observations.is_some()); let observations = observations.unwrap(); // C1 let observed = observations.get(&Observable::from_str("C1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 24225566.040); assert_eq!(observed.lli, None); assert_eq!(observed.snr, Some(Snr::DbHz36_41)); //C2 let observed = observations.get(&Observable::from_str("C2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 24225562.932); assert_eq!(observed.lli, None); assert_eq!(observed.snr, Some(Snr::from_str("6").unwrap())); //C5 [missing] let observed = observations.get(&Observable::from_str("C5").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); //L1 let observed = observations.get(&Observable::from_str("L1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 127306204.852); assert_eq!(observed.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(observed.snr, Some(Snr::from_str("6").unwrap())); //L2 let observed = observations.get(&Observable::from_str("L2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 99199629.819); assert_eq!(observed.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(observed.snr, Some(Snr::from_str("4").unwrap())); //L5 [missing] let observed = observations.get(&Observable::from_str("L5").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); //P1 let observed = observations.get(&Observable::from_str("P1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 24225565.620); assert_eq!(observed.lli, None); assert_eq!(observed.snr, Some(Snr::from_str("4").unwrap())); //P2 let observed = observations.get(&Observable::from_str("P2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); 
assert_eq!(observed.obs, 24225563.191); assert_eq!(observed.lli, None); assert_eq!(observed.snr, Some(Snr::from_str("4").unwrap())); //S1 let observed = observations.get(&Observable::from_str("S1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 40.586); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); //S2 let observed = observations.get(&Observable::from_str("S2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 25.564); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); //S5 (missing) let observed = observations.get(&Observable::from_str("S5").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); // G07 let sv = Sv { @@ -518,59 +467,59 @@ mod test { prn: 24, }; let observations = epoch.get(&sv); - assert_eq!(observations.is_some(), true); + assert!(observations.is_some()); let observations = observations.unwrap(); //C1,C2,C5 let observed = observations.get(&Observable::from_str("C1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 23126824.976); assert_eq!(observed.lli, None); assert_eq!(observed.snr, Some(Snr::from_str("6").unwrap())); let observed = observations.get(&Observable::from_str("C2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 23126830.088); assert_eq!(observed.lli, None); assert_eq!(observed.snr, Some(Snr::from_str("6").unwrap())); let observed = observations.get(&Observable::from_str("C5").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); //L1,L2,L5 let observed = observations.get(&Observable::from_str("L1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 123669526.377); assert_eq!(observed.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(observed.snr, Some(Snr::from_str("6").unwrap())); let observed = observations.get(&Observable::from_str("L2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); //assert_eq!(observed.obs, 96187435.849); assert_eq!(observed.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(observed.snr, Some(Snr::from_str("6").unwrap())); let observed = observations.get(&Observable::from_str("L5").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); //P1, P2 let observed = observations.get(&Observable::from_str("P1").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); let observed = observations.get(&Observable::from_str("P2").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); //S1,S2,S5 let observed = observations.get(&Observable::from_str("S1").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 41.931); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); let observed = observations.get(&Observable::from_str("S2").unwrap()); - assert_eq!(observed.is_some(), true); + assert!(observed.is_some()); let observed = observed.unwrap(); assert_eq!(observed.obs, 39.856); assert_eq!(observed.lli, None); assert_eq!(observed.snr, None); let observed = 
observations.get(&Observable::from_str("S5").unwrap()); - assert_eq!(observed.is_none(), true); + assert!(observed.is_none()); } #[test] fn v3_duth0630() { @@ -581,67 +530,33 @@ mod test { .join("V3") .join("DUTH0630.22O"); let fullpath = path.to_string_lossy(); - let rinex = Rinex::from_file(&fullpath.to_string()); - assert_eq!(rinex.is_ok(), true); + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); - assert_eq!(rinex.header.obs.is_some(), true); - let obs = rinex.header.obs.as_ref().unwrap(); - - /* - * test Glonass observables - */ - let observables = obs.codes.get(&Constellation::Glonass); - assert_eq!(observables.is_some(), true); - let mut observables = observables.unwrap().clone(); - observables.sort(); - - let mut expected: Vec = "C1C L1C D1C S1C C2P L2P D2P S2P" - .split_ascii_whitespace() - .map(|k| Observable::from_str(k).unwrap()) - .collect(); - expected.sort(); - assert_eq!(observables, expected); - - /* - * test GPS observables - */ - let observables = obs.codes.get(&Constellation::GPS); - assert_eq!(observables.is_some(), true); - let mut observables = observables.unwrap().clone(); - observables.sort(); - - let mut expected: Vec = "C1C L1C D1C S1C C2W L2W D2W S2W" - .split_ascii_whitespace() - .map(|k| Observable::from_str(k).unwrap()) - .collect(); - expected.sort(); - assert_eq!(observables, expected); - let record = rinex.record.as_obs(); - assert_eq!(record.is_some(), true); - let record = record.unwrap(); + test_observation_rinex( + &rinex, + "3.02", + Some("MIXED"), + "GPS, GLO", + "G03, G01, G04, G09, G17, G19, G21, G22, G31, G32, R01, R02, R08, R09, R10, R17, R23, R24", + "C1C, L1C, D1C, S1C, C2P, L2P, D2P, S2P, C2W, L2W, D2W, S2W", + Some("2022-03-04T00:00:00 GPST"), + Some("2022-03-04T23:59:30 GPST"), + erratic_time_frame!( + "2022-03-04T00:00:00 GPST, 2022-03-04T00:28:30 GPST, 2022-03-04T00:57:00 GPST" + ), + ); /* - * Test epochs + * test Glonass observables */ - let expected: Vec = vec![ - Epoch::from_str("2022-03-04T00:00:00 GPST").unwrap(), - Epoch::from_str("2022-03-04T00:28:30 GPST").unwrap(), - Epoch::from_str("2022-03-04T00:57:00 GPST").unwrap(), - ]; - - let content: Vec<_> = rinex.epoch().collect(); - assert!( - expected == content, - "parsed wrong epoch content {:?}", - content, - ); - + let record = rinex.record.as_obs().unwrap(); let epoch = Epoch::from_str("2022-03-04T00:00:00 GPST").unwrap(); let e = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(e.is_some(), true); + assert!(e.is_some()); let (clk, vehicles) = e.unwrap(); - assert_eq!(clk.is_none(), true); + assert!(clk.is_none()); assert_eq!(vehicles.len(), 18); let g01 = Sv { @@ -649,24 +564,24 @@ mod test { prn: 01, }; let g01 = vehicles.get(&g01); - assert_eq!(g01.is_some(), true); + assert!(g01.is_some()); let data = g01.unwrap(); let c1c = data.get(&Observable::from_str("C1C").unwrap()); - assert_eq!(c1c.is_some(), true); + assert!(c1c.is_some()); let c1c = c1c.unwrap(); assert_eq!(c1c.obs, 20243517.560); assert!(c1c.lli.is_none()); assert!(c1c.snr.is_none()); let l1c = data.get(&Observable::from_str("L1C").unwrap()); - assert_eq!(l1c.is_some(), true); + assert!(l1c.is_some()); let l1c = l1c.unwrap(); assert_eq!(l1c.obs, 106380411.418); assert_eq!(l1c.lli, Some(LliFlags::OK_OR_UNKNOWN)); assert_eq!(l1c.snr, Some(Snr::from_str("8").unwrap())); let s1c = data.get(&Observable::from_str("S1C").unwrap()); - assert_eq!(s1c.is_some(), true); + assert!(s1c.is_some()); let s1c = s1c.unwrap(); assert_eq!(s1c.obs, 51.250); 
assert!(s1c.lli.is_none()); @@ -677,47 +592,47 @@ mod test { prn: 03, }; let g03 = vehicles.get(&g03); - assert_eq!(g03.is_some(), true); + assert!(g03.is_some()); let data = g03.unwrap(); let c1c = data.get(&Observable::from_str("C1C").unwrap()); - assert_eq!(c1c.is_some(), true); + assert!(c1c.is_some()); let c1c = c1c.unwrap(); assert_eq!(c1c.obs, 20619020.680); - assert_eq!(c1c.lli.is_none(), true); - assert_eq!(c1c.snr.is_none(), true); + assert!(c1c.lli.is_none()); + assert!(c1c.snr.is_none()); let l1c = data.get(&Observable::from_str("L1C").unwrap()); - assert_eq!(l1c.is_some(), true); + assert!(l1c.is_some()); let g04 = Sv { constellation: Constellation::GPS, prn: 04, }; let g04 = vehicles.get(&g04); - assert_eq!(g04.is_some(), true); + assert!(g04.is_some()); let data = g04.unwrap(); let c1c = data.get(&Observable::from_str("C1C").unwrap()); - assert_eq!(c1c.is_some(), true); + assert!(c1c.is_some()); let c1c = c1c.unwrap(); assert_eq!(c1c.obs, 21542633.500); - assert_eq!(c1c.lli.is_none(), true); - assert_eq!(c1c.snr.is_none(), true); + assert!(c1c.lli.is_none()); + assert!(c1c.snr.is_none()); let l1c = data.get(&Observable::from_str("L1C").unwrap()); - assert_eq!(l1c.is_some(), true); + assert!(l1c.is_some()); let epoch = Epoch::from_str("2022-03-04T00:28:30 GPST").unwrap(); let e = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(e.is_some(), true); + assert!(e.is_some()); let (clk, vehicles) = e.unwrap(); - assert_eq!(clk.is_none(), true); + assert!(clk.is_none()); assert_eq!(vehicles.len(), 17); let epoch = Epoch::from_str("2022-03-04T00:57:00 GPST").unwrap(); let e = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(e.is_some(), true); + assert!(e.is_some()); let (clk, vehicles) = e.unwrap(); - assert_eq!(clk.is_none(), true); + assert!(clk.is_none()); assert_eq!(vehicles.len(), 17); } //#[test] @@ -725,19 +640,19 @@ mod test { let test_resource = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/CRNX/V3/KMS300DNK_R_20221591000_01H_30S_MO.crx"; let rinex = Rinex::from_file(&test_resource); - assert_eq!(rinex.is_ok(), true); + assert!(rinex.is_ok()); let rinex = rinex.unwrap(); ////////////////////////// // Header testbench ////////////////////////// - assert_eq!(rinex.is_observation_rinex(), true); - assert_eq!(rinex.header.obs.is_some(), true); + assert!(rinex.is_observation_rinex()); + assert!(rinex.header.obs.is_some()); let obs = rinex.header.obs.as_ref().unwrap(); let glo_observables = obs.codes.get(&Constellation::Glonass); - assert_eq!(glo_observables.is_some(), true); + assert!(glo_observables.is_some()); let glo_observables = glo_observables.unwrap(); let mut index = 0; - for code in vec![ + for code in [ "C1C", "C1P", "C2C", "C2P", "C3Q", "L1C", "L1P", "L2C", "L2P", "L3Q", ] { assert_eq!(glo_observables[index], Observable::from_str(code).unwrap()); @@ -748,33 +663,34 @@ mod test { // Record testbench ////////////////////////// let record = rinex.record.as_obs(); - assert_eq!(record.is_some(), true); + assert!(record.is_some()); let record = record.unwrap(); // EPOCH[1] let epoch = Epoch::from_gregorian_utc(2022, 06, 08, 10, 00, 00, 00); let epoch = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(epoch.is_some(), true); + assert!(epoch.is_some()); let (clk_offset, epoch) = epoch.unwrap(); - assert_eq!(clk_offset.is_none(), true); + assert!(clk_offset.is_none()); assert_eq!(epoch.len(), 49); // EPOCH[2] let epoch = Epoch::from_gregorian_utc(2022, 06, 08, 10, 00, 30, 00); let epoch = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(epoch.is_some(), 
true); + assert!(epoch.is_some()); let (clk_offset, epoch) = epoch.unwrap(); - assert_eq!(clk_offset.is_none(), true); + assert!(clk_offset.is_none()); assert_eq!(epoch.len(), 49); // EPOCH[3] let epoch = Epoch::from_gregorian_utc(2020, 6, 8, 10, 1, 0, 00); let epoch = record.get(&(epoch, EpochFlag::Ok)); - assert_eq!(epoch.is_some(), true); + assert!(epoch.is_some()); let (clk_offset, epoch) = epoch.unwrap(); - assert_eq!(clk_offset.is_none(), true); + assert!(clk_offset.is_none()); assert_eq!(epoch.len(), 47); } #[test] + #[ignore] fn v2_kosg0010_95o() { let path = Path::new(env!("CARGO_MANIFEST_DIR")) .join("..") @@ -783,17 +699,28 @@ mod test { .join("V2") .join("KOSG0010.95O"); let fullpath = path.to_string_lossy(); - let rnx = Rinex::from_file(&fullpath.to_string()).unwrap(); - let expected: Vec = vec![ - Epoch::from_str("1995-01-01T00:00:00 GPST").unwrap(), - Epoch::from_str("1995-01-01T11:00:00 GPST").unwrap(), - Epoch::from_str("1995-01-01T20:44:30 GPST").unwrap(), - ]; - let content: Vec<_> = rnx.epoch().collect(); - assert!( - expected == content, - "parsed wrong epoch content {:?}", - content, + let rnx = Rinex::from_file(fullpath.as_ref()).unwrap(); + //for (e, sv) in rnx.sv_epoch() { + // println!("{:?} @ {}", sv, e); + //} + //panic!("stop"); + test_observation_rinex( + &rnx, + "2.0", + Some("GPS"), + "GPS", + //"G01, G04, G05, G06, G16, G17, G18, G19, G20, G21, G22, G23, G24, G25, G27, G29, G31", + "G01, G04, G05, G06, G16, G17, G18, G19, G20, G21, G22, G23, G24, G25, G27, G29, G31", + "C1, L1, L2, P2, S1", + Some("1995-01-01T00:00:00 GPST"), + Some("1995-01-01T23:59:30 GPST"), + erratic_time_frame!( + " + 1995-01-01T00:00:00 GPST, + 1995-01-01T11:00:00 GPST, + 1995-01-01T20:44:30 GPST + " + ), ); } #[test] @@ -805,7 +732,7 @@ mod test { .join("V2") .join("AJAC3550.21O"); let fullpath = path.to_string_lossy(); - let rnx = Rinex::from_file(&fullpath.to_string()).unwrap(); + let rnx = Rinex::from_file(fullpath.as_ref()).unwrap(); let epochs: Vec = vec![ Epoch::from_str("2021-12-21T00:00:00 GPST").unwrap(), Epoch::from_str("2021-12-21T00:00:30 GPST").unwrap(), @@ -906,7 +833,7 @@ mod test { let g07 = Sv::new(Constellation::GPS, 07); let observations = vehicles.get(&g07).unwrap(); - let mut codes: Vec = observations.keys().map(|k| k.clone()).collect(); + let mut codes: Vec = observations.keys().cloned().collect(); codes.sort(); let mut expected: Vec = "L1 L2 C1 P2 D1 D2 S1 S2" @@ -928,7 +855,7 @@ mod test { let r04 = Sv::new(Constellation::Glonass, 04); let observations = vehicles.get(&r04).unwrap(); - let mut codes: Vec = observations.keys().map(|k| k.clone()).collect(); + let mut codes: Vec = observations.keys().cloned().collect(); codes.sort(); let mut expected: Vec = "L1 L2 C1 C2 P2 D1 D2 S1 S2" @@ -983,7 +910,7 @@ mod test { let r04 = Sv::new(Constellation::Glonass, 04); let observations = vehicles.get(&r04).unwrap(); - let mut codes: Vec = observations.keys().map(|k| k.clone()).collect(); + let mut codes: Vec = observations.keys().cloned().collect(); codes.sort(); let mut expected: Vec = "L1 L2 C1 C2 P2 D1 D2 S1 S2" @@ -1039,7 +966,26 @@ mod test { .join("V3") .join("NOA10630.22O"); let fullpath = path.to_string_lossy(); - let rnx = Rinex::from_file(&fullpath.to_string()).unwrap(); + let rnx = Rinex::from_file(fullpath.as_ref()).unwrap(); + + test_observation_rinex( + &rnx, + "3.02", + Some("GPS"), + "GPS", + "G01, G03, G09, G17, G19, G21, G22", + "C1C, L1C, D1C, S1C, S2W, L2W, D2W, S2W", + Some("2022-03-04T00:00:00 GPST"), + Some("2022-03-04T23:59:30 GPST"), + 
erratic_time_frame!( + " + 2022-03-04T00:00:00 GPST, + 2022-03-04T00:00:30 GPST, + 2022-03-04T00:01:00 GPST, + 2022-03-04T00:52:30 GPST" + ), + ); + let expected: Vec = vec![ Epoch::from_str("2022-03-04T00:00:00 GPST").unwrap(), Epoch::from_str("2022-03-04T00:00:30 GPST").unwrap(), @@ -1057,7 +1003,7 @@ mod test { assert!(clk_offset.is_none()); assert_eq!(vehicles.len(), 9); if e_index < 3 { - let keys: Vec = vehicles.keys().map(|k| *k).collect(); + let keys: Vec = vehicles.keys().copied().collect(); let expected: Vec = vec![ Sv::new(Constellation::GPS, 01), Sv::new(Constellation::GPS, 03), @@ -1071,7 +1017,7 @@ mod test { ]; assert_eq!(keys, expected); } else { - let keys: Vec = vehicles.keys().map(|k| *k).collect(); + let keys: Vec = vehicles.keys().copied().collect(); let expected: Vec = vec![ Sv::new(Constellation::GPS, 01), Sv::new(Constellation::GPS, 03), @@ -1097,6 +1043,34 @@ mod test { Rinex::from_file("../test_resources/CRNX/V3/ESBC00DNK_R_20201770000_01D_30S_MO.crx.gz") .unwrap(); + test_observation_rinex( + &rnx, + "3.05", + Some("MIXED"), + "BDS, GAL, GLO, QZSS, GPS, EGNOS, SDCM, BDSBAS", + "C05, C07, C10, C12, C19, C20, C23, C32, C34, C37, + E01, E03, E05, E09, E13, E15, E24, E31, + G02, G05, G07, G08, G09, G13, G15, G18, G21, G27, G28, G30, + R01, R02, R08, R09, R10, R11, R12, R17, R18, R19, + S23, S25, S36", + "C2I, C6I, C7I, D2I, D6I, D7I, L2I, L6I, L7I, S2I, S6I, S7I, + C1C, C5Q, C6C, C7Q, C8Q, D1C, D5Q, D6C, D7Q, D8Q, L1C, L5Q, L6C, + L7Q, L8Q, S1C, S5Q, S7Q, S8Q, + C1C, C1W, C2L, C2W, C5Q, D1C, D2L, D2W, D5Q, L1C, L2L, L2W, L5Q, + S1C, S1W, S2L, S2W, S5Q, + C1C, C2L, C5Q, D1C, D2L, D5Q, L1C, L2L, L5Q, S1C, S2L, S5Q, + C1C, C1P, C2C, C2P, C3Q, D1C, D1P, D2C, D2P, D3Q, L1C, L1P, L2C, + L2P, L3Q, S1C, S1P, S2C, S2P, S3Q, + C1C, C5I, D1C, D5I, L1C, L5I, S1C, S5I", + Some("2020-06-25T00:00:00 GPST"), + Some("2020-06-25T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2020-06-25T00:00:00 GPST", + "2020-06-25T23:59:30 GPST", + "30 s" + ), + ); + /* * Header tb */ @@ -1163,7 +1137,7 @@ mod test { .collect(); expected.sort(); assert_eq!(sorted, expected); - } else if *k == Constellation::Geo { + } else if *k == Constellation::SBAS { let mut sorted = v.clone(); sorted.sort(); let mut expected: Vec = "C1C C5I D1C D5I L1C L5I S1C S5I" @@ -1178,7 +1152,7 @@ mod test { } assert_eq!(header.glo_channels.len(), 23); - let mut keys: Vec = header.glo_channels.keys().map(|k| *k).collect(); + let mut keys: Vec = header.glo_channels.keys().copied().collect(); keys.sort(); assert_eq!( vec![ @@ -1208,7 +1182,7 @@ mod test { ], keys ); - let mut values: Vec = header.glo_channels.values().map(|k| *k).collect(); + let mut values: Vec = header.glo_channels.values().copied().collect(); values.sort(); assert_eq!( vec![ @@ -1224,8 +1198,23 @@ mod test { let rnx = Rinex::from_file("../test_resources/CRNX/V3/MOJN00DNK_R_20201770000_01D_30S_MO.crx.gz") .unwrap(); + test_observation_rinex( + &rnx, + "3.5", + Some("MIXED"), + "GPS, GLO, GAL, BDS, QZSS, IRNSS, EGNOS, SDCM, GAGAN, BDSBAS", + "C05, C07, C10, C12, C19, C20, C23, C32, C34, C37, E01, E03, E05, E09, E13, E15, E24, E31, G05, G07, G08, G09, G13, G15, G27, G30, I02, I04, I06, R01, R02, R08, R09, R10, R11, R17, R18, R19, S23, S25, S26, S27, S36", + "C2I, C6I, C7I, D2I, D6I, D7I, L2I, L6I, L7I, S2I, S6I, S7I, C1C, C5Q, C6C, C7Q, C8Q, D1C, D5Q, D6C, D7Q, D8Q, L1C, L5Q, L6C, L7Q, L8Q, S1C, S5Q, S6C, S7Q, S8Q, C1C, C1W, C2L, C2W, C5Q, D1C, D2L, D2W, D5Q, L1C, L2L, L2W, L5Q, S1C, S1W, S2L, S2W, S5Q, C5A, D5A, L5A, S5A, C1C, C2L, C5Q, D1C, D2L, D5Q, L1C, 
L2L, L5Q, S1C, S2L, S5Q, C1C, C1P, C2C, C2P, C3Q, D1C, D1P, D2C, D2P, D3Q, L1C, L1P, L2C, L2P, L3Q, S1C, S1P, S2C, S2P, S3Q, C1C, C5I, D1C, D5I, L1C, L5I, S1C, S5I", + Some("2020-06-25T00:00:00 GPST"), + Some("2020-06-25T23:59:30 GPST"), + evenly_spaced_time_frame!( + "2020-06-25T00:00:00 GPST", + "2020-06-25T23:59:30 GPST", + "30 s" + ) + ); /* - * Test IRNSS vehicles + * Test IRNSS vehicles specificly */ let mut irnss_sv: Vec = rnx .sv() diff --git a/rinex/src/tests/parsing.rs b/rinex/src/tests/parsing.rs index ab863a2ce..88fff808b 100644 --- a/rinex/src/tests/parsing.rs +++ b/rinex/src/tests/parsing.rs @@ -19,7 +19,7 @@ mod test { let entry = entry.unwrap(); let path = entry.path(); let full_path = &path.to_str().unwrap(); - let is_hidden = entry.file_name().to_str().unwrap().starts_with("."); + let is_hidden = entry.file_name().to_str().unwrap().starts_with('.'); if is_hidden { continue; // not a test resource } @@ -35,9 +35,8 @@ mod test { } println!("Parsing \"{}\"", full_path); let rinex = Rinex::from_file(full_path); - assert_eq!( + assert!( rinex.is_ok(), - true, "error parsing \"{}\": {:?}", full_path, rinex.err().unwrap() @@ -54,8 +53,31 @@ mod test { assert!(rinex.epoch().count() > 0); // all files have content assert!(rinex.navigation().count() > 0); // all files have content /* - * Verify interpreted time scale, for all Sv + * For all Epoch: ephemeris selection + * must return given ephemeris */ + for (toc, (_, sv, eph)) in rinex.ephemeris() { + if let Some(ts) = sv.timescale() { + if let Some(toe) = eph.toe(ts) { + let seleph = rinex.sv_ephemeris(*sv, toe); + assert!( + seleph.is_some(), + "ephemeris selection @ toe should always be feasible" + ); + let (seltoe, seleph) = seleph.unwrap(); + assert_eq!(seltoe, toe, "toe should be identical"); + assert!( + (seleph.clock_bias - eph.clock_bias).abs() < 1.0E-6, + "ephemeris selection for t_oc should return exact ephemeris"); + assert!( + (seleph.clock_drift - eph.clock_drift).abs() < 1.0E-6, + "ephemeris selection for t_oc should return exact ephemeris"); + } + } + } + /* + * Verify interpreted time scale, for all Sv + */ //for (e, (_, sv, _)) in rinex.ephemeris() { // /* verify toc correctness */ // match sv.constellation { @@ -165,7 +187,7 @@ mod test { /* * Verify STO logical correctness */ - for (_, (msg, sv, _)) in rinex.system_time_offset() { + for (_, (msg, _sv, _)) in rinex.system_time_offset() { match msg { NavMsgType::LNAV | NavMsgType::FDMA @@ -177,14 +199,39 @@ mod test { } } }, - "OBS" => { + "CRNX" | "OBS" => { assert!(rinex.header.obs.is_some()); + let obs_header = rinex.header.obs.clone().unwrap(); + assert!(rinex.is_observation_rinex()); assert!(rinex.epoch().count() > 0); // all files have content assert!(rinex.observation().count() > 0); // all files have content /* - * test interpreted time scale + * test timescale validity */ + for ((e, _), _) in rinex.observation() { + let ts = e.time_scale; + if let Some(e0) = obs_header.time_of_first_obs { + assert!( + e0.time_scale == ts, + "interpreted wrong timescale: expecting \"{}\", got \"{}\"", + e0.time_scale, + ts + ); + } else { + match rinex.header.constellation { + Some(Constellation::Mixed) | None => {}, // can't test + Some(c) => { + let timescale = c.timescale().unwrap(); + assert!(ts == timescale, + "interpreted wrong timescale: expecting \"{}\", got \"{}\"", + timescale, + ts + ); + }, + } + } + } /* let gf = rinex.observation_gf_combinations(); let nl = rinex.observation_nl_combinations(); @@ -211,11 +258,6 @@ mod test { assert_eq!(wl_combinations, 
mw_combinations); */ }, - "CRNX" => { - assert!(rinex.header.obs.is_some()); - assert!(rinex.is_observation_rinex()); - assert!(rinex.epoch().count() > 0); // all files have content - }, "MET" => { assert!(rinex.is_meteo_rinex()); assert!(rinex.epoch().count() > 0); // all files have content @@ -243,8 +285,7 @@ mod test { "IONEX" => { assert!(rinex.is_ionex()); assert!(rinex.epoch().count() > 0); // all files have content - let record = rinex.record.as_ionex().unwrap(); - for (e, _) in record { + for e in rinex.epoch() { assert!( e.time_scale == TimeScale::UTC, "wrong {} timescale for a IONEX", diff --git a/rinex/src/tests/production.rs b/rinex/src/tests/production.rs index e5765f83f..51b4b2d89 100644 --- a/rinex/src/tests/production.rs +++ b/rinex/src/tests/production.rs @@ -1,19 +1,19 @@ #[cfg(test)] mod test { - use crate::tests::toolkit::{compare_with_panic, random_name}; + use crate::tests::toolkit::{random_name, test_against_model}; use crate::*; use std::path::Path; fn testbench(path: &str) { // parse this file let rnx = Rinex::from_file(path).unwrap(); // already tested elsewhere let tmp_path = format!("test-{}.rnx", random_name(5)); - assert_eq!(rnx.to_file(&tmp_path).is_ok(), true); // test writer + assert!(rnx.to_file(&tmp_path).is_ok()); // test writer let copy = Rinex::from_file(&tmp_path); - assert_eq!(copy.is_ok(), true); // content should be valid + assert!(copy.is_ok()); // content should be valid let copy = copy.unwrap(); // run comparison if copy != rnx { - compare_with_panic(©, &rnx, path); + test_against_model(©, &rnx, path, 1.0E-6); } println!("production test passed for \"{}\"", path); // remove copy @@ -21,6 +21,7 @@ mod test { } #[test] #[cfg(feature = "flate2")] + #[ignore] fn obs_v2() { let prefix = Path::new(env!("CARGO_MANIFEST_DIR")) .join("..") @@ -28,7 +29,7 @@ mod test { .join("OBS") .join("V2"); // does not work well on very old rinex like V2/KOSG.. 
- for file in vec![ + for file in [ "AJAC3550.21O", "aopr0010.17o", "barq071q.19o", @@ -41,11 +42,12 @@ mod test { let path = prefix.to_path_buf().join(file); let fullpath = path.to_string_lossy(); - testbench(&fullpath.to_string()); + testbench(fullpath.as_ref()); } } #[test] #[cfg(feature = "flate2")] + #[ignore] fn obs_v3() { let folder = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/OBS/V3/"; for file in std::fs::read_dir(folder).unwrap() { @@ -56,6 +58,7 @@ mod test { } #[test] #[cfg(feature = "flate2")] + //#[ignore] fn meteo_v2() { let folder = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/MET/V2/"; for file in std::fs::read_dir(folder).unwrap() { @@ -87,6 +90,7 @@ mod test { } #[test] #[cfg(feature = "flate2")] + #[ignore] fn nav_v2() { let folder = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V2/"; for file in std::fs::read_dir(folder).unwrap() { @@ -97,6 +101,7 @@ mod test { } #[test] #[cfg(feature = "flate2")] + #[ignore] fn nav_v3() { let folder = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V3/"; for file in std::fs::read_dir(folder).unwrap() { @@ -105,9 +110,9 @@ mod test { testbench(fp.to_str().unwrap()); } } - /* #[test] #[cfg(feature = "flate2")] + #[ignore] fn nav_v4() { let folder = env!("CARGO_MANIFEST_DIR").to_owned() + "/../test_resources/NAV/V4/"; for file in std::fs::read_dir(folder).unwrap() { @@ -115,5 +120,5 @@ mod test { let fp = fp.path(); testbench(fp.to_str().unwrap()); } - }*/ + } } diff --git a/rinex/src/tests/sampling.rs b/rinex/src/tests/sampling.rs index 74a07384b..afaac0929 100644 --- a/rinex/src/tests/sampling.rs +++ b/rinex/src/tests/sampling.rs @@ -15,15 +15,11 @@ mod sampling { .join("AJAC3550.21O"); let fullpath = path.to_string_lossy(); - let rinex = Rinex::from_file(&fullpath.to_string()); - assert!( - rinex.is_ok(), - "failed to parse \"{}\"", - fullpath.to_string() - ); + let rinex = Rinex::from_file(fullpath.as_ref()); + assert!(rinex.is_ok(), "failed to parse \"{}\"", fullpath); let rinex = rinex.unwrap(); - let expected = vec![(Duration::from_seconds(30.0), 1 as usize)]; + let expected = vec![(Duration::from_seconds(30.0), 1_usize)]; let histogram: Vec<_> = rinex.sampling_histogram().sorted().collect(); diff --git a/rinex/src/tests/toolkit.rs b/rinex/src/tests/toolkit.rs index ec767bfb3..1b8ecadc7 100644 --- a/rinex/src/tests/toolkit.rs +++ b/rinex/src/tests/toolkit.rs @@ -1,5 +1,54 @@ +use crate::navigation::FrameClass; use crate::*; use rand::{distributions::Alphanumeric, Rng}; + +use hifitime::TimeSeries; + +//#[macro_use] +#[macro_export] +macro_rules! erratic_time_frame { + ($csv: expr) => { + TestTimeFrame::Erratic( + $csv.split(",") + .map(|c| Epoch::from_str(c.trim()).unwrap()) + .unique() + .collect::>(), + ) + }; +} + +#[macro_export] +macro_rules! 
evenly_spaced_time_frame { + ($start: expr, $end: expr, $step: expr) => { + TestTimeFrame::EvenlySpaced(TimeSeries::inclusive( + Epoch::from_str($start.trim()).unwrap(), + Epoch::from_str($end.trim()).unwrap(), + Duration::from_str($step.trim()).unwrap(), + )) + }; +} + +#[derive(Debug, Clone)] +pub enum TestTimeFrame { + Erratic(Vec), + EvenlySpaced(TimeSeries), +} + +impl TestTimeFrame { + pub fn evenly_spaced(&self) -> Option { + match self { + Self::EvenlySpaced(ts) => Some(ts.clone()), + _ => None, + } + } + pub fn erratic(&self) -> Option> { + match self { + Self::Erratic(ts) => Some(ts.clone()), + _ => None, + } + } +} + /* * Tool to generate random names when we need to produce a file */ @@ -10,18 +59,159 @@ pub fn random_name(size: usize) -> String { .map(char::from) .collect() } + +/* + * Creates list of observables + */ +pub fn build_observables(observable_csv: &str) -> Vec { + observable_csv + .split(',') + .map(|c| { + let c = c.trim(); + if let Ok(observ) = Observable::from_str(c) { + observ + } else { + panic!("invalid observable in csv"); + } + }) + .collect::>() + .into_iter() + .unique() + .collect() +} + +use std::str::FromStr; + +/* + * Build GNSS list + */ +pub fn build_gnss_csv(gnss_csv: &str) -> Vec { + gnss_csv + .split(',') + .map(|c| Constellation::from_str(c.trim()).unwrap()) + .collect::>() + .into_iter() + .unique() + .collect() +} + +/* + * Test method to compare one RINEX against GNSS content + */ +pub fn test_gnss_csv(dut: &Rinex, gnss_csv: &str) { + let gnss = build_gnss_csv(gnss_csv); + let dut_gnss: Vec = dut.constellation().collect(); + for g in &gnss { + assert!( + dut_gnss.contains(g), + "dut does not contain constellation \"{}\"", + g + ); + } + for g in &dut_gnss { + assert!( + gnss.contains(g), + "dut should not contain constellation \"{:X}\"", + g + ); + } +} + +/* + * Compares one RINEX against SV total content + */ +pub fn test_sv_csv(dut: &Rinex, sv_csv: &str) { + let sv: Vec = sv_csv + .split(',') + .map(|c| Sv::from_str(c.trim()).unwrap()) + .collect::>() + .into_iter() + .unique() + .collect(); + + let dut_sv: Vec = dut.sv().collect(); + for v in &sv { + assert!(dut_sv.contains(v), "dut does not contain vehicle \"{}\"", v); + } + for v in &sv { + assert!(sv.contains(v), "dut should not contain vehicle \"{}\"", v); + } +} + +/* + * Compares one RINEX against given epoch content + */ +pub fn test_time_frame(dut: &Rinex, tf: TestTimeFrame) { + let mut dut_epochs = dut.epoch(); + let _epochs: Vec = Vec::new(); + if let Some(mut serie) = tf.evenly_spaced() { + for e in serie { + assert_eq!( + Some(e), + dut_epochs.next(), + "dut does not contain epoch {}", + e + ); + } + for e in dut_epochs.by_ref() { + panic!("dut should not contain epoch {}", e); + } + } else if let Some(serie) = tf.erratic() { + for e in serie { + assert!( + dut_epochs.any(|epoch| e == epoch), + "dut does not contain epoch {}", + e + ); + } + for e in dut_epochs { + panic!("dut should not contain epoch {}", e); + } + } +} + +/* + * Tests provided vehicles per epoch + * This is METEO + OBS compatible + */ +pub fn test_observables_csv(dut: &Rinex, observables_csv: &str) { + let observ = build_observables(observables_csv); + let dut_observ: Vec<&Observable> = dut.observable().collect(); + for o in &observ { + assert!( + dut_observ.contains(&o), + "dut does not contain observable {}", + o + ); + } + for o in &dut_observ { + assert!( + dut_observ.contains(o), + "dut should not contain observable {}", + o + ); + } +} + /* * OBS RINEX thorough comparison */ -fn 
observation_comparison(dut: &Rinex, model: &Rinex, filename: &str) { - let rec_dut = dut - .record - .as_obs() - .expect("failed to unwrap as observation rinex record"); +fn observation_against_model(dut: &Rinex, model: &Rinex, filename: &str, epsilon: f64) { + let rec_dut = dut.record.as_obs().expect("failed to unwrap rinex record"); let rec_model = model .record .as_obs() - .expect("failed to unwrap as observation rinex record"); + .expect("failed to unwrap rinex record"); + /* + * 1: make sure constellations are identical + */ + let dut_constell: Vec<_> = dut.constellation().collect(); + let expected_constell: Vec<_> = model.constellation().collect(); + assert_eq!( + dut_constell, expected_constell, + "mismatch for \"{}\"", + filename + ); for (e_model, (clk_offset_model, vehicles_model)) in rec_model.iter() { if let Some((clk_offset_dut, vehicles_dut)) = rec_dut.get(e_model) { @@ -35,7 +225,7 @@ fn observation_comparison(dut: &Rinex, model: &Rinex, filename: &str) { for (code_model, obs_model) in observables_model { if let Some(obs_dut) = observables_dut.get(code_model) { assert!( - (obs_model.obs - obs_dut.obs).abs() < 1.0E-6, + (obs_model.obs - obs_dut.obs).abs() < epsilon, "\"{}\" - {:?} - {:?} - \"{}\" expecting {} got {}", filename, e_model, @@ -123,15 +313,15 @@ fn observation_comparison(dut: &Rinex, model: &Rinex, filename: &str) { /* * CLOCK Rinex thorough comparison */ -fn clocks_comparison(dut: &Rinex, model: &Rinex, filename: &str) { +fn clocks_against_model(dut: &Rinex, model: &Rinex, filename: &str, _epsilon: f64) { let rec_dut = dut .record .as_clock() - .expect("failed to unwrap as clock rinex record"); + .expect("failed to unwrap rinex record"); let rec_model = model .record .as_clock() - .expect("failed to unwrap as clock rinex record"); + .expect("failed to unwrap rinex record"); for (e_model, model_types) in rec_model.iter() { if let Some(dut_types) = rec_dut.get(e_model) { for (model_data, _model_systems) in model_types.iter() { @@ -149,18 +339,80 @@ fn clocks_comparison(dut: &Rinex, model: &Rinex, filename: &str) { } } +/* + * Navigation RINEX thorough comparison + */ +fn navigation_against_model(dut: &Rinex, model: &Rinex, filename: &str, _epsilon: f64) { + let rec_dut = dut.record.as_nav().expect("failed to unwrap rinex record"); + let rec_model = model + .record + .as_nav() + .expect("failed to unwrap rinex record"); + for (e_model, model_frames) in rec_model.iter() { + if let Some(dut_frames) = rec_dut.get(e_model) { + println!("{:?}", dut_frames); + for model_frame in model_frames { + let mut frametype = FrameClass::default(); + if model_frame.as_eph().is_some() { + frametype = FrameClass::Ephemeris; + } else if model_frame.as_sto().is_some() { + frametype = FrameClass::SystemTimeOffset; + } else if model_frame.as_eop().is_some() { + frametype = FrameClass::EarthOrientation; + } else if model_frame.as_ion().is_some() { + frametype = FrameClass::IonosphericModel; + } + if !dut_frames.contains(model_frame) { + panic!( + "\"{}\" - @{} missing {} frame {:?}", + filename, e_model, frametype, model_frame + ); + //assert_eq!( + // observation_model, observation_dut, + // "\"{}\" - {:?} - faulty \"{}\" observation - expecting {} - got {}", + // filename, e_model, code_model, observation_model, observation_dut + //); + } + } + } else { + panic!("\"{}\" - missing epoch {:?}", filename, e_model); + } + } + + //for (e_dut, obscodes_dut) in rec_dut.iter() { + // if let Some(obscodes_model) = rec_model.get(e_dut) { + // for (code_dut, observation_dut) in obscodes_dut.iter() 
{ + // if let Some(observation_model) = obscodes_model.get(code_dut) { + // assert_eq!( + // observation_model, observation_dut, + // "\"{}\" - {:?} - faulty \"{}\" observation - expecting {} - got {}", + // filename, e_dut, code_dut, observation_model, observation_dut + // ); + // } else { + // panic!( + // "\"{}\" - {:?} parsed \"{}\" unexpectedly", + // filename, e_dut, code_dut + // ); + // } + // } + // } else { + // panic!("\"{}\" - parsed {:?} unexpectedly", filename, e_dut); + // } + //} +} + /* * Meteo RINEX thorough comparison */ -fn meteo_comparison(dut: &Rinex, model: &Rinex, filename: &str) { +fn meteo_against_model(dut: &Rinex, model: &Rinex, filename: &str, _epsilon: f64) { let rec_dut = dut .record .as_meteo() - .expect("failed to unwrap as meteo rinex record"); + .expect("failed to unwrap rinex record"); let rec_model = model .record .as_meteo() - .expect("failed to unwrap as meteo rinex record"); + .expect("failed to unwrap rinex record"); for (e_model, obscodes_model) in rec_model.iter() { if let Some(obscodes_dut) = rec_dut.get(e_model) { for (code_model, observation_model) in obscodes_model.iter() { @@ -208,12 +460,230 @@ fn meteo_comparison(dut: &Rinex, model: &Rinex, filename: &str) { * Compares "dut" Device Under Test to given Model, * panics on unexpected content with detailed explanations. */ -pub fn compare_with_panic(dut: &Rinex, model: &Rinex, filename: &str) { +pub fn test_against_model(dut: &Rinex, model: &Rinex, filename: &str, epsilon: f64) { if dut.is_observation_rinex() { - observation_comparison(&dut, &model, filename); + observation_against_model(dut, model, filename, epsilon); } else if dut.is_meteo_rinex() { - meteo_comparison(&dut, &model, filename); + meteo_against_model(dut, model, filename, epsilon); } else if dut.is_clocks_rinex() { - clocks_comparison(&dut, &model, filename); + clocks_against_model(dut, model, filename, epsilon); + } else if dut.is_navigation_rinex() { + navigation_against_model(dut, model, filename, epsilon); + } +} + +/* + * Any parsed RINEX should go through this test + */ +pub fn test_rinex(dut: &Rinex, version: &str, constellation: Option<&str>) { + let version = Version::from_str(version).unwrap(); + assert!( + dut.header.version == version, + "parsed wrong version {}, expecting \"{}\"", + dut.header.version, + version + ); + + let constellation = constellation.map(|s| Constellation::from_str(s.trim()).unwrap()); + assert!( + dut.header.constellation == constellation, + "bad gnss description: {:?}, expecting {:?}", + dut.header.constellation, + constellation + ); +} + +/* + * Any parsed METEO RINEX should go through this test + */ +pub fn test_meteo_rinex( + dut: &Rinex, + version: &str, + observables_csv: &str, + time_frame: TestTimeFrame, +) { + test_rinex(dut, version, None); + assert!(dut.is_meteo_rinex(), "should be declared as METEO RINEX"); + test_observables_csv(dut, observables_csv); + test_time_frame(dut, time_frame); + /* + * Header specific fields + */ + assert!( + dut.header.obs.is_none(), + "should not contain specific OBS fields" + ); + assert!( + dut.header.meteo.is_some(), + "should contain specific METEO fields" + ); + assert!( + dut.header.ionex.is_none(), + "should not contain specific IONEX fields" + ); + assert!( + dut.header.clocks.is_none(), + "should not contain specific CLOCK fields" + ); + + let _header = dut.header.meteo.as_ref().unwrap(); +} + +/* + * Any parsed NAVIGATION RINEX should go through this test + */ +pub fn test_navigation_rinex(dut: &Rinex, version: &str, constellation: 
Option<&str>) { + test_rinex(dut, version, constellation); + assert!(dut.is_navigation_rinex(), "should be declared as NAV RINEX"); + /* + * Header specific fields + */ + assert!( + dut.header.obs.is_none(), + "should not contain specific OBS fields" + ); + assert!( + dut.header.meteo.is_none(), + "should not contain specific METEO fields" + ); + assert!( + dut.header.ionex.is_none(), + "should not contain specific IONEX fields" + ); + assert!( + dut.header.clocks.is_none(), + "should not contain specific CLOCK fields" + ); +} + +/* + * Any parsed CLOCK RINEX should go through this test + */ +pub fn test_clock_rinex(dut: &Rinex, version: &str, constellation: Option<&str>) { + test_rinex(dut, version, constellation); + assert!(dut.is_clocks_rinex(), "should be declared as CLK RINEX"); + /* + * Header specific fields + */ + assert!( + dut.header.obs.is_none(), + "should not contain specific OBS fields" + ); + assert!( + dut.header.meteo.is_none(), + "should not contain specific METEO fields" + ); + assert!( + dut.header.ionex.is_none(), + "should not contain specific IONEX fields" + ); + assert!( + dut.header.clocks.is_some(), + "should contain specific CLOCK fields" + ); +} + +/* + * Any parsed IONEX should go through this test + */ +pub fn test_ionex(dut: &Rinex, version: &str, constellation: Option<&str>) { + test_rinex(dut, version, constellation); + assert!(dut.is_ionex(), "should be declared as IONEX"); + /* + * Header specific fields + */ + assert!( + dut.header.obs.is_none(), + "should not contain specific OBS fields" + ); + assert!( + dut.header.meteo.is_none(), + "should not contain specific METEO fields" + ); + assert!( + dut.header.ionex.is_some(), + "should contain specific IONEX fields" + ); + assert!( + dut.header.clocks.is_none(), + "should not contain specific CLOCK fields" + ); +} + +/* + * Any parsed OBSERVATION RINEX should go through this test + */ +pub fn test_observation_rinex( + dut: &Rinex, + version: &str, + constellation: Option<&str>, + gnss_csv: &str, + sv_csv: &str, + observ_csv: &str, + time_of_first_obs: Option<&str>, + time_of_last_obs: Option<&str>, + time_frame: TestTimeFrame, + //observ_gnss_json: &str, +) { + test_rinex(dut, version, constellation); + assert!( + dut.is_observation_rinex(), + "should be declared as OBS RINEX" + ); + + assert!( + dut.record.as_obs().is_some(), + "observation record unwrapping" + ); + test_sv_csv(dut, sv_csv); + test_gnss_csv(dut, gnss_csv); + test_time_frame(dut, time_frame); + test_observables_csv(dut, observ_csv); + /* + * Specific header field testing + */ + assert!( + dut.header.obs.is_some(), + "missing observation specific header fields" + ); + assert!( + dut.header.meteo.is_none(), + "should not contain specific METEO fields" + ); + assert!( + dut.header.ionex.is_none(), + "should not contain specific IONEX fields" + ); + assert!( + dut.header.clocks.is_none(), + "should not contain specific CLOCK fields" + ); + + let header = dut.header.obs.as_ref().unwrap(); + //for (constell, observables) in observables { + // assert!(header_obs.codes.get(&constell).is_some(), "observation rinex specific header missing observables for constellation {}", constell); + // let values = header_obs.codes.get(&constell).unwrap(); + // for o in &observables { + // assert!(values.contains(&o), "observation rinex specific {} header is missing {} observable", constell, o); + // } + // for o in values { + // assert!(values.contains(&o), "observation rinex specific {} header should not contain {} observable", constell, o); + // } + //} + if 
let Some(time_of_first_obs) = time_of_first_obs { + assert_eq!( + Some(Epoch::from_str(time_of_first_obs).unwrap()), + header.time_of_first_obs, + "obs header is missing time of first obs \"{}\"", + time_of_first_obs + ); + } + if let Some(time_of_last_obs) = time_of_last_obs { + assert_eq!( + Some(Epoch::from_str(time_of_last_obs).unwrap()), + header.time_of_last_obs, + "obs header is missing time of last obs \"{}\"", + time_of_last_obs + ); } } diff --git a/rinex/src/types.rs b/rinex/src/types.rs index 1222e94ed..42dd7c869 100644 --- a/rinex/src/types.rs +++ b/rinex/src/types.rs @@ -62,9 +62,7 @@ impl Type { impl std::str::FromStr for Type { type Err = ParsingError; fn from_str(s: &str) -> Result { - if s.eq("NAVIGATION DATA") { - Ok(Self::NavigationData) - } else if s.contains("NAV DATA") { + if s.eq("NAVIGATION DATA") || s.contains("NAV DATA") { Ok(Self::NavigationData) } else if s.eq("OBSERVATION DATA") { Ok(Self::ObservationData) diff --git a/rinex/src/version.rs b/rinex/src/version.rs index 58e276a0d..8bc4f8663 100644 --- a/rinex/src/version.rs +++ b/rinex/src/version.rs @@ -80,25 +80,25 @@ impl std::ops::SubAssign for Version { } } -impl Into<(u8, u8)> for Version { - fn into(self) -> (u8, u8) { - (self.major, self.minor) +impl From for (u8, u8) { + fn from(v: Version) -> Self { + (v.major, v.minor) } } impl std::str::FromStr for Version { type Err = ParsingError; fn from_str(s: &str) -> Result { - match s.contains(".") { + match s.contains('.') { true => { - let digits: Vec<&str> = s.split(".").collect(); + let mut digits = s.split('.'); Ok(Self { - major: u8::from_str_radix(digits.get(0).unwrap(), 10)?, - minor: u8::from_str_radix(digits.get(1).unwrap(), 10)?, + major: digits.next().unwrap().parse::()?, + minor: digits.next().unwrap().parse::()?, }) }, false => Ok(Self { - major: u8::from_str_radix(s, 10)?, + major: s.parse::()?, minor: 0, }), } @@ -133,44 +133,44 @@ mod test { assert_eq!(version.minor, SUPPORTED_VERSION.minor); let version = Version::from_str("1"); - assert_eq!(version.is_ok(), true); + assert!(version.is_ok()); let version = version.unwrap(); assert_eq!(version.major, 1); assert_eq!(version.minor, 0); let version = Version::from_str("1.2"); - assert_eq!(version.is_ok(), true); + assert!(version.is_ok()); let version = version.unwrap(); assert_eq!(version.major, 1); assert_eq!(version.minor, 2); let version = Version::from_str("3.02"); - assert_eq!(version.is_ok(), true); + assert!(version.is_ok()); let version = version.unwrap(); assert_eq!(version.major, 3); assert_eq!(version.minor, 2); let version = Version::from_str("a.b"); - assert_eq!(version.is_err(), true); + assert!(version.is_err()); } #[test] fn supported_version() { let version = Version::default(); - assert_eq!(version.is_supported(), true); + assert!(version.is_supported()); let version = SUPPORTED_VERSION; - assert_eq!(version.is_supported(), true); + assert!(version.is_supported()); } #[test] fn non_supported_version() { let version = Version::new(5, 0); - assert_eq!(version.is_supported(), false); + assert!(!version.is_supported()); } #[test] fn version_comparison() { let v_a = Version::from_str("1.2").unwrap(); let v_b = Version::from_str("3.02").unwrap(); - assert_eq!(v_b > v_a, true); - assert_eq!(v_b == v_a, false); + assert!(v_b > v_a); + assert!(v_b != v_a); } #[test] fn version_arithmetics() { diff --git a/rnx2crx/Cargo.toml b/rnx2crx/Cargo.toml index d18919331..73e5a90ca 100644 --- a/rnx2crx/Cargo.toml +++ b/rnx2crx/Cargo.toml @@ -15,4 +15,4 @@ readme = "README.md" chrono = "0.4" 
thiserror = "1" clap = { version = "4", features = ["derive", "color"] } -rinex = { path = "../rinex", version = "=0.14.0", features = ["serde"] } +rinex = { path = "../rinex", version = "=0.14.1", features = ["serde"] } diff --git a/rnx2crx/src/cli.rs b/rnx2crx/src/cli.rs index 109272405..4ba86b768 100644 --- a/rnx2crx/src/cli.rs +++ b/rnx2crx/src/cli.rs @@ -62,7 +62,7 @@ impl Cli { } } pub fn input_path(&self) -> &str { - &self.matches.get_one::("filepath").unwrap() + self.matches.get_one::("filepath").unwrap() } pub fn output_path(&self) -> Option<&String> { self.matches.get_one::("output") @@ -75,16 +75,14 @@ impl Cli { } pub fn date(&self) -> Option { if let Some(s) = self.matches.get_one::("date") { - let items: Vec<&str> = s.split("-").collect(); + let items: Vec<&str> = s.split('-').collect(); if items.len() != 3 { println!("failed to parse \"yyyy-mm-dd\""); return None; - } else { - if let Ok(y) = i32::from_str_radix(items[0], 10) { - if let Ok(m) = u8::from_str_radix(items[1], 10) { - if let Ok(d) = u8::from_str_radix(items[2], 10) { - return Some(Epoch::from_gregorian_utc_at_midnight(y, m, d)); - } + } else if let Ok(y) = i32::from_str_radix(items[0], 10) { + if let Ok(m) = u8::from_str_radix(items[1], 10) { + if let Ok(d) = u8::from_str_radix(items[2], 10) { + return Some(Epoch::from_gregorian_utc_at_midnight(y, m, d)); } } } @@ -93,16 +91,14 @@ impl Cli { } pub fn time(&self) -> Option<(u8, u8, u8)> { if let Some(s) = self.matches.get_one::("time") { - let items: Vec<&str> = s.split(":").collect(); + let items: Vec<&str> = s.split(':').collect(); if items.len() != 3 { println!("failed to parse \"hh:mm:ss\""); return None; - } else { - if let Ok(h) = u8::from_str_radix(items[0], 10) { - if let Ok(m) = u8::from_str_radix(items[1], 10) { - if let Ok(s) = u8::from_str_radix(items[2], 10) { - return Some((h, m, s)); - } + } else if let Ok(h) = u8::from_str_radix(items[0], 10) { + if let Ok(m) = u8::from_str_radix(items[1], 10) { + if let Ok(s) = u8::from_str_radix(items[2], 10) { + return Some((h, m, s)); } } } diff --git a/rnx2crx/src/main.rs b/rnx2crx/src/main.rs index 11dd02128..ba45915ce 100644 --- a/rnx2crx/src/main.rs +++ b/rnx2crx/src/main.rs @@ -32,11 +32,9 @@ fn main() -> Result<(), Error> { crx.date = Epoch::from_gregorian_utc(y, m, d, hh, mm, ss, 0); } } - } else { - if let Some(obs) = &mut rinex.header.obs { - if let Some(crx) = &mut obs.crinex { - crx.date = Epoch::from_gregorian_utc_at_midnight(y, m, d); - } + } else if let Some(obs) = &mut rinex.header.obs { + if let Some(crx) = &mut obs.crinex { + crx.date = Epoch::from_gregorian_utc_at_midnight(y, m, d); } } } else if let Some((hh, mm, ss)) = cli.time() { @@ -54,9 +52,9 @@ fn main() -> Result<(), Error> { Some(path) => path.clone(), _ => { // deduce from input - match input_path.strip_suffix("o") { + match input_path.strip_suffix('o') { Some(prefix) => prefix.to_owned() + "d", - _ => match input_path.strip_suffix("O") { + _ => match input_path.strip_suffix('O') { Some(prefix) => prefix.to_owned() + "D", _ => match input_path.strip_suffix("rnx") { Some(prefix) => prefix.to_owned() + "crx", diff --git a/sinex/Cargo.toml b/sinex/Cargo.toml index cdbbfab4c..3820bf0cc 100644 --- a/sinex/Cargo.toml +++ b/sinex/Cargo.toml @@ -20,4 +20,4 @@ chrono = "0.4" thiserror = "1" strum = { version = "0.25", features = ["derive"] } strum_macros = "0.25" -rinex = { path = "../rinex", version = "=0.14.0", features = ["serde"] } +rinex = { path = "../rinex", version = "=0.14.1", features = ["serde"] } diff --git 
a/sinex/src/bias/description.rs b/sinex/src/bias/description.rs index d2ae333c0..27e0d7012 100644 --- a/sinex/src/bias/description.rs +++ b/sinex/src/bias/description.rs @@ -5,7 +5,7 @@ use rinex::constellation::Constellation; use std::collections::HashMap; //use crate::datetime::{parse_datetime, ParseDateTimeError}; -#[derive(Debug, Clone)] +#[derive(Debug, Clone, Default)] pub struct Description { /// Observation Sampling: sampling interval in seconds pub sampling: Option, @@ -28,80 +28,66 @@ pub struct Description { pub sat_clock_ref: HashMap>, } -impl Default for Description { - fn default() -> Self { - Self { - sampling: None, - spacing: None, - method: None, - bias_mode: bias::header::BiasMode::default(), - system: bias::TimeSystem::default(), - rcvr_clock_ref: None, - sat_clock_ref: HashMap::new(), - } - } -} - impl Description { pub fn with_sampling(&self, sampling: u32) -> Self { Self { sampling: Some(sampling), - spacing: self.spacing.clone(), + spacing: self.spacing, method: self.method.clone(), bias_mode: self.bias_mode.clone(), system: self.system.clone(), - rcvr_clock_ref: self.rcvr_clock_ref.clone(), + rcvr_clock_ref: self.rcvr_clock_ref, sat_clock_ref: self.sat_clock_ref.clone(), } } pub fn with_spacing(&self, spacing: u32) -> Self { Self { - sampling: self.sampling.clone(), + sampling: self.sampling, spacing: Some(spacing), method: self.method.clone(), bias_mode: self.bias_mode.clone(), system: self.system.clone(), - rcvr_clock_ref: self.rcvr_clock_ref.clone(), + rcvr_clock_ref: self.rcvr_clock_ref, sat_clock_ref: self.sat_clock_ref.clone(), } } pub fn with_method(&self, method: bias::DeterminationMethod) -> Self { Self { - sampling: self.sampling.clone(), - spacing: self.spacing.clone(), + sampling: self.sampling, + spacing: self.spacing, method: Some(method), bias_mode: self.bias_mode.clone(), system: self.system.clone(), - rcvr_clock_ref: self.rcvr_clock_ref.clone(), + rcvr_clock_ref: self.rcvr_clock_ref, sat_clock_ref: self.sat_clock_ref.clone(), } } pub fn with_bias_mode(&self, mode: bias::header::BiasMode) -> Self { Self { - sampling: self.sampling.clone(), - spacing: self.spacing.clone(), + sampling: self.sampling, + spacing: self.spacing, method: self.method.clone(), bias_mode: mode, system: self.system.clone(), - rcvr_clock_ref: self.rcvr_clock_ref.clone(), + rcvr_clock_ref: self.rcvr_clock_ref, sat_clock_ref: self.sat_clock_ref.clone(), } } pub fn with_time_system(&self, system: bias::TimeSystem) -> Self { Self { - sampling: self.sampling.clone(), - spacing: self.spacing.clone(), + sampling: self.sampling, + spacing: self.spacing, method: self.method.clone(), bias_mode: self.bias_mode.clone(), system, - rcvr_clock_ref: self.rcvr_clock_ref.clone(), + rcvr_clock_ref: self.rcvr_clock_ref, sat_clock_ref: self.sat_clock_ref.clone(), } } pub fn with_rcvr_clock_ref(&self, clock_ref: Constellation) -> Self { Self { - sampling: self.sampling.clone(), - spacing: self.spacing.clone(), + sampling: self.sampling, + spacing: self.spacing, method: self.method.clone(), bias_mode: self.bias_mode.clone(), system: self.system.clone(), @@ -111,12 +97,12 @@ impl Description { } pub fn with_sat_clock_ref(&self, c: Constellation, observable: &str) -> Self { Self { - sampling: self.sampling.clone(), - spacing: self.spacing.clone(), + sampling: self.sampling, + spacing: self.spacing, method: self.method.clone(), bias_mode: self.bias_mode.clone(), system: self.system.clone(), - rcvr_clock_ref: self.rcvr_clock_ref.clone(), + rcvr_clock_ref: self.rcvr_clock_ref, sat_clock_ref: { let mut map 
= self.sat_clock_ref.clone(); if let Some(codes) = map.get_mut(&c) { diff --git a/sinex/src/bias/header.rs b/sinex/src/bias/header.rs index 015dd30d5..af27281af 100644 --- a/sinex/src/bias/header.rs +++ b/sinex/src/bias/header.rs @@ -138,7 +138,7 @@ mod test { fn test_header() { let content = "%=BIA 1.00 PF2 2011:180:59736 PF2 2011:113:86385 2011:114:86385 R 00000024"; let header = Header::from_str(content); - assert_eq!(header.is_ok(), true); + assert!(header.is_ok()); let header = header.unwrap(); assert_eq!(header.version, "1.00"); assert_eq!(header.creator_code, "PF2"); @@ -147,7 +147,7 @@ mod test { assert_eq!(header.length, 24); let content = "%=BIA 1.00 COD 2016:327:30548 IGS 2016:296:00000 2016:333:00000 A 00000194"; let header = Header::from_str(content); - assert_eq!(header.is_ok(), true); + assert!(header.is_ok()); let header = header.unwrap(); assert_eq!(header.version, "1.00"); assert_eq!(header.creator_code, "COD"); diff --git a/sinex/src/bias/mod.rs b/sinex/src/bias/mod.rs index f8d3fb84c..def065061 100644 --- a/sinex/src/bias/mod.rs +++ b/sinex/src/bias/mod.rs @@ -30,12 +30,10 @@ impl std::str::FromStr for TimeSystem { Ok(Self::UTC) } else if content.eq("TAI") { Ok(Self::TAI) + } else if let Ok(c) = Constellation::from_str(content) { + Ok(Self::GNSS(c)) } else { - if let Ok(c) = Constellation::from_str(content) { - Ok(Self::GNSS(c)) - } else { - Err(TimeSystemError::UnknownSystem(content.to_string())) - } + Err(TimeSystemError::UnknownSystem(content.to_string())) } } } @@ -160,7 +158,7 @@ impl std::str::FromStr for Solution { svn: svn.trim().to_string(), prn: prn.trim().to_string(), station: { - if station.trim().len() > 0 { + if !station.trim().is_empty() { Some(station.trim().to_string()) } else { None @@ -170,7 +168,7 @@ impl std::str::FromStr for Solution { start_time: parse_datetime(start_time.trim())?, end_time: parse_datetime(end_time.trim())?, obs: { - if obs2.trim().len() > 0 { + if !obs2.trim().is_empty() { (obs1.trim().to_string(), Some(obs2.trim().to_string())) } else { (obs1.trim().to_string(), None) @@ -200,14 +198,14 @@ mod tests { #[test] fn test_determination_methods() { let method = DeterminationMethod::from_str("COMBINED_ANALYSIS"); - assert_eq!(method.is_ok(), true); + assert!(method.is_ok()); assert_eq!(method.unwrap(), DeterminationMethod::CombinedAnalysis); } #[test] fn test_solution_parser() { let solution = Solution::from_str( "ISB G G GIEN C1W C2W 2011:113:86385 2011:115:00285 ns 0.000000000000000E+00 .000000E+00"); - assert_eq!(solution.is_ok(), true); + assert!(solution.is_ok()); let solution = solution.unwrap(); assert_eq!(solution.btype, BiasType::ISB); assert_eq!(solution.svn, "G"); @@ -221,7 +219,7 @@ mod tests { assert_eq!(solution.stddev, 0.0); let solution = Solution::from_str( "ISB E E GOUS C1C C7Q 2011:113:86385 2011:115:00285 ns -.101593337222667E+03 .259439E+02"); - assert_eq!(solution.is_ok(), true); + assert!(solution.is_ok()); let solution = solution.unwrap(); assert_eq!(solution.btype, BiasType::ISB); assert_eq!(solution.svn, "E"); @@ -235,7 +233,7 @@ mod tests { assert!((solution.stddev - 0.259439E+02) < 1E-6); let solution = Solution::from_str( "OSB G063 G01 C1C 2016:296:00000 2016:333:00000 ns 10.2472 0.0062"); - assert_eq!(solution.is_ok(), true); + assert!(solution.is_ok()); let solution = solution.unwrap(); assert_eq!(solution.btype, BiasType::OSB); assert_eq!(solution.svn, "G063"); @@ -249,7 +247,7 @@ mod tests { fn test_bia_v1_example1() { let file = env!("CARGO_MANIFEST_DIR").to_owned() + "/data/BIA/V1/example-1a.bia"; let 
sinex = Sinex::from_file(&file); - assert_eq!(sinex.is_ok(), true); + assert!(sinex.is_ok()); let sinex = sinex.unwrap(); let reference = &sinex.reference; assert_eq!( @@ -287,7 +285,7 @@ mod tests { let description = &sinex.description; let description = description.bias_description(); - assert_eq!(description.is_some(), true); + assert!(description.is_some()); let description = description.unwrap(); assert_eq!(description.sampling, Some(300)); assert_eq!(description.spacing, Some(86400)); @@ -301,7 +299,7 @@ mod tests { assert_eq!(description.sat_clock_ref.len(), 2); let solutions = sinex.record.bias_solutions(); - assert_eq!(solutions.is_some(), true); + assert!(solutions.is_some()); let solutions = solutions.unwrap(); assert_eq!(solutions.len(), 50); } @@ -309,7 +307,7 @@ mod tests { fn test_bia_v1_example1b() { let file = env!("CARGO_MANIFEST_DIR").to_owned() + "/data/BIA/V1/example-1b.bia"; let sinex = Sinex::from_file(&file); - assert_eq!(sinex.is_ok(), true); + assert!(sinex.is_ok()); let sinex = sinex.unwrap(); assert_eq!(sinex.acknowledgments.len(), 2); assert_eq!( @@ -322,7 +320,7 @@ mod tests { let description = &sinex.description; let description = description.bias_description(); - assert_eq!(description.is_some(), true); + assert!(description.is_some()); let description = description.unwrap(); assert_eq!(description.sampling, Some(300)); assert_eq!(description.spacing, Some(86400)); @@ -336,12 +334,12 @@ mod tests { assert_eq!(description.sat_clock_ref.len(), 2); let solutions = sinex.record.bias_solutions(); - assert_eq!(solutions.is_some(), true); + assert!(solutions.is_some()); let solutions = solutions.unwrap(); assert_eq!(solutions.len(), 50); for sol in solutions.iter() { let obs = &sol.obs; - assert_eq!(obs.1.is_some(), true); // all came with OBS1+OBS2 + assert!(obs.1.is_some()); // all came with OBS1+OBS2 } } } diff --git a/sinex/src/datetime.rs b/sinex/src/datetime.rs index bca130dfc..cb0e145f0 100644 --- a/sinex/src/datetime.rs +++ b/sinex/src/datetime.rs @@ -11,7 +11,7 @@ pub enum ParseDateTimeError { pub fn parse_datetime(content: &str) -> Result { let ym = &content[0..8]; // "YYYY:DDD" - let dt = chrono::NaiveDate::parse_from_str(&ym, "%Y:%j")?; + let dt = chrono::NaiveDate::parse_from_str(ym, "%Y:%j")?; let secs = &content[9..]; let secs = f32::from_str(secs)?; let h = secs / 3600.0; @@ -26,8 +26,8 @@ mod test { #[test] fn test_parsing() { let datetime = parse_datetime("2022:021:20823"); - assert_eq!(datetime.is_ok(), true); + assert!(datetime.is_ok()); let datetime = parse_datetime("2022:009:00000"); - assert_eq!(datetime.is_ok(), true); + assert!(datetime.is_ok()); } } diff --git a/sinex/src/lib.rs b/sinex/src/lib.rs index 09434acda..8403a9b15 100644 --- a/sinex/src/lib.rs +++ b/sinex/src/lib.rs @@ -18,11 +18,11 @@ use header::{is_valid_header, Header}; use reference::Reference; fn is_comment(line: &str) -> bool { - line.starts_with("*") + line.starts_with('*') } fn section_start(line: &str) -> Option { - if line.starts_with("+") { + if line.starts_with('+') { Some(line[1..].to_string()) } else { None @@ -30,7 +30,7 @@ fn section_start(line: &str) -> Option { } fn section_end(line: &str) -> Option { - if line.starts_with("-") { + if line.starts_with('-') { Some(line[1..].to_string()) } else { None diff --git a/sinex/src/receiver.rs b/sinex/src/receiver.rs index 56febb816..584651fa4 100644 --- a/sinex/src/receiver.rs +++ b/sinex/src/receiver.rs @@ -66,7 +66,7 @@ mod tests { let rcvr = Receiver::from_str( "MAO0 G @MP0 2015:276:00000 2015:276:86399 JAVAD 
TRE-G3TH DELTA 3.6.4", ); - assert_eq!(rcvr.is_ok(), true); + assert!(rcvr.is_ok()); let rcvr = rcvr.unwrap(); assert_eq!(rcvr.station, "MAO0"); assert_eq!(rcvr.group, "@MP0"); diff --git a/sinex/tests/parser.rs b/sinex/tests/parser.rs index 03ebd1921..6ca1e02d8 100644 --- a/sinex/tests/parser.rs +++ b/sinex/tests/parser.rs @@ -15,7 +15,7 @@ mod test { let entry = entry.unwrap(); let path = entry.path(); let full_path = &path.to_str().unwrap(); - let is_hidden = entry.file_name().to_str().unwrap().starts_with("."); + let is_hidden = entry.file_name().to_str().unwrap().starts_with('.'); println!("Parsing file: \"{}\"", full_path); if !is_hidden { // PARSER diff --git a/sp3/Cargo.toml b/sp3/Cargo.toml index 795d7ecc5..8bd2cf6e0 100644 --- a/sp3/Cargo.toml +++ b/sp3/Cargo.toml @@ -1,6 +1,6 @@ [package] name = "sp3" -version = "1.0.4" +version = "1.0.5" license = "MIT OR Apache-2.0" authors = ["Guillaume W. Bres "] description = "IGS SP3 file parser" @@ -24,7 +24,7 @@ rustdoc-args = ["--cfg", "docrs", "--generate-link-to-definition"] [dependencies] thiserror = "1" hifitime = "3.8.4" -rinex = { path = "../rinex", version = "=0.14.0", features = ["serde"] } +rinex = { path = "../rinex", version = "=0.14.1", features = ["serde"] } serde = { version = "1.0", optional = true, default-features = false, features = ["derive"] } flate2 = { version = "1.0.24", optional = true, default-features = false, features = ["zlib"] } diff --git a/sp3/src/lib.rs b/sp3/src/lib.rs index feea8d77b..eeea03ac9 100644 --- a/sp3/src/lib.rs +++ b/sp3/src/lib.rs @@ -275,7 +275,7 @@ fn parse_epoch(content: &str, time_scale: TimeScale) -> Result 4 { + comments.push(line[3..].to_string()); + } continue; } if end_of_file(line) { diff --git a/sp3/src/tests/interpolation.rs b/sp3/src/tests/interpolation.rs index b1c843b53..fe8e1991e 100644 --- a/sp3/src/tests/interpolation.rs +++ b/sp3/src/tests/interpolation.rs @@ -2,10 +2,10 @@ #[cfg(test)] mod test { use crate::prelude::*; - use rinex::prelude::Sv; - use rinex::sv; + //use rinex::prelude::Sv; + //use rinex::sv; use std::path::PathBuf; - use std::str::FromStr; + //use std::str::FromStr; /* * Theoretical maximal error of a Lagrangian interpolation * over a given Dataset for specified interpolation order @@ -43,7 +43,7 @@ mod test { let total_epochs = sp3.epoch().count(); //TODO: replace with max_error() - for (order, max_error) in vec![(7, 1E-1_f64), (9, 1.0E-2_64), (11, 0.5E-3_f64)] { + for (order, max_error) in [(7, 1E-1_f64), (9, 1.0E-2_64), (11, 0.5E-3_f64)] { let tmin = first_epoch + (order / 2) * dt; let tmax = last_epoch - (order / 2) * dt; println!("running Interp({}) testbench..", order); diff --git a/sp3/src/tests/parser_3c.rs b/sp3/src/tests/parser_3c.rs index 48873aebb..e35d915bc 100644 --- a/sp3/src/tests/parser_3c.rs +++ b/sp3/src/tests/parser_3c.rs @@ -66,7 +66,7 @@ mod test { } } - for (index, (epoch, sv, clock)) in sp3.sv_clock().enumerate() {} + //for (index, (epoch, sv, clock)) in sp3.sv_clock().enumerate() {} /* * Test file comments diff --git a/sp3/src/tests/parser_3d.rs b/sp3/src/tests/parser_3d.rs index 2571e3281..e9e3752ef 100644 --- a/sp3/src/tests/parser_3d.rs +++ b/sp3/src/tests/parser_3d.rs @@ -79,7 +79,6 @@ mod test { } } - let mut clk: Vec<_> = sp3.sv_clock().collect(); for (epoch, sv, clock) in sp3.sv_clock() { assert_eq!(epoch, Epoch::from_str("2019-10-27T00:00:00 GPST").unwrap()); if sv == sv!("C01") { diff --git a/sp3/src/tests/test_pool.rs b/sp3/src/tests/test_pool.rs index acc1382fc..6db870d07 100644 --- a/sp3/src/tests/test_pool.rs +++ 
b/sp3/src/tests/test_pool.rs @@ -11,7 +11,7 @@ mod test { .join("test_resources") .join("SP3"); - for file in vec![ + for file in [ "EMR0OPSULT_20232391800_02D_15M_ORB.SP3.gz", "ESA0OPSULT_20232320600_02D_15M_ORB.SP3.gz", "COD0MGXFIN_20230500000_01D_05M_ORB.SP3.gz", @@ -36,7 +36,7 @@ mod test { .join("test_resources") .join("SP3"); - for file in vec![ + for file in [ "co108870.sp3", "em108871.sp3", //"emr08874.sp3", diff --git a/sp3/src/version.rs b/sp3/src/version.rs index 2538edd65..fc9f0829f 100644 --- a/sp3/src/version.rs +++ b/sp3/src/version.rs @@ -87,15 +87,13 @@ mod test { use std::str::FromStr; #[test] fn version() { - for (desc, expected) in vec![("c", Version::C), ("d", Version::D)] { - assert!( - Version::from_str(desc).is_ok(), - "failed to parse Version from \"{}\"", - desc - ); + for (desc, expected) in [("c", Version::C), ("d", Version::D)] { + let version = Version::from_str(desc); + assert!(version.is_ok(), "failed to parse Version from \"{}\"", desc); + assert_eq!(version.unwrap(), expected); } - for (vers, expected) in vec![(Version::C, 3), (Version::D, 4)] { + for (vers, expected) in [(Version::C, 3), (Version::D, 4)] { let version: u8 = vers.into(); assert_eq!(version, expected, "convertion to integer failed"); } diff --git a/tools/clippy.sh b/tools/clippy.sh new file mode 100755 index 000000000..66d659fd4 --- /dev/null +++ b/tools/clippy.sh @@ -0,0 +1,5 @@ +#!/bin/sh +cargo clippy \ + --fix \ + --allow-dirty \ + -- -Dclippy::perf diff --git a/tools/release.py b/tools/release.py new file mode 100755 index 000000000..b3f5ba2d7 --- /dev/null +++ b/tools/release.py @@ -0,0 +1,58 @@ +#! /usr/bin/env python3 + +import os +import tomli +import tomli_w + +def parse_toml(path): + with open(path, "r") as fd: + content = fd.read() + tomli_dict = tomli.loads(content) + return tomli_dict + + +def parse_pool(): + tomls = [] + for subdir in os.listdir("."): + if os.path.isdir(subdir): + target = subdir + "/Cargo.toml" + if os.path.exists(target): + tomls.append(target) + + content = {} + for toml in tomls: + key = toml.split("/")[0] # dir name + content[key] = parse_toml(toml) + return content + +def update_pool(content): + for subdir in os.listdir("."): + if os.path.isdir(subdir): + target = subdir + "/Cargo.toml" + if os.path.exists(target): + with open(target, "w") as fd: + fd.write(tomli_w.dumps(content[subdir])) + +def replace_local_referencing(content): + latest = {} + for pkg in content: + # latest uploaded version + latest[pkg] = content[pkg]["package"]["version"] + print(latest) + for pkg in content: + # replace local paths + dependencies = list(content[pkg]["dependencies"].keys()) + for localpkg in latest: + if localpkg in dependencies: + content[pkg]["dependencies"][localpkg].pop("path", None) + content[pkg]["dependencies"][localpkg]["version"] = latest[localpkg] + # print(content[pkg]["dependencies"][localpkg]) + + +def main(release=None): + content = parse_pool() + replace_local_referencing(content) + update_pool(content) + +if __name__ == "__main__": + main() diff --git a/ublox-rnx/Cargo.toml b/ublox-rnx/Cargo.toml index 935f410ee..0e6b9e867 100644 --- a/ublox-rnx/Cargo.toml +++ b/ublox-rnx/Cargo.toml @@ -12,10 +12,13 @@ edition = "2021" readme = "README.md" [dependencies] -chrono = "0.4" +log = "0.4" +pretty_env_logger = "0.5" +chrono = "0.4.30" serde = "1.0" +thiserror = "1" serde_json = "1.0" serialport = "4.2.0" ublox = "0.4.4" -rinex = { path = "../rinex", version = "=0.14.0", features = ["serde"] } +rinex = { path = "../rinex", version = "=0.14.1", features = 
["serde", "nav", "obs"] } clap = { version = "3.2.22", features = ["yaml"] } diff --git a/ublox-rnx/src/main.rs b/ublox-rnx/src/main.rs index 6c4a2ff0f..c3a8cab59 100644 --- a/ublox-rnx/src/main.rs +++ b/ublox-rnx/src/main.rs @@ -3,19 +3,45 @@ //! Homepage: use clap::load_yaml; use clap::App; -//use std::str::FromStr; +use std::str::FromStr; -use rinex::*; -//use rinex::sv::Sv; +use thiserror::Error; + +use rinex::navigation::{IonMessage, KbModel, KbRegionCode}; +use rinex::observation::{LliFlags, ObservationData}; +use rinex::prelude::EpochFlag; use rinex::prelude::*; -//:use rinex::observation::record::ObservationData; +use rinex::sv; extern crate ublox; -use ublox::*; -use ublox::{CfgPrtUart, UartPortId}; +use ublox::{ + CfgMsgAllPorts, CfgMsgAllPortsBuilder, CfgPrtUart, CfgPrtUartBuilder, DataBits, InProtoMask, + OutProtoMask, PacketRef, Parity, StopBits, UartMode, UartPortId, +}; +use ublox::{GpsFix, RecStatFlags}; +use ublox::{NavSat, NavTimeUtcFlags}; +use ublox::{NavStatusFlags, NavStatusFlags2}; + +use log::{debug, error, info, trace, warn}; mod device; +#[derive(Debug, Clone, Error)] +pub enum Error { + #[error("unknown constellation #{0}")] + UnknownConstellationId(u8), +} + +fn identify_constellation(id: u8) -> Result { + match id { + 0 => Ok(Constellation::GPS), + 1 => Ok(Constellation::Galileo), + 2 => Ok(Constellation::Glonass), + 3 => Ok(Constellation::BeiDou), + _ => Err(Error::UnknownConstellationId(id)), + } +} + pub fn main() -> Result<(), Box> { let yaml = load_yaml!("app.yml"); let app = App::from_yaml(yaml); @@ -31,7 +57,7 @@ pub fn main() -> Result<(), Box> { // open device let port = serialport::new(port, baud) .open() - .expect(&format!("failed to open serial port \"{}\"", port)); + .unwrap_or_else(|_| panic!("failed to open serial port \"{}\"", port)); let mut device = device::Device::new(port); // Enable UBX protocol on all ports @@ -52,6 +78,13 @@ pub fn main() -> Result<(), Box> { )?; device.wait_for_ack::().unwrap(); + /* + * HEADER <=> Configuration + */ + //CfgNav5 : model, dynamics.. + //CfgNav5X : min_svs, aiding, wkn, ppp.. + //AidIni + /* NEED UBX CRATE UPDATE!! 
device.write_all( &CfgPrtUartBuilder { @@ -121,39 +154,193 @@ pub fn main() -> Result<(), Box> { */ // Create header section - let _header = header::Header::basic_obs(); + let mut _nav_header = Header::basic_nav(); + let mut _obs_header = Header::basic_obs(); + // let mut clk_header = Header::basic_clk(); + + //TODO header CLI customization - //TODO header customization + // current work structures + let mut itow = 0_u32; + let mut epoch = Epoch::default(); + let mut epoch_flag = EpochFlag::default(); - let mut _epoch = Epoch::default(); // current epoch + // observation + let mut _observable = Observable::default(); + let mut lli: Option = None; + let mut obs_data = ObservationData::default(); + + let mut uptime = Duration::default(); + + let mut fix_type = GpsFix::NoFix; // current fix status + let mut fix_flags = NavStatusFlags::empty(); // current fix flag + let mut nav_status = NavStatusFlags2::Inactive; loop { // main loop let _ = device.update(|packet| { match packet { + /* + * Configuration frames: + * should be depiceted by HEADER section + */ + //PacketRef::CfgRate(pkt) => { + // //TODO EPOCH INTERVAL + // let gps_rate = pkt.measure_rate_ms(); + // //TODO EPOCH INTERVAL + // let nav_rate = pkt.nav_rate(); + // //TODO reference time + // let time = pkt.time_ref(); + //}, + PacketRef::CfgNav5(pkt) => { + // Dynamic model + let _dyn_model = pkt.dyn_model(); + }, + PacketRef::RxmRawx(pkt) => { + let _leap_s = pkt.leap_s(); + if pkt.rec_stat().intersects(RecStatFlags::CLK_RESET) { + // notify reset event + if let Some(ref mut lli) = lli { + *lli |= LliFlags::LOCK_LOSS; + } else { + lli = Some(LliFlags::LOCK_LOSS); + } + epoch_flag = EpochFlag::CycleSlip; + } + obs_data.lli = lli; + }, + PacketRef::MonHw(_pkt) => { + //let jamming = pkt.jam_ind(); //TODO + //antenna problem: + // pkt.a_status(); + // pkt.a_power(); + }, + PacketRef::MonGnss(_pkt) => { + //pkt.supported(); // GNSS + //pkt.default(); // GNSS + //pkt.enabled(); //GNSS + }, + PacketRef::MonVer(pkt) => { + //UBX revision + pkt.software_version(); + pkt.hardware_version(); + }, + /* + * NAVIGATION + */ PacketRef::NavSat(pkt) => { for sv in pkt.svs() { - let _gnss_id = sv.gnss_id(); - let _sv_id = sv.sv_id(); - let _elev = sv.elev(); - let _azim = sv.azim(); - let _pr_res = sv.pr_res(); - let _flags = sv.flags(); - //if flags.sv_used() { - //} - //flags.health(); - //flags.quality_ind(); - //flags.differential_correction_available(); - //flags.ephemeris_available(); + let gnss = identify_constellation(sv.gnss_id()); + if gnss.is_ok() { + let _elev = sv.elev(); + let _azim = sv.azim(); + let _pr_res = sv.pr_res(); + let _flags = sv.flags(); + + let _sv = Sv { + constellation: gnss.unwrap(), + prn: sv.sv_id(), + }; + + // flags.sv_used() + //flags.health(); + //flags.quality_ind(); + //flags.differential_correction_available(); + //flags.ephemeris_available(); + } + } + }, + PacketRef::NavTimeUTC(pkt) => { + if pkt.valid().intersects(NavTimeUtcFlags::VALID_UTC) { + // leap seconds already known + let e = Epoch::maybe_from_gregorian( + pkt.year().into(), + pkt.month(), + pkt.day(), + pkt.hour(), + pkt.min(), + pkt.sec(), + pkt.nanos() as u32, + TimeScale::UTC, + ); + if e.is_ok() { + epoch = e.unwrap(); + } } }, - /* NEED UBX CRATE UPDATE !! 
+ PacketRef::NavStatus(pkt) => { + itow = pkt.itow(); + fix_type = pkt.fix_type(); + fix_flags = pkt.flags(); + nav_status = pkt.flags2(); + uptime = Duration::from_milliseconds(pkt.uptime_ms() as f64); + trace!("uptime: {}", uptime); + }, PacketRef::NavEoe(pkt) => { - // End of epoch notification - let _itow = pkt.itow(); - // ==> push into file + itow = pkt.itow(); + // reset Epoch + lli = None; + epoch_flag = EpochFlag::default(); + }, + /* + * NAVIGATION : EPHEMERIS + */ + PacketRef::MgaGpsEph(pkt) => { + let _sv = sv!(&format!("G{}", pkt.sv_id())); + //nav_record.insert(epoch, sv); + }, + PacketRef::MgaGloEph(pkt) => { + let _sv = sv!(&format!("R{}", pkt.sv_id())); + //nav_record.insert(epoch, sv); + }, + /* + * NAVIGATION: IONOSPHERIC MODELS + */ + PacketRef::MgaGpsIono(pkt) => { + let kbmodel = KbModel { + alpha: (pkt.alpha0(), pkt.alpha1(), pkt.alpha2(), pkt.alpha3()), + beta: (pkt.beta0(), pkt.beta1(), pkt.beta2(), pkt.beta3()), + region: KbRegionCode::default(), // TODO, + }; + let _iono = IonMessage::KlobucharModel(kbmodel); + }, + /* + * OBSERVATION: Receiver Clock + */ + PacketRef::NavClock(pkt) => { + let _bias = pkt.clk_b(); + let _drift = pkt.clk_d(); + // pkt.t_acc(); // phase accuracy + // pkt.f_acc(); // frequency accuracy + }, + /* + * Errors, Warnings + */ + PacketRef::InfTest(pkt) => { + if let Some(msg) = pkt.message() { + trace!("{}", msg); + } + }, + PacketRef::InfDebug(pkt) => { + if let Some(msg) = pkt.message() { + debug!("{}", msg); + } + }, + PacketRef::InfNotice(pkt) => { + if let Some(msg) = pkt.message() { + info!("{}", msg); + } + }, + PacketRef::InfError(pkt) => { + if let Some(msg) = pkt.message() { + error!("{}", msg); + } + }, + PacketRef::InfWarning(pkt) => { + if let Some(msg) = pkt.message() { + warn!("{}", msg); + } }, - */ _ => {}, } });
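
For readers of this patch, a minimal sketch of how the renamed test helpers introduced above (test_rinex, test_against_model) might be invoked from an integration test. The file names and the 1.0E-6 tolerance are illustrative assumptions, not part of this diff; only the helper signatures shown in the patch are relied upon.

// Hypothetical call site (not part of this patch): file names and the
// 1.0E-6 epsilon are assumptions for illustration only.
use rinex::prelude::*;

#[test]
fn dut_against_model() {
    let dut = Rinex::from_file("dut.rnx").expect("failed to parse DUT");
    let model = Rinex::from_file("model.rnx").expect("failed to parse model");
    // header sanity check: expected revision and GNSS description
    test_rinex(&dut, "3.04", Some("GPS"));
    // record comparison: every epoch and observable of the model must be
    // reproduced by the DUT within the given tolerance, otherwise we panic
    // with a detailed message
    test_against_model(&dut, &model, "dut.rnx", 1.0E-6);
}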
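
Similarly, a small sketch of the Version behaviour covered by the reworked version.rs tests in this patch: FromStr parsing and the Into impl rewritten as From&lt;Version&gt; for (u8, u8). Values mirror the assertions in the tests above; the module path rinex::version is an assumption.

// Illustrative only: mirrors the version.rs test assertions of this patch.
use rinex::version::Version;
use std::str::FromStr;

fn version_sketch() {
    let v = Version::from_str("3.02").expect("failed to parse version");
    assert_eq!((v.major, v.minor), (3, 2));
    // the Into<(u8, u8)> impl was rewritten as From<Version> for (u8, u8)
    let (major, minor): (u8, u8) = v.into();
    assert_eq!((major, minor), (3, 2));
}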