Appendix C - Operational and technical requirements on PNT/I output data
Clasification Society 2024 - Version 9.40
Statutory Documents - IMO Publications and Documents - Circulars - Maritime Safety Committee - MSC.1/Circular.1575 - Guidelines for Shipborne Position, Navigation And Timing (PNT) Data Processing – (16 June 2017) - Annex - Guidelines for Shipborne Position, Navigation and Timing (PNT) Data Processing - Appendix C - Operational and technical requirements on PNT/I output data

 Generally, requirements on data are specified as

  • (a) amount and types of PNT output data (including integrity and status data),

  • (b) accuracy and integrity of data content, and

  • (c) continuity and availability of data provision.

Appendix C provides guidance on the specifications for the accuracy and integrity levels intended for PNT output data.

1 Accuracy level

1.1 Accuracy definitions

Requirements on accuracy should preferably be specified by the 95% error boundaries regarding the absolute accuracy determined as the difference between the measured and reference (true) values (see figure C-1).

Requirements on precision should be defined by the standard deviation, which quantifies the scattering of measurements around their mean value E(xm). Consequently, the standard deviation is sufficient to specify the absolute accuracy only in the case of normally distributed errors with zero mean (E(xm)=0); in this case the 95% error boundary corresponds to the 2σ value range. Requirements on relative accuracy should take into account the accuracy of the reference used.
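The relation between the 95% error boundary and the 2σ range can be checked numerically. The following Python sketch (an illustration, not part of the circular) draws zero-mean, normally distributed errors and compares the empirical 95% absolute error boundary with twice the standard deviation:

```python
import random
import statistics

def error_boundary_95(errors):
    """95% absolute error boundary: the value below which 95% of the
    absolute measured-minus-reference errors fall."""
    ordered = sorted(abs(e) for e in errors)
    idx = int(0.95 * (len(ordered) - 1))
    return ordered[idx]

# Zero-mean normal errors: the 95% boundary approaches 1.96 * sigma,
# i.e. roughly the 2-sigma value range mentioned in the text.
random.seed(1)
sigma = 1.0
errors = [random.gauss(0.0, sigma) for _ in range(100_000)]
print(round(error_boundary_95(errors), 2))    # close to 1.96 * sigma
print(round(2 * statistics.stdev(errors), 2))  # close to 2.0 * sigma
```

For non-zero-mean or non-normal error distributions the two values diverge, which is why the standard deviation alone does not specify absolute accuracy.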

1.2 Operational accuracy level

Operational accuracy level should specify the required absolute accuracy of PNT output data based on current IMO specifications, if available, and future needs.

Table C-1 summarizes the operational accuracy level for PNT data intended as output of the PNT-DP supporting the application grades I, II, III or IV.

PNT Output Data         | Operational Accuracy Level               | Level of Confidence [%]
                        | A       | B      | C        | D          |
------------------------|---------|--------|----------|------------|------------------------
Horizontal Position [m] | 100.0   | 10.0   | 1.0      | 0.1        | 95
SOG [kn]                | 0.5     | 0.4    | 0.3      | 0.2        | 95
COG [°]                 | 3.0     | 1.0    | 0.5      | 0.1        | 95
Time                    | 1.0 s   | 0.1 s  | 0.0001 s | 50.0 ns    | 95
Heading [°]             | 1.5     | 1.0    | 0.5      | 0.2        | 95
ROT [°/s]               | 1.0     | 0.5    | 0.3      | 0.1        | 95
STW [kn]                | 0.5     | 0.4    | 0.3      | 0.2        | 95
CTW [°]                 | 3.0     | 1.0    | 0.5      | 0.1        | 95
Vertical Position [m]   | 10.0    | 5.0    | 1.0      | 0.5        | 95
Depth [m]               | 5.0     | 1.0    | 0.5      | 0.2        | 95
Pitch [°]               | 1.5     | 1.0    | 0.5      | 0.2        | 95
Roll [°]                | 1.5     | 1.0    | 0.5      | 0.2        | 95

Table C-1: Operational Accuracy Level for PNT Output Data

1.3 Technical accuracy level

Technical accuracy levels enable the gradual specification of task and application-related requirements and promote the performance description of individual technical solutions.

The following table provides an example for non-mandatory technical accuracy levels for horizontal position.

Absolute Accuracy Level (95%) in m

Operational | A 100.0  | B 10.0  | C 1.0    | D 0.1
Technical   | A.1 50.0 | B.1 5.0 | C.1 0.5  | D.1 0.05
            | A.2 35.0 | B.2 3.5 | C.2 0.35 |
            | A.3 25.0 | B.3 2.5 | C.3 0.25 |
            | A.4 15.0 | B.4 1.5 | C.4 0.15 |

Table C-2: Technical Accuracy Level for Horizontal Position

Note: A.2 is applied for the GPS/GLONASS specification (MSC.115(73)), A.3 for the BDS specification (MSC.379(93)) and A.4 for the GALILEO specification (MSC.233(82)). B.2 may be used for SBAS specification. B.1 to D.1 may be used to illustrate requirements for specific applications.
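The graded levels of table C-2 lend themselves to a simple lookup. The Python sketch below (names and function are illustrative, not from the circular) returns the most demanding technical level whose 95% bound covers a demonstrated horizontal position accuracy:

```python
# Technical accuracy levels for horizontal position (table C-2),
# listed finest first as (label, 95% bound in metres).
TECHNICAL_LEVELS = [
    ("D.1", 0.05), ("D", 0.1),
    ("C.4", 0.15), ("C.3", 0.25), ("C.2", 0.35), ("C.1", 0.5), ("C", 1.0),
    ("B.4", 1.5), ("B.3", 2.5), ("B.2", 3.5), ("B.1", 5.0), ("B", 10.0),
    ("A.4", 15.0), ("A.3", 25.0), ("A.2", 35.0), ("A.1", 50.0), ("A", 100.0),
]

def finest_level_met(accuracy_95_m):
    """Return the most demanding level whose bound covers the accuracy."""
    for label, bound in TECHNICAL_LEVELS:
        if accuracy_95_m <= bound:
            return label
    return None  # worse than level A

print(finest_level_met(13.0))  # "A.4" (the GALILEO-class bound per the note)
```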

2 Integrity level

2.1 General remarks

Generally, integrity data should be associated with individual PNT output data (or a set of it) and used to indicate the further usability of the data for multi-purpose nautical applications. As explained in Module B, the value of integrity data depends on the applied principles of integrity evaluation (N, L, M, H, …) in relation to the supported accuracy levels (A, B, C, …).

Therefore, provided integrity data should be completed at least with attributes characterizing the applied evaluation principle and the evaluated accuracy level in an unambiguous manner (see table C-3, left-hand side).

The attributes may be completed by an additional factor indicating whether the integrity evaluation is performed in relation to an operational or a technical accuracy level (see table C-3, right-hand side). If the factor is unspecified or set to 1, the integrity data are associated with the indicated operational accuracy level. A factor less than 1 specifies the technical accuracy level used for the integrity evaluation. This enables an application-oriented decision on the usability of the provided PNT data.

Attributes {accuracy level, evaluation principle}:

Operational        | Evaluation Principle
Accuracy Level (O) | N     | L     | M     | H
A                  | {A,N} | {A,L} | {A,M} | {A,H}
B                  | {B,N} | {B,L} | {B,M} | {B,H}
C                  | {C,N} | {C,L} | {C,M} | {C,H}
D                  | {D,N} | {D,L} | {D,M} | {D,H}

Factor F     | Accuracy level  | Example for operational level B
F = 1        | Operational (O) | B
tbd < F < 1  | Technical (T)   | T = O · F
F = 0        | Not applicable  | -

Table C-3: Attributes of integrity data and factor indicating the evaluated accuracy level

Note: tbd stands for the lower boundary of the factor, which results from the associated operational and technical levels.
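The factor mechanism of table C-3 can be sketched in a few lines of Python. This is an illustration only; the dictionary uses the horizontal-position bounds of table C-1, and the function name is an assumption:

```python
# Operational accuracy bounds for horizontal position (table C-1), in metres.
OPERATIONAL_BOUNDS_M = {"A": 100.0, "B": 10.0, "C": 1.0, "D": 0.1}

def evaluated_accuracy_m(level, factor):
    """Accuracy bound actually evaluated for integrity.
    F = 1 -> operational level; 0 < F < 1 -> technical level T = O * F;
    F = 0 -> integrity evaluation not applicable."""
    if factor == 0:
        return None
    return OPERATIONAL_BOUNDS_M[level] * factor

print(evaluated_accuracy_m("B", 1.0))   # 10.0: operational level B
print(evaluated_accuracy_m("B", 0.35))  # ~3.5: technical level B.2
```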

2.2 Requirements on integrity monitoring

2.2.1 Performance parameters

Typically, requirements on functions realizing the integrity monitoring of data in the GNSS sector or in aviation are specified by the alert limit, the time to alarm (TTA), and the residual integrity risk over a specified time period. Paragraph 122 of Module C states that a PNT-DP is embedded software contributing to the BAM of the mothering system by the provision of status and integrity data. Therefore, the use of alert limits and time to alarm may be misleading if they are used to formulate the requirements on integrity monitoring of the PNT-DP. To avoid misinterpretation with respect to BAM, the meaning of the performance parameters for integrity monitoring is generalized:

  • .1 Methods and thresholds used by the PNT-DP for integrity monitoring should be qualified to evaluate whether the supported accuracy level of the PNT output data has been achieved. Therefore the accuracy level (AL) is used as an intra-system "alert limit" or threshold value (see A.915(22)) to differentiate between fulfilled and failed requirements on the PNT data output.

  • .2 A.915(22) specifies the time to alarm as the time elapsed between the occurrence of a failure in the radionavigation system and its presentation on the bridge. A PNT-DP evaluates whether the PNT output data will fulfil the supported accuracy level, taking into account the performance of the used input data and the performed data processing. Therefore, the time to alarm (TTA) is better understood as the tolerated time span for accuracy evaluation by the PNT-DP.

  • .3 Residual integrity risk: the probability, defined for a specified period, that a positive evaluation result (the estimated inaccuracy is smaller than the applied accuracy level) is faulty (the inaccuracy of the PNT data output exceeds the required accuracy level).

2.2.2 Performance requirements

Resolution A.915(22) provides requirements on integrity monitoring in relation to accuracy of horizontal position. The following procedures should be adopted by the integrity monitoring function applied by the PNT-DP:

  • .1 If the integrity of the PNT output data is evaluated based on estimates of its accuracy, the applied AL should be the absolute accuracy level currently supported by the PNT-DP.

  • .2 If the integrity evaluation is performed with alternative performance identifiers and tests (not addressed to absolute accuracy), the AL should be determined by the expected value range of the used performance identifier. The ALs should be adapted to the currently supported accuracy level, if practicable.

  • .3 If the final evaluation result is derived from the combination of several test results, the applied analysis rules and decision criteria should be consistent with the currently supported accuracy of the PNT output data.

  • .4 The TTA is limited by the supported update rate (fupdate) for the PNT data provision:

  • TTA < 1/fupdate.

  • .5 With the increasing capability of integrity monitoring methods, the probability of an incorrect integrity assessment can be expected to decrease. From the point of view of safety-critical applications, only a residual integrity risk is tolerated. It is therefore recommended that manufacturers predetermine the integrity risk of the applied integrity monitoring methods, taking into account application-relevant time periods under nominal conditions, if practicable.

  • .6 If the PNT-DP supports a redundant provision of PNT and integrity data in relation to the same accuracy level, the integrity risk should be pre-evaluated for application-relevant time periods and provided as configuration parameter to ensure that the most reliable PNT data are selected for output (see paragraph 99.2).
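The TTA constraint in .4 above is a one-line check. This sketch is illustrative; the function name and rates are assumptions:

```python
# Check the TTA constraint TTA < 1/f_update: the tolerated time span for
# accuracy evaluation must fit within one PNT data update cycle.
def tta_ok(tta_s, f_update_hz):
    """True if the evaluation time fits within the update interval."""
    return tta_s < 1.0 / f_update_hz

print(tta_ok(0.05, 10.0))  # True: 50 ms evaluation at a 10 Hz output rate
print(tta_ok(0.2, 10.0))   # False: exceeds the 100 ms update interval
```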

2.3 Remarks to integrity data provision at output

Integrity data should be synchronized with the assigned PNT data. A prerequisite is the fulfilment of the requirement on TTA described in the previous section. However, if integrity data of external services and systems are needed to generate integrity data at the output of the PNT-DP, their latency should be taken into account. This implies that either the complete data provision is delayed or only provisional integrity data can be provided.

Integrity data can be provided

  • (a) as flags, or

  • (b) as floating data, carrying the estimated accuracy.

Results of the integrity evaluation are preferably provided as an estimate of the achieved accuracy, so that the final evaluation of usability can be done by multi-purpose nautical applications in relation to their own requirements on PNT data output.

The provision of flags is sufficient to indicate whether the considered accuracy level is most probably achieved, taking into account that the applied tests have been passed.
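Both output forms described in this section can be produced from the same accuracy estimate. The following Python sketch is illustrative; the field and function names are assumptions:

```python
# Derive both integrity output forms of section 2.3 from one estimate:
# the 'floating data' (the estimated accuracy itself) and a flag against
# the supported accuracy level (AL).
def integrity_output(estimated_accuracy_m, accuracy_level_m):
    """Return the estimate and a pass/fail flag against the AL."""
    flag = estimated_accuracy_m <= accuracy_level_m
    return {"estimate_m": estimated_accuracy_m, "level_met": flag}

out = integrity_output(estimated_accuracy_m=6.2, accuracy_level_m=10.0)
print(out)  # level B (10 m) most probably achieved
```

Passing the estimate through alongside the flag lets each nautical application apply its own, possibly stricter, requirement.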

3 Integrity explanations

In general, the use of different methods for integrity evaluation results in different values of integrity statements. A logical consequence is the implementation of different integrity levels (see chapter B) to reflect these differences and to avoid the misinterpretation of provided integrity information.

The application of plausibility and consistency tests, which are insufficient to prove the fulfilment of requirements on accuracy, is associated with a low level of integrity:

Data is considered plausible if the data content lies within a specified value range. The limits of the specified value range are determined by technical design parameters, typical behaviour, or both. For example, the shortest and largest distances between possible satellite and user positions, as well as typical measurement errors, determine the expected value range of GNSS-based distance measurements. As shown in figure C-2, plausibility tests are not sufficient to evaluate the current accuracy of a distance measurement. Another example: the position of a ship in operation is considered plausible if the ship's position is at sea, not ashore.

Often plausibility tests are applied only to various performance identifiers such as the number of tracked satellites, ranges and range rates, DOP values, noise, etc. However, plausibility tests are insufficient to prove that requirements on accuracy are met.

Figure C-2: Value range for plausibility tests
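A plausibility test of the kind shown in figure C-2 reduces to a range check. The sketch below is illustrative; the numeric bounds are rough, assumed figures (on the order of MEO satellite geometry), not values from the circular:

```python
# Plausibility test sketch: a GNSS pseudorange is plausible if it lies
# within the value range set by possible satellite-user geometry plus
# typical measurement errors. Bounds below are rough assumptions.
MIN_RANGE_M = 19_000_000.0  # roughly: satellite near zenith (MEO altitude)
MAX_RANGE_M = 27_000_000.0  # roughly: satellite near the horizon, with margin

def range_plausible(measured_range_m):
    """Passing says only 'not obviously wrong'; it proves no accuracy."""
    return MIN_RANGE_M <= measured_range_m <= MAX_RANGE_M

print(range_plausible(22_500_000.0))  # True, yet accuracy stays unknown
print(range_plausible(5_000.0))       # False: outside any possible geometry
```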

Consistency tests evaluate either the coherence between several data or the compliance of different data with a common measurement model. Figure C-3 illustrates simple, as well as enhanced, approaches of consistency tests:

  • (a) The example shown in (a) evaluates the consistency of successive data (e.g. ship's positions), indicated by triangles. The model of the ship's movement (curve) may be determined from historical data (e.g. by extrapolation), with support of other data sources (e.g. SDME), or using complementary measuring methods (e.g. Doppler). If the measured positions are close to the predicted positions (green triangles), they are considered consistent. If the difference between predicted and measured positions exceeds the level of tolerated inaccuracies (e.g. the 2σ circle around the predicted value), the position is marked as inconsistent (red triangle). This consistency test is insufficient to validate the currently supported position accuracy because the accuracy of the predicted value is undetermined.

Figure C-3: Variety of consistency tests (examples)

Note: BL = baseline as the true distance between the 2 sensor positions (e.g. antenna of GNSS receiver)

  • (b) Example (b) illustrates the true positions of two sensors (orange points) with different error distribution functions whose means have been adjusted to their true positions. It should be noted that in the case of horizontal positioning the error behaviour is described by a two-dimensional distribution function. The blue and the cyan triangle represent, as an example, the positions measured by sensors 1 and 2 (PosSensor1; PosSensor2). The upper graphic shows the case where the measurement errors of both sensors follow their nominal behaviour. This is assumed if the difference between both measurements is below the test threshold given by, for example:

  • |PosSensor1 − PosSensor2| < BL + k · σSensor1 + k · σSensor2

  • with σ as the standard deviation of the measuring error at the sensors and k as a scaling factor specifying the probability taken into account, e.g. k = 2 for 95%. The lower graphic illustrates the case where an increased measuring error at sensor 1 causes the position difference to exceed the test threshold. However, this consistency test can only attest that both sensors most probably operate according to their specified performance. An estimation of absolute accuracy is impossible.

  • (c) The reliability of the result of such consistency tests decreases if the data of the compared sensors are influenced by the same error sources, so that the probability increases that the errors at both sensors have the same magnitude and direction (e.g. GPS receivers with a short baseline or at the same antenna). Then it must be expected that the risk of undetected outliers increases. The example in (c) illustrates 4 time points with attested consistency; both positions are located within a common circle (grey line) with a diameter equal to the test threshold. However, both sensors fulfil the accuracy requirements only at time tn-2. At successive time points one or both measurements violate the accuracy requirements, whereby the large position errors (red crosses) at time tn+1 may remain undetected due to their correlated shift.

  • (d) Enhanced consistency tests evaluate the achieved processing results in relation to the used input data. This can be done on a logical level, e.g. it is impossible to provide a protection level by RAIM if only the signals of 4 GNSS satellites have been tracked. Alternatively, the enhanced consistency test may be performed under consideration of analytical dependencies: the threshold of the best attainable DOP per measuring setup is determined by the available number of ranging signals, taking into account the applied elevation mask and the current satellite geometry. A DOP value cannot fall below the setup-specific threshold (see graphic (d) in figure C-3). It is also possible to use statistical hypothesis tests to model the performance of PNT output data in dependence on the performance of the input data. An example is the precision of position, estimated as the product of DOP and the assumed standard deviation of ranging errors. It should be noted that the precision of position is a sufficient indicator of position accuracy only if the ranging errors follow a normal distribution with zero mean and the assumed standard deviation.
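The two-sensor test of example (b) can be sketched directly from its threshold formula. The Python below is an illustration under assumed values; function and parameter names are not from the circular:

```python
import math

# Consistency test from example (b): two measured positions are consistent
# if their separation stays below BL + k*sigma1 + k*sigma2.
def consistent(pos1, pos2, baseline_m, sigma1_m, sigma2_m, k=2.0):
    """Compare the measured separation with the test threshold."""
    separation = math.dist(pos1, pos2)
    threshold = baseline_m + k * sigma1_m + k * sigma2_m
    return separation < threshold

# Two antennas 10 m apart, each sensor with 1 m standard deviation (k=2):
print(consistent((0.0, 0.0), (10.5, 0.0), baseline_m=10.0,
                 sigma1_m=1.0, sigma2_m=1.0))  # True: within the 14 m threshold
print(consistent((0.0, 0.0), (30.0, 0.0), baseline_m=10.0,
                 sigma1_m=1.0, sigma2_m=1.0))  # False: a sensor fault is likely
```

As the text notes, passing this test attests only nominal sensor behaviour; correlated errors at both sensors can pass undetected.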

More recently, performance standards for maritime radionavigation receivers recommend the use of Receiver Autonomous Integrity Monitoring (RAIM) to evaluate the integrity of the provided position solution. RAIM applies consistency tests to answer two hypothesis-testing questions:

  • 1) Does a failure exist in the available range measurements?

  • 2) If so, which measurement has failed?

The application of consistency tests, and therefore the answering of both questions, depends on the availability of redundant range measurements: more than 4 ranges are needed for question 1 and more than 5 for question 2. Integrity of the provided position may only be assumed if RAIM has confirmed that the position is calculated with consistent range measurements, possibly after iterative answering of both questions in relation to different setups of range measurements.

Extended RAIM algorithms are also able to answer a third question:

  • 3) Does the currently provided position most probably meet the specified accuracy requirements?

This question is answered by calculation of a protection level based on the range measurements indicated as usable, the standard deviation of the range inaccuracies (nominal, modelled, or estimated), the satellite geometry, as well as the probabilities of false alerts and missed detection, whereby the latter should be specified in relation to specific applications.

However, a wide variety of RAIM implementations has been developed over the last decades. They are realized as snapshot schemes testing only the consistency of the current measurements, or as averaging and filtering schemes taking previous measurements into account to compensate for effects induced by the vessel's movement. They also differ in the applied search strategies for fault detection and isolation and, if supported, in the methods and parameters used for the determination of the protection level. Ultimately, the diversity of RAIM implementations makes it impossible to assign RAIM approaches in general to a single integrity level.

A position determined with consistent range measurements of a single GNSS may be assigned to a low integrity level due to the remaining sensitivity to systemic errors. No integrity is ensured if the position solution has been determined with ranges without proof of their consistency. A medium integrity level may be met by position solutions using ranges of two or more GNSS, for which consistency is attested in the range as well as in the position domain. The high integrity level should be assigned to RAIM implementations supporting the determination of a realistic protection level (PL) as the expected bound of position inaccuracies.
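The assignment just described can be summarized as a small decision function. This Python sketch is illustrative only; the decision keys are assumptions, not a normative classification:

```python
# Illustrative mapping from RAIM capability to integrity levels N/L/M/H,
# following the preceding paragraph's assignments.
def integrity_level(consistency_proven, multi_gnss, realistic_pl):
    if not consistency_proven:
        return "N"  # ranges used without any proof of consistency
    if realistic_pl:
        return "H"  # realistic protection level bounds the inaccuracy
    if multi_gnss:
        return "M"  # consistency attested in range and position domain
    return "L"      # single-GNSS solution, sensitive to systemic errors

print(integrity_level(True, False, False))  # "L"
```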

Figure C-4 illustrates, as an example, the determination of the PL by RAIM. Of the 6 satellites in view, only 5 measured ranges have passed the consistency tests.

Figure C-4: Exemplary determination of protection level

The left graphic shows the 6 position solutions which can be determined with the 5 consistent ranges: the all-in-view solution (PosAIV, orange rhombus) and the solutions achieved with any set of 4 ranges (dark blue points). The expected position error per solution is indicated as a blue circle, whose radius depends on the expected standard deviation of the position error (DOP-based projection of the expected standard deviation of the ranging errors into the position domain) and a factor k. The right graphic illustrates the dependency between the factor k and the required integrity risk, if a normal distribution of errors is assumed. In this example the largest distance of an expected position error (here Pos4) to the all-in-view solution (PosAIV) determines the protection level:

  • |Pos4 - PosAIV| + k · σ4 = PL

These examples illustrate that the truthfulness of the protection level depends on the correctness of the error modelling (distribution function and parameters) in relation to the current situation (value of the range errors) as well as on the specified performance requirements (e.g. the tolerable integrity risk).
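The construction around figure C-4 can be sketched as a maximum over the subset solutions. All numeric values below are invented for illustration; the choice k = 5.33 is an assumed example factor for a small integrity risk under normally distributed errors, not a value from the circular:

```python
import math

# Protection level as in figure C-4: the largest distance of a 4-range
# subset solution from the all-in-view solution, plus k times that
# subset's expected standard deviation of position error.
def protection_level(pos_aiv, subset_solutions, k=5.33):
    """subset_solutions: list of ((x, y), sigma_m) per 4-range subset."""
    return max(math.dist(pos_aiv, pos) + k * sigma
               for pos, sigma in subset_solutions)

# Invented subset solutions relative to PosAIV at the origin:
subsets = [((1.0, 0.5), 2.0), ((4.0, 3.0), 2.5), ((0.5, 1.0), 1.8)]
pl = protection_level((0.0, 0.0), subsets)
print(round(pl, 2))  # driven by the farthest subset solution
```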


Copyright 2022 Clasifications Register Group Limited, International Maritime Organization, International Labour Organization or Maritime and Coastguard Agency. All rights reserved. Clasifications Register Group Limited, its affiliates and subsidiaries and their respective officers, employees or agents are, individually and collectively, referred to in this clause as 'Clasifications Register'. Clasifications Register assumes no responsibility and shall not be liable to any person for any loss, damage or expense caused by reliance on the information or advice in this document or howsoever provided, unless that person has signed a contract with the relevant Clasifications Register entity for the provision of this information or advice and in that case any responsibility or liability is exclusively on the terms and conditions set out in that contract.