Internet Engineering Task Force (IETF) M. Bagnulo
Request for Comments: 8911 UC3M
Category: Standards Track B. Claise
ISSN: 2070-1721 Huawei
P. Eardley
BT
A. Morton
AT&T Labs
A. Akhter
Consultant
November 2021
Registry for Performance Metrics
Abstract
This document defines the format for the IANA Registry of Performance
Metrics. This document also gives a set of guidelines for Registered
Performance Metric requesters and reviewers.
Status of This Memo
This is an Internet Standards Track document.
This document is a product of the Internet Engineering Task Force
(IETF). It represents the consensus of the IETF community. It has
received public review and has been approved for publication by the
Internet Engineering Steering Group (IESG). Further information on
Internet Standards is available in Section 2 of RFC 7841.
Information about the current status of this document, any errata,
and how to provide feedback on it may be obtained at
https://www.rfc-editor.org/info/rfc8911.
Copyright Notice
Copyright (c) 2021 IETF Trust and the persons identified as the
document authors. All rights reserved.
This document is subject to BCP 78 and the IETF Trust's Legal
Provisions Relating to IETF Documents
(https://trustee.ietf.org/license-info) in effect on the date of
publication of this document. Please review these documents
carefully, as they describe your rights and restrictions with respect
to this document. Code Components extracted from this document must
include Revised BSD License text as described in Section 4.e of the
Trust Legal Provisions and are provided without warranty as described
in the Revised BSD License.
Table of Contents
1. Introduction
2. Terminology
3. Scope
4. Motivations for the Performance Metrics Registry
4.1. Interoperability
4.2. Single Point of Reference for Performance Metrics
4.3. Side Benefits
5. Criteria for Performance Metrics Registration
6. Performance Metrics Registry: Prior Attempt
6.1. Why This Attempt Should Succeed
7. Definition of the Performance Metrics Registry
7.1. Summary Category
7.1.1. Identifier
7.1.2. Name
7.1.3. URI
7.1.4. Description
7.1.5. Reference
7.1.6. Change Controller
7.1.7. Version (of Registry Format)
7.2. Metric Definition Category
7.2.1. Reference Definition
7.2.2. Fixed Parameters
7.3. Method of Measurement Category
7.3.1. Reference Method
7.3.2. Packet Stream Generation
7.3.3. Traffic Filter
7.3.4. Sampling Distribution
7.3.5. Runtime Parameters
7.3.6. Role
7.4. Output Category
7.4.1. Type
7.4.2. Reference Definition
7.4.3. Metric Units
7.4.4. Calibration
7.5. Administrative Information
7.5.1. Status
7.5.2. Requester
7.5.3. Revision
7.5.4. Revision Date
7.6. Comments and Remarks
8. Processes for Managing the Performance Metrics Registry Group
8.1. Adding New Performance Metrics to the Performance Metrics
Registry
8.2. Backward-Compatible Revision of Registered Performance
Metrics
8.3. Non-Backward-Compatible Deprecation of Registered
Performance Metrics
8.4. Obsolete Registry Entries
8.5. Registry Format Version and Future Changes/Extensions
9. Security Considerations
10. IANA Considerations
10.1. Registry Group
10.2. Performance Metrics Name Elements
10.3. New Performance Metrics Registry
11. Blank Registry Template
11.1. Summary
11.1.1. ID (Identifier)
11.1.2. Name
11.1.3. URI
11.1.4. Description
11.1.5. Reference
11.1.6. Change Controller
11.1.7. Version (of Registry Format)
11.2. Metric Definition
11.2.1. Reference Definition
11.2.2. Fixed Parameters
11.3. Method of Measurement
11.3.1. Reference Method
11.3.2. Packet Stream Generation
11.3.3. Traffic Filtering (Observation) Details
11.3.4. Sampling Distribution
11.3.5. Runtime Parameters and Data Format
11.3.6. Roles
11.4. Output
11.4.1. Type
11.4.2. Reference Definition
11.4.3. Metric Units
11.4.4. Calibration
11.5. Administrative Items
11.5.1. Status
11.5.2. Requester
11.5.3. Revision
11.5.4. Revision Date
11.6. Comments and Remarks
12. References
12.1. Normative References
12.2. Informative References
Acknowledgments
Authors' Addresses
1. Introduction
The IETF specifies and uses Performance Metrics of protocols and
applications transported over its protocols. Performance Metrics are
an important part of network operations using IETF protocols, and
[RFC6390] specifies guidelines for their development.
The definition and use of Performance Metrics in the IETF have been
fostered in various working groups (WGs). Most notably:
* The "IP Performance Metrics" (IPPM) WG is the WG primarily
focusing on Performance Metrics definition at the IETF.
* The "Benchmarking Methodology" WG (BMWG) defines many Performance
Metrics for use in laboratory benchmarking of internetworking
technologies.
* The "Metric Blocks for use with RTCP's Extended Report Framework"
(XRBLOCK) WG (concluded) specified many Performance Metrics
related to "RTP Control Protocol Extended Reports (RTCP XR)"
[RFC3611], which establishes a framework to allow new information
to be conveyed in RTCP, supplementing the original report blocks
defined in "RTP: A Transport Protocol for Real-Time Applications"
[RFC3550].
* The "IP Flow Information eXport" (IPFIX) WG (concluded) specified
an Internet Assigned Numbers Authority (IANA) process for new
Information Elements. Some Information Elements related to
Performance Metrics are proposed on a regular basis.
* The "Performance Metrics for Other Layers" (PMOL) WG (concluded)
defined some Performance Metrics related to Session Initiation
Protocol (SIP) voice quality [RFC6035].
It is expected that more Performance Metrics will be defined in the
future -- not only IP-based metrics but also metrics that are
protocol-specific and application-specific.
Despite the importance of Performance Metrics, there are two related
problems for the industry:
* First, ensuring that when one party requests that another party
measure (or report or in some way act on) a particular Performance
Metric, both parties have exactly the same understanding of what
Performance Metric is being referred to.
* Second, discovering which Performance Metrics have been specified,
to avoid developing a new Performance Metric that is very similar
but not quite interoperable.
These problems can be addressed by creating a Registry for
Performance Metrics with the Internet Assigned Numbers Authority
(IANA). As such, this document defines the new IANA Registry for
Performance Metrics.
Per this document, IANA has created and now maintains the Performance
Metrics Registry, according to the maintenance procedures and the
format defined in the sections below. The resulting Performance
Metrics Registry is for use by the IETF and others. Although the
Registry formatting specifications herein are primarily for Registry
creation by IANA, any other organization that wishes to create a
Performance Metrics Registry may use the same formatting
specifications for their purposes. The authors make no guarantee of
the Registry format's applicability to any possible set of
Performance Metrics envisaged by other organizations, but we
encourage others to apply it. In the remainder of this document,
unless we explicitly say otherwise, we will refer to the IANA-
maintained Performance Metrics Registry as simply the Performance
Metrics Registry.
2. Terminology
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
"SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and
"OPTIONAL" in this document are to be interpreted as described in
BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all
capitals, as shown here.
Performance Metric: A quantitative measure of performance, targeted
to an IETF-specified protocol or targeted to an application
transported over an IETF-specified protocol. Examples of
Performance Metrics are the FTP response time for a complete file
download, the DNS Response time to resolve the IP address(es), a
database logging time, etc. This definition is consistent with
the definition of a metric in [RFC2330] and broader than the
definition of a Performance Metric in [RFC6390].
Registered Performance Metric: A Performance Metric expressed as an
entry in the Performance Metrics Registry, administered by IANA.
Such a Performance Metric has met all of the Registry review
criteria defined in this document in order to be included in the
Registry.
Performance Metrics Registry: The IANA Registry containing
Registered Performance Metrics.
Proprietary Registry: A set of metrics that are registered in a
proprietary Registry, as opposed to the Performance Metrics
Registry.
Performance Metrics Experts: A group of designated experts [RFC8126]
selected by the IESG to validate the Performance Metrics before
updating the Performance Metrics Registry. The Performance
Metrics Experts work closely with IANA.
Parameter: An input factor defined as a variable in the definition
of a Performance Metric. A Parameter is a numerical or other
specified factor forming one of a set that defines a metric or
sets the conditions of its operation. All Parameters must be
known in order to make a measurement using a metric and interpret
the results. There are two types of Parameters: Fixed and
Runtime. For the Fixed Parameters, the value of the variable is
specified in the Performance Metrics Registry Entry and different
Fixed Parameter values result in different Registered Performance
Metrics. For the Runtime Parameters, the value of the variable is
defined when the Metric Measurement Method is executed and a given
Registered Performance Metric supports multiple values for the
Parameter. Although Runtime Parameters do not change the
fundamental nature of the Performance Metric's definition, some
have substantial influence on the network property being assessed
and interpretation of the results.
| Note: Consider the case of packet loss in the following two
| Active Measurement Method cases. The first case is packet loss
| as background loss where the Runtime Parameter set includes a
| very sparse Poisson stream and only characterizes the times
| when packets were lost. Actual user streams likely see much
| higher loss at these times, due to tail drop or radio errors.
| The second case is packet loss ratio as the complementary
| probability of delivery ratio where the Runtime Parameter set
| includes a very dense, bursty stream, and characterizes the
| loss experienced by a stream that approximates a user stream.
| These are both "Loss metrics", but the difference in
| interpretation of the results is highly dependent on the
| Runtime Parameters (at least), to the extreme where we are
| actually using loss ratio to infer its complementary
| probability: delivery ratio.
Active Measurement Methods: Methods of Measurement conducted on
traffic that serves only the purpose of measurement and is
generated for that reason alone, and whose traffic characteristics
are known a priori. The complete definition of Active Methods is
specified in Section 3.4 of [RFC7799]. Examples of Active
Measurement Methods are the Measurement Methods for the one-way
delay metric defined in [RFC7679] and the round-trip delay metric
defined in [RFC2681].
Passive Measurement Methods: Methods of Measurement conducted on
network traffic, generated by either (1) the end users or
(2) network elements that would exist regardless of whether the
measurement was being conducted or not. The complete definition
of Passive Methods is specified in Section 3.6 of [RFC7799]. One
characteristic of Passive Measurement Methods is that sensitive
information may be observed and, as a consequence, stored in the
measurement system.
Hybrid Measurement Methods: Methods of Measurement that use a
combination of Active Methods and Passive Methods, to assess
Active Metrics, Passive Metrics, or new metrics derived from the
a priori knowledge and observations of the stream of interest.
The complete definition of Hybrid Methods is specified in
Section 3.8 of [RFC7799].
3. Scope
This document is intended for two different audiences:
1. For those preparing a candidate Performance Metric, it provides
criteria that the proposal SHOULD meet (see Section 5). It also
provides instructions for writing the text for each column of the
candidate Performance Metric and the references required for the
new Performance Metrics Registry Entry (up to and including the
publication of one or more immutable documents such as an RFC)
(see Section 7).
2. For the appointed Performance Metrics Experts and for IANA
personnel administering the new IANA Performance Metrics
Registry, it defines a set of acceptance criteria against which a
candidate Registered Performance Metric should be evaluated, and
requirements for the composition of a candidate Performance
Metric Registry Entry.
Other organizations that standardize performance metrics are
encouraged to use the process defined in this memo to propose a
candidate Registered Performance Metric. In addition, this document
may be useful for other organizations who are defining a Performance
Metrics Registry of their own and may reuse the features of the
Performance Metrics Registry defined in this document.
This Performance Metrics Registry is applicable to Performance
Metrics derived from Active Measurement, Passive Measurement, and any
other form of Performance Metric. This Registry is designed to
encompass Performance Metrics developed throughout the IETF and
especially for the technologies specified in the following working
groups: IPPM, XRBLOCK, IPFIX, and BMWG. This document analyzes a
prior attempt to set up a Performance Metrics Registry and the
reasons why this design was inadequate [RFC6248].
[RFC8912] populates the new Registry with the initial set of entries.
4. Motivations for the Performance Metrics Registry
In this section, we detail several motivations for the Performance
Metrics Registry.
4.1. Interoperability
As with any IETF Registry, the primary intention is to manage the
registration of Identifiers for use within one or more protocols. In
the particular case of the Performance Metrics Registry, there are
two types of protocols that will use the Performance Metrics in the
Performance Metrics Registry during their operation (by referring to
the index values):
Control Protocol: This type of protocol is used to allow one entity
to request that another entity perform a measurement using a
specific metric defined by the Performance Metrics Registry. One
particular example is the Large-scale Measurement of Broadband
Performance (LMAP) framework [RFC7594]. Using the LMAP
terminology, the Performance Metrics Registry is used in the LMAP
Control Protocol to allow a Controller to schedule a Measurement
Task for one or more Measurement Agents. In order to enable this
use case, the entries in the Performance Metrics Registry must be
sufficiently defined to allow a Measurement Agent implementation
to trigger a specific Measurement Task upon the reception of a
Control Protocol message. This requirement heavily constrains the
types of entries that are acceptable for the Performance Metrics
Registry.
Report Protocol: This type of protocol is used to allow an entity to
report Measurement Results to another entity. By referring to a
specific Registered Performance Metric, it is possible to properly
characterize the Measurement Result data being reported. Using
the LMAP terminology, the Performance Metrics Registry is used in
the LMAP Report Protocol to allow a Measurement Agent to report
Measurement Results to a Collector.
It should be noted that the LMAP framework explicitly allows for
using not only the IANA-maintained Performance Metrics Registry but
also other registries containing Performance Metrics, i.e., either
(1) registries defined by other organizations or (2) private
registries. However, others who are creating registries to be used
in the context of an LMAP framework are encouraged to use the
Registry format defined in this document, because this makes it
easier for developers of LMAP Measurement Agents to programmatically
use information found in those other registries' entries.
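To make the Control Protocol use case concrete, the sketch below shows how a message might carry a Registry Identifier plus Runtime Parameters, and how a Measurement Agent could resolve the Identifier to a Registry Entry. This is purely illustrative: the field names (registry_id, runtime_parameters, and so on) and the toy registry are assumptions, not part of LMAP or any other specified protocol.

```python
# Hypothetical control message referencing a Registered Performance
# Metric by its numeric Identifier. Field names are illustrative only.
control_message = {
    "registry_id": 1,            # Identifier column of a Registry Entry
    "runtime_parameters": {      # values supplied at execution time
        "source": "192.0.2.1",
        "destination": "198.51.100.2",
    },
}

def resolve_metric(message, registry):
    """Look up the Registered Performance Metric that a control
    message names; the Measurement Agent would then execute the
    Measurement Method associated with that entry."""
    return registry[message["registry_id"]]

# Toy registry: Identifier -> entry Name (a real entry holds far more).
registry = {1: "example-metric-name"}
print(resolve_metric(control_message, registry))  # -> example-metric-name
```

The key point the sketch captures is that only the Identifier crosses the wire; both parties rely on the Registry Entry being defined tightly enough that the Agent can act on it unambiguously.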
4.2. Single Point of Reference for Performance Metrics
A Performance Metrics Registry serves as a single point of reference
for Performance Metrics defined in different working groups in the
IETF. As we mentioned earlier, there are several working groups that
define Performance Metrics in the IETF, and it is hard to keep track
of all of them. This results in multiple definitions of similar
Performance Metrics that attempt to measure the same phenomena but in
slightly different (and incompatible) ways. Having a Registry would
allow the IETF community and others to have a single list of relevant
Performance Metrics defined by the IETF (and others, where
appropriate). The single list is also an essential aspect of
communication about Performance Metrics, where different entities
that request measurements, execute measurements, and report the
results can benefit from a common understanding of the referenced
Performance Metric.
4.3. Side Benefits
There are a couple of side benefits of having such a Registry.
First, the Performance Metrics Registry could serve as an inventory
of useful and used Performance Metrics that are normally supported by
different implementations of Measurement Agents. Second, the results
of measurements using the Performance Metrics should be comparable
even if they are performed by different implementations and in
different networks, as the Performance Metric is properly defined.
BCP 176 [RFC6576] examines whether the results produced by
independent implementations are equivalent in the context of
evaluating the completeness and clarity of metric specifications.
[RFC6576] is a BCP [RFC2026] that defines the Standards Track
advancement testing for (Active) IPPM Metrics, and the same process
will likely suffice to determine whether Registered Performance
Metrics are sufficiently well specified to result in comparable (or
equivalent) results. If a Registered Performance Metric has
undergone such testing, this SHOULD be noted in "Comments and
Remarks" (see Section 7.6), with a reference to the test results.
5. Criteria for Performance Metrics Registration
It is neither possible nor desirable to populate the Performance
Metrics Registry with all combinations of Parameters of all
Performance Metrics. A Registered Performance Metric SHOULD be:
1. Interpretable by the human user.
2. Implementable by the software or hardware designer.
3. Deployable by network operators.
4. Accurate in terms of producing equivalent results, and for
interoperability and deployment across vendors.
5. Operationally useful, so that it has significant industry
interest and/or has seen deployment.
6. Sufficiently tightly defined, so that different values for the
Runtime Parameters do not change the fundamental nature of the
measurement or change the practicality of its implementation.
In essence, there needs to be evidence that (1) a candidate
Registered Performance Metric has significant industry interest or
has seen deployment and (2) there is agreement that the candidate
Registered Performance Metric serves its intended purpose.
6. Performance Metrics Registry: Prior Attempt
There was a previous attempt to define a Metrics Registry [RFC4148].
However, it was obsoleted by [RFC6248] because it was "found to be
insufficiently detailed to uniquely identify IPPM metrics... [there
was too much] variability possible when characterizing a metric
exactly", which led to the IPPM Metrics Registry defined in [RFC4148]
having "very few users, if any."
Three interesting additional quotes from [RFC6248] might help to
understand the issues related to that registry.
1. "It is not believed to be feasible or even useful to register
every possible combination of Type P, metric parameters, and
Stream parameters using the current structure of the IPPM Metrics
Registry."
2. "The current registry structure has been found to be
insufficiently detailed to uniquely identify IPPM metrics."
3. "Despite apparent efforts to find current or even future users,
no one responded to the call for interest in the RFC 4148
registry during the second half of 2010."
The current approach learns from this by tightly defining each
Registered Performance Metric with only a few variable (Runtime)
Parameters to be specified by the measurement designer, if any. The
idea is that entries in the Performance Metrics Registry stem from
different Measurement Methods that require input (Runtime) Parameters
to set factors like Source and Destination addresses (which do not
change the fundamental nature of the measurement). The downside of
this approach is that it could result in a large number of entries in
the Performance Metrics Registry. There is agreement that less is
more in this context -- it is better to have a reduced set of useful
metrics rather than a large set of metrics, some with questionable
usefulness.
6.1. Why This Attempt Should Succeed
As mentioned in the previous section, one of the main issues with the
previous Registry was that the metrics contained in the Registry were
too generic to be useful. This document specifies stricter criteria
for Performance Metric registration (see Section 5) and imposes a
group of Performance Metrics Experts that will provide guidelines to
assess if a Performance Metric is properly specified.
Another key difference between this attempt and the previous one is
that in this case there is at least one clear user for the
Performance Metrics Registry: the LMAP framework and protocol.
Because the LMAP protocol will use the Performance Metrics Registry
values in its operation, this actually helps to determine if a metric
is properly defined -- in particular, since we expect that the LMAP
Control Protocol will enable a Controller to request that a
Measurement Agent perform a measurement using a given metric by
embedding the Performance Metrics Registry Identifier in the
protocol. Such a metric and method are properly specified if they
are defined well enough so that it is possible (and practical) to
implement them in the Measurement Agent. This was the failure of the
previous attempt: a Registry Entry with an undefined Type-P
(Section 13 of [RFC2330]) allows measurement results to vary
significantly.
7. Definition of the Performance Metrics Registry
This Performance Metrics Registry is applicable to Performance
Metrics used for Active Measurement, Passive Measurement, and any
other form of Performance Measurement. Each category of measurement
has unique properties, so some of the columns defined below are not
applicable for a given metric category. In this case, the column(s)
SHOULD be populated with the "N/A" value (Not Applicable). However,
the "N/A" value MUST NOT be used by any metric in the following
columns: Identifier, Name, URI, Status, Requester, Revision, Revision
Date, Description. In the future, a new category of metrics could
require additional columns, and adding new columns is a recognized
form of Registry extension. The specification defining the new
column(s) MUST give general guidelines for populating the new
column(s) for existing entries.
The columns of the Performance Metrics Registry are defined below.
The columns are grouped into "Categories" to facilitate the use of
the Registry. Categories are described at the "Section 7.x" heading
level, and columns are described at the "Section 7.x.y" heading
level. The figure below illustrates this organization. An entry
(row) therefore gives a complete description of a Registered
Performance Metric.
Each column serves as a checklist item and helps to avoid omissions
during registration and Expert Review [RFC8126].
Registry Categories and Columns are shown below in this format:
Category
------------------...
Column | Column |...
Summary
----------------------------------------------------------------
Identifier | Name | URI | Desc. | Reference | Change | Ver |
| | | | | Controller |
Metric Definition
-----------------------------------------
Reference Definition | Fixed Parameters |
Method of Measurement
---------------------------------------------------------------------
Reference | Packet | Traffic | Sampling | Runtime | Role |
Method | Stream | Filter | Distribution | Parameters | |
| Generation |
Output
-----------------------------------------
Type | Reference | Units | Calibration |
| Definition | | |
Administrative Information
-------------------------------------
Status |Requester | Rev | Rev. Date |
Comments and Remarks
--------------------
A blank Registry template is provided in Section 11 of this memo.
7.1. Summary Category
7.1.1. Identifier
This column provides a numeric Identifier for the Registered
Performance Metric. The Identifier of each Registered Performance
Metric MUST be unique. Note that revising a Metric according to the
process in Section 8.2 creates a new entry in the Performance Metrics
Registry with the same Identifier.
The Registered Performance Metric unique Identifier is an unbounded
integer (range 0 to infinity).
The Identifier 0 should be Reserved. The Identifier values from
64512 to 65535 are reserved for private or experimental use, and the
user may encounter overlapping uses.
When adding new Registered Performance Metrics to the Performance
Metrics Registry, IANA SHOULD assign the lowest available Identifier
to the new Registered Performance Metric.
If a Performance Metrics Expert providing review determines that
there is a reason to assign a specific numeric Identifier, possibly
leaving a temporary gap in the numbering, then the Performance
Metrics Expert SHALL inform IANA of this decision.
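The assignment rules above (0 reserved, 64512-65535 set aside for private or experimental use, lowest available value otherwise) can be sketched non-normatively as follows; the function name and the sample Registry contents are invented for illustration:

```python
def next_identifier(assigned):
    """Return the lowest Identifier not yet assigned and not reserved.

    Identifier 0 is reserved, and 64512-65535 are reserved for private
    or experimental use, so both are skipped.
    """
    taken = set(assigned)
    candidate = 1  # 0 is reserved
    while True:
        if 64512 <= candidate <= 65535:  # private/experimental range
            candidate = 65536
        if candidate not in taken:
            return candidate
        candidate += 1

# A gap left intentionally by a Performance Metrics Expert (here, 3)
# remains available until it is assigned explicitly.
print(next_identifier([1, 2, 4]))  # -> 3
```

Note that this models only the default "lowest available" rule; a specific numeric Identifier requested by a Performance Metrics Expert would be assigned directly.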
7.1.2. Name
As the Name of a Registered Performance Metric is the first thing a
potential human implementer will use when determining whether it is
suitable for their measurement study, it is important to be as
precise and descriptive as possible. In the future, users will
review the Names to determine if the metric they want to measure has
already been registered, or if a similar entry is available, as a
basis for creating a new entry.
Names are composed of the following elements, separated by an
underscore character "_":
MetricType_Method_SubTypeMethod_..._Spec_Units_Output
MetricType: A combination of the directional properties and the
metric measured, such as and not limited to:
+-----------+--------------------------------------+
| RTDelay | Round-Trip Delay |
+-----------+--------------------------------------+
| RTDNS | Response Time Domain Name Service |
+-----------+--------------------------------------+
| RLDNS | Response Loss Domain Name Service |
+-----------+--------------------------------------+
| OWDelay | One-Way Delay |
+-----------+--------------------------------------+
| RTLoss | Round-Trip Loss |
+-----------+--------------------------------------+
| OWLoss | One-Way Loss |
+-----------+--------------------------------------+
| OWPDV | One-Way Packet Delay Variation |
+-----------+--------------------------------------+
| OWIPDV | One-Way Inter-Packet Delay Variation |
+-----------+--------------------------------------+
| OWReorder | One-Way Packet Reordering |
+-----------+--------------------------------------+
| OWDuplic | One-Way Packet Duplication |
+-----------+--------------------------------------+
| OWBTC | One-Way Bulk Transport Capacity |
+-----------+--------------------------------------+
| OWMBM | One-Way Model-Based Metric |
+-----------+--------------------------------------+
| SPMonitor | Single-Point Monitor |
+-----------+--------------------------------------+
| MPMonitor | Multi-Point Monitor |
+-----------+--------------------------------------+
Table 1
Method: One of the methods defined in [RFC7799], such as and not
limited to:
+-------------+----------------------------------------------+
| Active | depends on a dedicated measurement packet |
| | stream and observations of the stream as |
| | described in [RFC7799] |
+-------------+----------------------------------------------+
| Passive | depends *solely* on observation of one or |
| | more existing packet streams as described in |
| | [RFC7799] |
+-------------+----------------------------------------------+
| HybridType1 | Hybrid Type I observations on one stream |
| | that combine both Active Methods and Passive |
| | Methods as described in [RFC7799] |
+-------------+----------------------------------------------+
| HybridType2 | Hybrid Type II observations on two or more |
| | streams that combine both Active Methods and |
| | Passive Methods as described in [RFC7799] |
+-------------+----------------------------------------------+
| Spatial | spatial metric as described in [RFC5644] |
+-------------+----------------------------------------------+
Table 2
SubTypeMethod: One or more subtypes to further describe the features
of the entry, such as and not limited to:
+----------------+------------------------------------------------+
| ICMP | Internet Control Message Protocol |
+----------------+------------------------------------------------+
| IP | Internet Protocol |
+----------------+------------------------------------------------+
| DSCPxx | where xx is replaced by a Diffserv code point |
+----------------+------------------------------------------------+
| UDP | User Datagram Protocol |
+----------------+------------------------------------------------+
| TCP            | Transmission Control Protocol                  |
+----------------+------------------------------------------------+
| QUIC | QUIC transport protocol |
+----------------+------------------------------------------------+
| HS | Hand-Shake, such as TCP's 3-way HS |
+----------------+------------------------------------------------+
| Poisson | packet generation using Poisson distribution |
+----------------+------------------------------------------------+
| Periodic | periodic packet generation |
+----------------+------------------------------------------------+
| SendOnRcv | sender keeps one packet in transit by sending |
| | when previous packet arrives |
+----------------+------------------------------------------------+
| PayloadxxxxB | where xxxx is replaced by an integer, the |
| | number of octets or 8-bit Bytes in the Payload |
+----------------+------------------------------------------------+
| SustainedBurst | capacity test, worst case |
+----------------+------------------------------------------------+
| StandingQueue | test of bottleneck queue behavior |
+----------------+------------------------------------------------+
Table 3
SubTypeMethod values are separated by a hyphen "-" character,
which indicates that they belong to this element and that their
order is unimportant when considering Name uniqueness.
Spec: An immutable document Identifier combined with a document
section Identifier. For RFCs, this consists of the RFC number and
major section number that specifies this Registry Entry in the
form "RFCXXXXsecY", e.g., RFC7799sec3. Note: This element names
the document specifying the Registry Entry, not the primary
reference specification for the metric definition (e.g., [RFC7679]
is the primary reference specification for one-way delay metrics).
This element contains the placeholder "RFCXXXXsecY" until the RFC
number is assigned to the specifying document; it remains blank in
Private Registry Entries without a corresponding RFC. Anticipating
the "RFC10K" problem, the number
of the RFC continues to replace "RFCXXXX", regardless of the
number of digits in the RFC number. Anticipating Registry Entries
from other standards bodies, the form of this Name Element MUST be
proposed and reviewed for consistency and uniqueness by the Expert
Reviewer.
Units: The units of measurement for the output, such as and not
limited to:
+------------+----------------------------+
| Seconds | |
+------------+----------------------------+
| Ratio | unitless |
+------------+----------------------------+
| Percent | value multiplied by 100% |
+------------+----------------------------+
| Logical | 1 or 0 |
+------------+----------------------------+
| Packets | |
+------------+----------------------------+
| BPS | bits per second |
+------------+----------------------------+
| PPS | packets per second |
+------------+----------------------------+
| EventTotal | for unitless counts |
+------------+----------------------------+
| Multiple | more than one type of unit |
+------------+----------------------------+
| Enumerated | a list of outcomes |
+------------+----------------------------+
| Unitless | |
+------------+----------------------------+
Table 4
Output: The type of output resulting from measurement, such as and
not limited to:
+--------------+------------------------------------+
| Singleton | |
+--------------+------------------------------------+
| Raw | multiple singletons |
+--------------+------------------------------------+
| Count | |
+--------------+------------------------------------+
| Minimum | |
+--------------+------------------------------------+
| Maximum | |
+--------------+------------------------------------+
| Median | |
+--------------+------------------------------------+
| Mean | |
+--------------+------------------------------------+
| 95Percentile | 95th percentile |
+--------------+------------------------------------+
| 99Percentile | 99th percentile |
+--------------+------------------------------------+
| StdDev | standard deviation |
+--------------+------------------------------------+
| Variance | |
+--------------+------------------------------------+
| PFI | pass, fail, inconclusive |
+--------------+------------------------------------+
| FlowRecords | descriptions of flows observed |
+--------------+------------------------------------+
| LossRatio | lost packets to total packets, <=1 |
+--------------+------------------------------------+
Table 5
An example, as described in Section 4 of [RFC8912], is
RTDelay_Active_IP-UDP-Periodic_RFC8912sec4_Seconds_95Percentile
Note that private registries following the format described here
SHOULD use the prefix "Priv_" on any Name to avoid unintended
conflicts (further considerations are described in Section 10).
Private Registry Entries usually have no specifying RFC; thus, the
Spec: element has no clear interpretation.
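The Name structure above, including the rule that SubTypeMethod values are hyphen-separated and order-insensitive for uniqueness, can be sketched non-normatively as follows; the function names are invented, and the comparison assumes Names without the "Priv_" prefix:

```python
def compose_name(metric_type, method, subtypes, spec, units, output,
                 private=False):
    """Join the Name Elements with underscores; subtypes use hyphens."""
    name = "_".join([metric_type, method, "-".join(subtypes),
                     spec, units, output])
    return ("Priv_" + name) if private else name

def names_equal(a, b):
    """Compare Names; SubTypeMethod order is unimportant for uniqueness.

    Assumes neither Name carries the "Priv_" prefix, so the
    SubTypeMethod element is always the third underscore-separated part.
    """
    pa, pb = a.split("_"), b.split("_")
    if len(pa) != len(pb):
        return False
    pa[2], pb[2] = sorted(pa[2].split("-")), sorted(pb[2].split("-"))
    return pa == pb

name = compose_name("RTDelay", "Active", ["IP", "UDP", "Periodic"],
                    "RFC8912sec4", "Seconds", "95Percentile")
print(name)
# -> RTDelay_Active_IP-UDP-Periodic_RFC8912sec4_Seconds_95Percentile
```

Reordering the subtypes (e.g., "UDP-Periodic-IP") yields a Name that is distinct as a string but identical for uniqueness purposes, which is what the hyphen convention is meant to convey.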
7.1.3. URI
The URI column MUST contain a URL [RFC3986] that uniquely identifies
and locates the Metric Entry so it is accessible through the
Internet. The URL points to a file containing all of the human-
readable information for one Registry Entry. The URL SHALL reference
a target file that is preferably HTML-formatted and contains URLs to
referenced sections of HTMLized RFCs, or other reference
specifications. These target files for different entries can be more
easily edited and reused when preparing new entries. The exact form
of the URL for each target file, and the target file itself, will be
determined by IANA and reside on <https://www.iana.org/>. Section 4
of [RFC8912], as well as subsequent major sections of that document,
provide an example of a target file in HTML form.
7.1.4. Description
A Registered Performance Metric description is a written
representation of a particular Performance Metrics Registry Entry.
It supplements the Registered Performance Metric Name to help
Performance Metrics Registry users select relevant Registered
Performance Metrics.
7.1.5. Reference
This entry gives the specification containing the candidate Registry
Entry that was reviewed and agreed upon, if such an RFC or other
specification exists.
7.1.6. Change Controller
This entry names the entity responsible for approving revisions to
the Registry Entry and SHALL provide contact information (for an
individual, where appropriate).
7.1.7. Version (of Registry Format)
This column gives the version number of the Registry format in use
at the time the Performance Metric is registered. The format complying
with this memo MUST use 1.0. A new RFC that changes the Registry
format will designate a new version number corresponding to that
format. The version number of Registry Entries SHALL NOT change
unless the Registry Entry is updated to reflect the Registry format
(following the procedures in Section 8).
7.2. Metric Definition Category
This category includes columns to prompt all necessary details
related to the metric definition, including the immutable document
reference and values of input factors, called "Fixed Parameters",
which are left open in the immutable document but have a particular
value defined by the Performance Metric.
7.2.1. Reference Definition
This entry provides a reference (or references) to the relevant
sections of the document or documents that define the metric, as well
as any supplemental information needed to ensure an unambiguous
definition for implementations. A given reference needs to be an
immutable document, such as an RFC; for other standards bodies, it is
likely to be necessary to reference a specific, dated version of a
specification.
7.2.2. Fixed Parameters
Fixed Parameters are Parameters whose values must be specified in the
Performance Metrics Registry. The measurement system uses these
values.
Where referenced metrics supply a list of Parameters as part of their
descriptive template, a subset of the Parameters will be designated
as Fixed Parameters. As an example for Active Metrics, Fixed
Parameters determine most or all of the IPPM framework convention
"packets of Type-P" as described in [RFC2330], such as transport
protocol, payload length, TTL, etc. An example for Passive Metrics
is for an RTP packet loss calculation that relies on the validation
of a packet as RTP, which is a multi-packet validation controlled by
the MIN_SEQUENTIAL variable as defined by [RFC3550]. Varying
MIN_SEQUENTIAL values can alter the loss report, and this variable
could be set as a Fixed Parameter.
Parameters MUST have well-defined names. For human readers, the
hanging-indent style is preferred, and any Parameter names and
definitions that do not appear in the Reference Method Specification
MUST appear in this column (or the Runtime Parameters column).
Parameters MUST have a well-specified data format.
A Parameter that is a Fixed Parameter for one Performance Metrics
Registry Entry may be designated as a Runtime Parameter for another
Performance Metrics Registry Entry.
7.3. Method of Measurement Category
This category includes columns for references to relevant sections of
the immutable document(s) and any supplemental information needed to
ensure an unambiguous method for implementations.
7.3.1. Reference Method
This entry provides references to relevant sections of immutable
documents, such as RFC(s) (for other standards bodies, it is likely
to be necessary to reference a specific, dated version of a
specification) describing the Method of Measurement, as well as any
supplemental information needed to ensure unambiguous interpretation
for implementations referring to the immutable document text.
Specifically, this section should include pointers to pseudocode or
actual code that could be used for an unambiguous implementation.
7.3.2. Packet Stream Generation
This column applies to Performance Metrics that generate traffic as
part of their Measurement Method, including, but not necessarily
limited to, Active Metrics. The generated traffic is referred to as
a "stream", and this column describes its characteristics.
Each entry for this column contains the following information:
Value: The name of the packet stream scheduling discipline
Reference: The specification where the Parameters of the stream are
defined
The packet generation stream may require Parameters such as the
average packet rate and distribution truncation value for streams
with Poisson-distributed inter-packet sending times. If such
Parameters are needed, they should be included in either the Fixed
Parameters column or the Runtime Parameters column, depending on
whether they will be fixed or will be an input for the metric.
The simplest example of stream specification is singleton scheduling
(see [RFC2330]), where a single atomic measurement is conducted.
Each atomic measurement could consist of sending a single packet
(such as a DNS request) or sending several packets (for example, to
request a web page). Other streams support a series of atomic
measurements using pairs of packets, where the packet stream follows
a schedule defining the timing between transmitted packets, and an
atomic measurement assesses the reception time between successive
packets (e.g., a measurement of Inter-Packet Delay Variation). More
complex streams and measurement relationships are possible.
Principally, two different streams are used in IPPM Metrics:
(1) Poisson distributed, as described in [RFC2330], and (2) periodic,
as described in [RFC3432]. Both Poisson and periodic have their own
unique Parameters, and the relevant set of Parameter names and values
should be included in either the Fixed Parameters column or the
Runtime Parameters column.
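As a non-normative illustration of the two stream types, the sketch below generates send schedules for a Poisson stream (exponentially distributed inter-packet gaps with a truncation value) and a periodic stream. The Parameter names (rate, duration, truncation, interval) are examples of values that would appear in the Fixed or Runtime Parameters columns:

```python
import random

def poisson_schedule(rate_pps, duration_s, truncation_s, seed=0):
    """Send times with exponential gaps, each gap capped at truncation_s."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        gap = min(rng.expovariate(rate_pps), truncation_s)
        t += gap
        if t >= duration_s:
            return times
        times.append(t)

def periodic_schedule(interval_s, duration_s):
    """Send times at a fixed interval until the duration is reached."""
    times, t = [], interval_s
    while t < duration_s:
        times.append(t)
        t += interval_s
    return times

print(periodic_schedule(1, 5))  # -> [1, 2, 3, 4]
sched = poisson_schedule(rate_pps=100.0, duration_s=1.0, truncation_s=0.05,
                         seed=7)
print(len(sched))  # packet count varies with rate and seed
```

A real measurement system would consume such a schedule to time its transmissions; the truncation value bounds the longest silent gap, as discussed for Poisson streams in [RFC2330].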
7.3.3. Traffic Filter
This column applies to Performance Metrics that observe packets
flowing through (the device with) the Measurement Agent, i.e.,
packets that are not necessarily addressed to the Measurement Agent.
This includes, but is not limited to, Passive Metrics. The filter
specifies the traffic that is measured. This includes protocol field
values/ranges, such as address ranges, and flow or session
Identifiers.
The Traffic Filter depends on the needs of the metric and on a
balance between an operator's measurement needs and a user's need
for privacy. Mechanics for conveying the filter criteria might be
the BPF (Berkeley Packet Filter) or PSAMP (Packet Sampling) [RFC5475]
Property Match Filtering, which reuses IPFIX [RFC7012]. An example
BPF string for matching TCP/80 traffic to remote Destination net
192.0.2.0/24 would be "dst net 192.0.2.0/24 and tcp dst port 80".
More complex filter engines may allow for matching using Deep Packet
Inspection (DPI) technology.
The Traffic Filter includes the following information:
Type: The type of Traffic Filter used, e.g., BPF, PSAMP, OpenFlow
rule, etc., as defined by a normative reference
Value: The actual set of rules expressed
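The effect of the example BPF string above ("dst net 192.0.2.0/24 and tcp dst port 80") can be expressed directly, as in the non-normative sketch below. The packet representation (protocol name, destination address, destination port) is invented for illustration; a real system would hand the BPF string to a capture engine instead:

```python
from ipaddress import ip_address, ip_network

DST_NET = ip_network("192.0.2.0/24")

def matches(proto, dst_ip, dst_port):
    """True when a packet matches the example filter's three conditions."""
    return (proto == "tcp"
            and ip_address(dst_ip) in DST_NET
            and dst_port == 80)

print(matches("tcp", "192.0.2.17", 80))    # -> True
print(matches("udp", "192.0.2.17", 80))    # -> False  (wrong protocol)
print(matches("tcp", "198.51.100.1", 80))  # -> False  (outside dst net)
```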
7.3.4. Sampling Distribution
The sampling distribution defines, out of all of the packets that
match the Traffic Filter, which one or more of those packets are
actually used for the measurement. One possibility is "all", which
implies that all packets matching the Traffic Filter are considered,
but there may be other sampling strategies. It includes the
following information:
Value: The name of the sampling distribution
Reference definition: Pointer to the specification where the
sampling distribution is properly defined
The sampling distribution may require Parameters. If such Parameters
are needed, they should be included in either the Fixed Parameters
column or the Runtime Parameters column, depending on whether they
will be fixed or will be an input for the metric.
PSAMP is documented in "Sampling and Filtering Techniques for IP
Packet Selection" [RFC5475], while "A Framework for Packet Selection
and Reporting" [RFC5474] provides more background information. The
sampling distribution Parameters might be expressed in terms of the
model described in "Information Model for Packet Sampling Exports"
[RFC5477] and the process provided in "Flow Selection Techniques"
[RFC7014].
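As a non-normative sketch, the contrast between "all" and one sampling strategy from [RFC5475] (systematic count-based 1-in-N selection) looks like this; the Parameter N would appear in the Fixed or Runtime Parameters column:

```python
def sample_all(packets):
    """The "all" distribution: every filter-matching packet is used."""
    return list(packets)

def sample_one_in_n(packets, n):
    """Systematic count-based sampling: keep every Nth matching packet."""
    return [p for i, p in enumerate(packets) if i % n == 0]

pkts = list(range(10))  # stand-ins for filter-matching packets
print(sample_one_in_n(pkts, 4))  # -> [0, 4, 8]
```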
7.3.5. Runtime Parameters
In contrast to the Fixed Parameters, Runtime Parameters are
Parameters that do not change the fundamental nature of the
measurement and their values are not specified in the Performance
Metrics Registry. They are left as variables in the Registry, as an
aid to the measurement system implementer or user. Their values are
supplied on execution, configured into the measurement system, and
reported with the Measurement Results (so that the context is
complete).
Where metrics supply a list of Parameters as part of their
descriptive template, a subset of the Parameters will be designated
as Runtime Parameters.
Parameters MUST have well-defined names. For human readers, the
hanging-indent style is preferred, and the names and definitions that
do not appear in the Reference Method Specification MUST appear in
this column.
A data format for each Runtime Parameter MUST be specified in this
column, to simplify the control and implementation of measurement
devices. For example, Parameters that include an IPv4 address can be
encoded as a 32-bit integer (i.e., a binary base64-encoded value) or
"ip-address" as defined in [RFC6991]. The actual encoding(s) used
must be explicitly defined for each Runtime Parameter. IPv6
addresses and options MUST be accommodated, allowing Registered
Performance Metrics to be used in that address family. Other address
families are permissible.
Examples of Runtime Parameters include IP addresses, measurement
point designations, start times and end times for measurement, and
other information essential to the Method of Measurement.
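The two IPv4 encodings mentioned above (a 32-bit integer versus the textual "ip-address" form of [RFC6991]) convert into each other mechanically, and IPv6 works the same way with 128-bit integers. A non-normative sketch using the Python standard library:

```python
from ipaddress import IPv4Address, IPv6Address

addr = IPv4Address("192.0.2.1")
as_int = int(addr)              # 32-bit integer encoding
print(as_int)                   # -> 3221225985
print(IPv4Address(as_int))      # -> 192.0.2.1  (lossless round trip)

# IPv6 addresses MUST be accommodated; the integer form is 128 bits.
v6 = IPv6Address("2001:db8::1")
print(int(v6) == (0x20010db8 << 96) + 1)  # -> True
```

Whichever encoding a Registry Entry chooses, the point of this column is that the choice be explicit per Runtime Parameter, so that Controllers and Measurement Agents interoperate.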
7.3.6. Role
In some Methods of Measurement, there may be several Roles defined,
e.g., for a one-way packet delay Active Measurement, there is one
Measurement Agent that generates the packets and another Agent that
receives the packets. This column contains the name of the Role(s)
for this particular entry. In the one-way delay example above, there
should be two entries in the Registry's Role column, one for each
Role (Source and Destination). When a Measurement Agent is
instructed to perform the "Source" Role for the one-way delay metric,
the Agent knows that it is required to generate packets. The values
for this field are defined in the Reference Method of Measurement
(and this frequently results in abbreviated Role names such as
"Src").
When the Role column of a Registry Entry defines more than one Role,
the Role SHALL be treated as a Runtime Parameter and supplied for
execution. It should be noted that the LMAP framework [RFC7594]
distinguishes the Role from other Runtime Parameters.
7.4. Output Category
For entries that involve a stream and many singleton measurements, a
statistic may be specified in this column to summarize the results to
a single value. If the complete set of measured singletons is
output, this will be specified here.
Some metrics embed one specific statistic in the reference metric
definition, while others allow several output types or statistics.
7.4.1. Type
This column contains the name of the output type. The output type
defines a single type of result that the metric produces. It can be
the raw results (packet send times and singleton metrics), or it can
be a summary statistic. The specification of the output type MUST
define the format of the output. In some systems, format
specifications will simplify both measurement implementation and
collection/storage tasks. Note that if two different statistics are
required from a single measurement (for example, both "Xth percentile
mean" and "Raw"), then a new output type must be defined ("Xth
percentile mean AND Raw"). See Section 7.1.2 above for a list of
output types.
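As a non-normative illustration of the difference between a "Raw" output and a summary statistic such as "95Percentile": the nearest-rank method below is one common percentile definition, and the Registry Entry's output reference definition would pin down the exact one; the sample delays are invented:

```python
import math

def percentile_nearest_rank(values, p):
    """p-th percentile (0 < p <= 100) using the nearest-rank method."""
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based rank
    return ordered[rank - 1]

delays_ms = list(range(1, 101))  # stand-in singleton delay results ("Raw")
print(percentile_nearest_rank(delays_ms, 95))  # -> 95
```

Because percentile definitions differ (nearest rank, linear interpolation, etc.), two implementations that merely agree on "95Percentile" as a label may still disagree numerically, which is why this column requires the output format and type to be fully specified.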
7.4.2. Reference Definition
This column contains a pointer to the specification(s) where the
output type and format are defined.
7.4.3. Metric Units
The measured results must be expressed using some standard dimension
or units of measure. This column provides the units.
When a sample of singletons (see Section 11 of [RFC2330] for
definitions of these terms) is collected, this entry will specify the
units for each measured value.
7.4.4. Calibration
Some specifications for Methods of Measurement include the ability to
perform an error calibration. Section 3.7.3 of [RFC7679] is one
example. In the Registry Entry, this field will identify a method of
calibration for the metric, and, when available, the measurement
system SHOULD perform the calibration when requested and produce the
output with an indication that it is the result of a calibration
method. In-situ calibration could be enabled with an internal
loopback that includes as much of the measurement system as possible,
performs address manipulation as needed, and provides some form of
isolation (e.g., deterministic delay) to avoid send-receive interface
contention. Some portion of the random and systematic error can be
characterized in this way.
For one-way delay measurements, the error calibration must include an
assessment of the internal clock synchronization with its external
reference (this internal clock is supplying timestamps for
measurement). In practice, the time offsets of clocks at both the
Source and Destination are needed to estimate the systematic error
due to imperfect clock synchronization (the time offsets are
smoothed; thus, the random variation is not usually represented in
the results).
Both internal loopback calibration and clock synchronization can be
used to estimate the *available accuracy* of the Output Metric Units.
For example, repeated loopback delay measurements will reveal the
portion of the output result resolution that is the result of system
noise and is thus inaccurate.
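A non-normative sketch of what the loopback calibration estimates: the mean of repeated loopback delay measurements approximates the system's systematic contribution (to be subtracted from path measurements), and their spread bounds the usable resolution. The sample values below are invented:

```python
from statistics import mean, stdev

# Repeated loopback delay measurements, microseconds (illustrative values).
loopback_us = [12.1, 11.9, 12.3, 12.0, 12.2]

systematic_offset = mean(loopback_us)  # system's own delay contribution
noise_floor = stdev(loopback_us)       # resolution below this is noise

def calibrated(raw_delay_us):
    """Remove the estimated systematic error from a raw measurement."""
    return raw_delay_us - systematic_offset

print(round(systematic_offset, 2))  # -> 12.1
```

For one-way delay, an analogous correction would use the smoothed time offsets of the Source and Destination clocks against their external references, as described above.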
7.5. Administrative Information
7.5.1. Status
This entry indicates the status of the specification of this
Registered Performance Metric. Allowed values are 'Current',
'Deprecated', and 'Obsolete'. All newly defined Registered
Performance Metrics have 'Current' Status.
7.5.2. Requester
This entry indicates the requester for the Registered Performance
Metric. The requester MAY be a document (such as an RFC) or a
person.
7.5.3. Revision
This entry indicates the revision number of a Registered Performance
Metric, starting at 0 for Registered Performance Metrics at the time
of definition and incremented by one for each revision. However, in
the case of a non-backward-compatible revision, see Section 8.3.
7.5.4. Revision Date
This entry indicates the date of acceptance of the most recent
revision for the Registered Performance Metric. The date SHALL be
determined by IANA and the reviewing Performance Metrics Expert.
7.6. Comments and Remarks
Besides providing additional details that do not appear in other
categories, this open category (single column) allows unforeseen
issues to be addressed by simply updating this informational entry.
8. Processes for Managing the Performance Metrics Registry Group
Once a Performance Metric or set of Performance Metrics has been
identified for a given application, candidate Performance Metrics
Registry Entry specifications prepared in accordance with Section 7
should be submitted to IANA to follow the process for review by the
Performance Metrics Experts, as defined below. This process is also
used for other changes to a Performance Metrics Registry Entry, such
as deprecation or revision, as described later in this section.
It is desirable that the author(s) of a candidate Performance Metrics
Registry Entry seek review in the relevant IETF working group or
offer the opportunity for review on the working group mailing list.
8.1. Adding New Performance Metrics to the Performance Metrics Registry
Requests to add Registered Performance Metrics in the Performance
Metrics Registry SHALL be submitted to IANA, which forwards the
request to a designated group of experts (Performance Metrics
Experts) appointed by the IESG; these are the reviewers called for by
the Specification Required policy [RFC8126] defined for the
Performance Metrics Registry. The Performance Metrics Experts review
the request for such things as compliance with this document,
compliance with other applicable Performance Metrics-related RFCs,
and consistency with the currently defined set of Registered
Performance Metrics. The most efficient path for submission begins
with preparation of an Internet-Draft containing the proposed
Performance Metrics Registry Entry using the template in Section 11,
so that the submission formatting will benefit from the normal IETF
Internet-Draft submission processing (including HTMLization).
Submission to IANA may be during IESG review (leading to IETF
Standards Action), where an Internet-Draft proposes one or more
Registered Performance Metrics to be added to the Performance Metrics
Registry, including the text of the proposed Registered Performance
Metric(s).
If an RFC-to-be includes a Performance Metric and a proposed
Performance Metrics Registry Entry but the Performance Metrics
Expert's review determines that one or more of the criteria listed in
Section 5 have not been met, then the proposed Performance Metrics
Registry Entry MUST be removed from the text. Once evidence exists
that the Performance Metric meets the criteria in Section 5, the
proposed Performance Metrics Registry Entry SHOULD be submitted to
IANA to be evaluated in consultation with the Performance Metrics
Experts for registration at that time.
Authors of proposed Registered Performance Metrics SHOULD review
compliance with the specifications in this document to check their
submissions before sending them to IANA.
At least one Performance Metrics Expert should endeavor to complete
referred reviews in a timely manner. If the request is acceptable,
the Performance Metrics Experts signify their approval to IANA, and
IANA updates the Performance Metrics Registry. If the request is not
acceptable, the Performance Metrics Experts MAY coordinate with the
requester to change the request so that it is compliant; otherwise,
IANA SHALL coordinate resolution of issues on behalf of the expert.
The Performance Metrics Experts MAY choose to reject clearly
frivolous or inappropriate change requests outright, but such
exceptional circumstances should be rare.
If the proposed Metric is unique in a significant way, in order to
properly describe the Metric, it may be necessary to propose a new
Name Element Registry, or (more likely) a new Entry in an existing
Name Element Registry. This proposal is part of the request for the
new Metric, so that it undergoes the same IANA review and approval
process.
Decisions by the Performance Metrics Experts may be appealed per
Section 10 of [RFC8126].
8.2. Backward-Compatible Revision of Registered Performance Metrics
A request for revision is only permitted when the requested changes
maintain backward compatibility with implementations of the prior
Performance Metrics Registry Entry describing a Registered
Performance Metric (entries with lower revision numbers but having
the same Identifier and Name).
The purpose of the Status field in the Performance Metrics Registry
is to indicate whether the entry for a Registered Performance Metric
is 'Current', 'Deprecated', or 'Obsolete'. The term 'deprecated' is
used when an entry is replaced, either with a backwards-compatible
revision (this sub-section) or with a non-backwards-compatible
revision (in Section 8.3).
In addition, no policy is defined for revising the Performance Metric
Entries in the IANA Registry or addressing errors therein. To be
clear, changes and deprecations within the Performance Metrics
Registry are not encouraged and should be avoided to the extent
possible. However, in recognition that change is inevitable, the
provisions of this section address the need for revisions.
Revisions are initiated by sending a candidate Registered Performance
Metric definition to IANA, per Section 8.1, identifying the existing
Performance Metrics Registry Entry, and explaining how and why the
existing entry should be revised.
The primary requirement in the definition of procedures for managing
changes to existing Registered Performance Metrics is avoidance of
measurement interoperability problems; the Performance Metrics
Experts must work to maintain interoperability above all else.
Changes to Registered Performance Metrics may only be done in an
interoperable way; necessary changes that cannot be done in a way
that allows interoperability with unchanged implementations MUST
result in the creation of a new Registered Performance Metric (with a
new Name, replacing the RFCXXXXsecY portion of the Name) and possibly
the deprecation of the earlier metric.
A change to a Registered Performance Metric SHALL be determined to be
backward compatible when:
1. it involves the correction of an error that is obviously only
editorial, or
2. it corrects an ambiguity in the Registered Performance Metric's
definition, which itself leads to issues severe enough to prevent
the Registered Performance Metric's usage as originally defined,
or
3. it corrects missing information in the metric definition without
changing its meaning (e.g., the explicit definition of 'quantity'
semantics for numeric fields without a Data Type Semantics
value), or
4. it harmonizes with an external reference that was itself
corrected, or
5. the current Registry format has been revised by adding a new
column that is not relevant to an existing Registered Performance
Metric (i.e., the new column can be safely filled in with "Not
Applicable").
If a Performance Metric revision is deemed permissible and backward
compatible by the Performance Metrics Experts, according to the rules
in this document, IANA SHOULD execute the change(s) in the
Performance Metrics Registry. The requester of the change is
appended to the original requester in the Performance Metrics
Registry. The Name of the revised Registered Performance Metric,
including the RFCXXXXsecY portion of the Name, SHALL remain unchanged
even when the change is the result of IETF Standards Action. The
revised Registry Entry SHOULD reference the new immutable document,
such as an RFC. For other standards bodies, it is likely to be
necessary to reference a specific, dated version of a specification,
in an appropriate category and column.
Each Registered Performance Metric in the Performance Metrics
Registry has a revision number, starting at zero. Each change to a
Registered Performance Metric following this process increments the
revision number by one.
When a revised Registered Performance Metric is accepted into the
Performance Metrics Registry, the date of acceptance of the most
recent revision is placed into the Revision Date column of the
Registry for that Registered Performance Metric.
Where applicable, additions to Registered Performance Metrics in the
form of text in the Comments or Remarks column should include the
date, but such additions do not constitute a revision according to
this process.
Older versions of the updated Metric Entries are kept in the Registry
for archival purposes. The older entries are kept with all fields
unmodified (including Revision Date) except for the Status field,
which SHALL be changed to 'Deprecated'.
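The revision bookkeeping above (keep Name and Identifier, increment
Revision, stamp Revision Date, archive the old entry with only its
Status changed to 'Deprecated') can be sketched as follows. This is
an illustrative model only; the class and field names are hypothetical
and not part of the Registry specification.

```python
from dataclasses import dataclass, replace
from datetime import date

@dataclass(frozen=True)
class RegistryEntry:
    # Fields mirror a few of the Registry's Summary and
    # Administrative columns; the class itself is illustrative.
    identifier: int
    name: str
    revision: int = 0          # starts at zero per Section 8.2
    revision_date: str = ""    # YYYY-MM-DD
    status: str = "Current"    # 'Current', 'Deprecated', or 'Obsolete'

def revise(entry: RegistryEntry, archive: list) -> RegistryEntry:
    """Apply a backward-compatible revision per Section 8.2:
    Name and Identifier are unchanged, Revision is incremented,
    Revision Date is set to the date of acceptance, and the old
    entry is archived with Status 'Deprecated' (all other fields,
    including Revision Date, left unmodified)."""
    archive.append(replace(entry, status="Deprecated"))
    return replace(
        entry,
        revision=entry.revision + 1,
        revision_date=date.today().isoformat(),
    )

archive = []
v0 = RegistryEntry(identifier=1, name="ExampleMetric_Name")
v1 = revise(v0, archive)
assert v1.revision == 1 and v1.name == v0.name
assert v1.identifier == v0.identifier
assert archive[0].status == "Deprecated" and archive[0].revision == 0
```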
This process should not in any way be construed as allowing the
Performance Metrics Experts to overrule IETF consensus.
Specifically, any Registered Performance Metrics that were added to
the Performance Metrics Registry with IETF consensus require IETF
consensus for revision or deprecation.
8.3. Non-Backward-Compatible Deprecation of Registered Performance
Metrics
This section describes how to make a non-backward-compatible update
to a Registered Performance Metric. A Registered Performance Metric
MAY be deprecated and replaced when:
1. the Registered Performance Metric definition has an error or
shortcoming that cannot be permissibly changed per Section 8.2
("Revising Registered Performance Metrics"), or
2. the deprecation harmonizes with an external reference that was
itself deprecated through that reference's accepted deprecation
method.
A request for deprecation is sent to IANA, which passes it to the
Performance Metrics Experts for review. When deprecating a
Performance Metric, the Performance Metric Description in the
Performance Metrics Registry MUST be updated to explain the
deprecation, as well as to refer to the new Performance Metric
created to replace the deprecated Performance Metric.
When a new, non-backward-compatible Performance Metric replaces a
(now) deprecated metric, the revision number of the new Registered
Performance Metric is incremented over the value in the deprecated
version, and the current date is entered as the Revision Date of the
new Registered Performance Metric.
The intentional use of deprecated Registered Performance Metrics
should result in a log entry or human-readable warning by the
respective application.
Names and Metric IDs of deprecated Registered Performance Metrics
must not be reused.
The deprecated entries are kept with all Administrative columns
unmodified, except the Status field (which is changed to
'Deprecated').
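Two of the rules above lend themselves to a short sketch: emitting a
human-readable warning when a deprecated Registered Performance Metric
is used intentionally, and refusing to reuse the Names and Metric IDs
of deprecated entries. The registry contents below are hypothetical,
not real Registry entries.

```python
import logging

logging.basicConfig(level=logging.WARNING)

# Hypothetical registry snapshot: Identifier -> (Name, Status).
REGISTRY = {
    1: ("OldMetric_Example", "Deprecated"),
    2: ("NewMetric_Example", "Current"),
}

def lookup(identifier: int) -> str:
    """Resolve a metric by Identifier, logging the human-readable
    warning that Section 8.3 calls for on deprecated-metric use."""
    name, status = REGISTRY[identifier]
    if status == "Deprecated":
        logging.warning("Metric %s (ID %d) is deprecated; see its "
                        "Description for the replacement.",
                        name, identifier)
    return name

def can_register(identifier: int, name: str) -> bool:
    """Names and Metric IDs of deprecated (or any existing)
    Registered Performance Metrics must not be reused."""
    if identifier in REGISTRY:
        return False
    return all(name != existing for existing, _ in REGISTRY.values())

assert lookup(1) == "OldMetric_Example"        # logs a warning
assert can_register(1, "Anything") is False    # ID already assigned
assert can_register(3, "OldMetric_Example") is False  # name reuse blocked
assert can_register(3, "Fresh_Name") is True
```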
8.4. Obsolete Registry Entries
Existing Registry Entries may become obsolete over time due to:
1. the Registered Performance Metric is found to contain
considerable errors (and no one sees the value in the effort to
fix it), or
2. one or more critical References (or sections thereof) have been
designated obsolete by the SDO, or
3. other reasons brought to the attention of IANA and the Registry
Experts.
When a Performance Metric Registry Entry is declared obsolete, the
Performance Metric Description in the Performance Metrics Registry is
updated to explain the reasons the Entry is now obsolete and has not
been replaced (Deprecation always involves replacement).
Obsolete entries are kept with all Administrative columns unmodified,
except the Status field (which is changed to 'Obsolete').
8.5. Registry Format Version and Future Changes/Extensions
The Registry Format Version defined in this memo is 1.0, and
candidate Registry Entries complying with this memo MUST use 1.0.
The Registry Format can only be updated by publishing a new RFC with
the new format (Standards Action).
When a Registered Performance Metric is created or revised, it uses
the most recent Registry Format Version.
Only one form of Registry extension is envisaged:
Adding columns, or both categories and columns, to accommodate
unanticipated aspects of new measurements and metric categories.
If the Performance Metrics Registry is extended in this way, the
version number of future entries complying with the extension SHALL
be incremented (in either the unit or the tenths digit, depending on
the degree of extension).
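The versioning rule above leaves the choice of digit to the "degree of
extension". One possible reading, sketched below, is that adding
whole new categories (with their columns) increments the unit digit,
while column-only additions increment the tenths digit. This mapping
is an interpretation for illustration, not something the text
specifies.

```python
def bump_format_version(version: str, adds_categories: bool) -> str:
    """Sketch of the Section 8.5 versioning rule, under the
    (assumed) mapping: new categories bump the unit digit,
    column-only extensions bump the tenths digit.
    Registry Format Versions look like '1.0'."""
    major, minor = (int(part) for part in version.split("."))
    if adds_categories:
        return f"{major + 1}.0"    # unit digit incremented
    return f"{major}.{minor + 1}"  # tenths digit incremented

assert bump_format_version("1.0", adds_categories=False) == "1.1"
assert bump_format_version("1.0", adds_categories=True) == "2.0"
```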
9. Security Considerations
This document defines a Registry structure and does not itself
introduce any new security considerations for the Internet. The
definition of Performance Metrics for this Registry may introduce
some security concerns, but the mandatory reference specifications
should contain their own security considerations, and metric
definitions should be reviewed with security in mind whenever the
security considerations are not covered by one or more referenced
standards.
The aggregated results of the Performance Metrics described in this
Registry might reveal network topology information that may be
considered sensitive. If such cases are found, then access control
mechanisms should be applied.
10. IANA Considerations
With the background and processes described in earlier sections, IANA
has taken the actions described below.
10.1. Registry Group
The new Registry group is named Performance Metrics. This document
refers to it as the "Performance Metrics Group" or "Registry Group",
meaning all registrations appearing on
<https://www.iana.org/assignments/performance-metrics>.
For clarity, note that this document and [RFC8912] use the following
conventions to refer to the various IANA registries related to
Performance Metrics.
+===============+===========================+=====================+
| | RFC 8911 and RFC 8912 | IANA Web page |
+===============+===========================+=====================+
| Page Title | Performance Metrics Group | Performance Metrics |
+---------------+---------------------------+---------------------+
| Main Registry | Performance Metrics | Performance Metrics |
| | Registry | Registry |
+---------------+---------------------------+---------------------+
| Registry Row | Performance Metrics | registration (also |
| | Registry Entry | template) |
+---------------+---------------------------+---------------------+
Table 6
Registration Procedure: Specification Required
Reference: RFC 8911
Experts: Performance Metrics Experts
10.2. Performance Metrics Name Elements
This memo specifies and populates the Registries for the Performance
Metric Name Elements. The Name assigned to a Performance Metric
Registry Entry consists of multiple Elements separated by an "_"
(underscore), in the order defined in Section 7.1.2. IANA has
created the following registries, which contain the current set of
possibilities for each Element in the Performance Metric Name.
MetricType
Method
SubTypeMethod
Spec
Units
Output
At creation, IANA has populated the Registered Performance Metrics
Name Elements using the lists of values for each Name Element listed
in Section 7.1.2. The Name Elements in each Registry are case
sensitive.
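A candidate Name can be checked mechanically against these per-Element
registries: split on "_" and compare each Element, in the Section
7.1.2 order, case-sensitively. The registry contents below are a
small illustrative subset; the authoritative lists live at IANA.

```python
# Hypothetical, abbreviated Name Element registries (case-sensitive).
NAME_ELEMENT_REGISTRIES = {
    "MetricType": {"RTDelay", "RTLoss", "OWDelay"},
    "Method": {"Active", "Passive"},
    # SubTypeMethod, Spec, Units, and Output registries omitted here.
}

ELEMENT_ORDER = ["MetricType", "Method", "SubTypeMethod",
                 "Spec", "Units", "Output"]

def check_elements(name: str) -> list:
    """Split a candidate Name on '_' and report which Elements are
    not yet registered (and would therefore need a new Name Element
    registration alongside the Metric Entry request)."""
    parts = name.split("_")
    unregistered = []
    for element, value in zip(ELEMENT_ORDER, parts):
        registered = NAME_ELEMENT_REGISTRIES.get(element)
        if registered is not None and value not in registered:
            unregistered.append((element, value))
    return unregistered

ok = "RTDelay_Active_IP-UDP-Periodic_RFCXXXXsecY_Seconds_Percentile"
assert check_elements(ok) == []
# Case matters: 'rtdelay' does not match the registered 'RTDelay'.
assert check_elements("rtdelay_Active_a_b_c_d") == [("MetricType",
                                                     "rtdelay")]
```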
When preparing a Metric Entry for registration, the developer SHOULD
choose Name Elements from among the registered elements. However, if
the proposed metric is unique in a significant way, it may be
necessary to propose a new Name Element to properly describe the
metric, as described below.
A candidate Metric Entry proposes a set of values for its Name
Elements. These are reviewed by IANA and an Expert Reviewer. It is
possible that a candidate Metric Entry proposes a new value for a
Name Element (that is, one that is not in the existing list of
possibilities), or even that it proposes a new Name Element. Such
new assignments are administered by IANA through the Specification
Required policy [RFC8126], which includes Expert Review (i.e., review
by one of a group of Performance Metrics Experts, who are appointed
by the IESG upon recommendation of the Transport Area Directors).
10.3. New Performance Metrics Registry
This document specifies the Performance Metrics Registry. The
Registry contains the following columns in the Summary category:
Identifier
Name
URI
Description
Reference
Change Controller
Version
Descriptions of these columns and additional information found in the
template for Registry Entries (categories and columns) are further
defined in Section 7.
The Identifier 0 is reserved. The Registered Performance
Metric unique Identifier is an unbounded integer (range 0 to
infinity). The Identifier values from 64512 to 65535 are reserved
for private or experimental use, and the user may encounter
overlapping uses. When adding new Registered Performance Metrics to
the Performance Metrics Registry, IANA SHOULD assign the lowest
available Identifier to the new Registered Performance Metric. If a
Performance Metrics Expert providing review determines that there is
a reason to assign a specific numeric Identifier, possibly leaving a
temporary gap in the numbering, then the Performance Metrics Expert
SHALL inform IANA of this decision.
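The allocation rules above (Identifier 0 reserved, 64512-65535
reserved for private or experimental use, lowest available value
assigned unless a Performance Metrics Expert requests a specific
one) can be sketched as a simple allocator. This is an illustrative
sketch, not an IANA tool.

```python
PRIVATE_RANGE = range(64512, 65536)  # private/experimental use

def next_identifier(assigned: set, requested: int = None) -> int:
    """Assign the lowest available Identifier, skipping 0 and the
    private-use range, unless a specific value is requested by a
    Performance Metrics Expert (possibly leaving a temporary gap)."""
    if requested is not None:
        if (requested == 0 or requested in PRIVATE_RANGE
                or requested in assigned):
            raise ValueError("requested Identifier is reserved or in use")
        return requested
    candidate = 1
    while candidate in assigned or candidate in PRIVATE_RANGE:
        candidate += 1
    return candidate

assert next_identifier({1, 2, 4}) == 3
assert next_identifier({1, 5}, requested=10) == 10
# The Identifier space is unbounded, so allocation continues past
# the private-use range once 1..64511 are exhausted.
assert next_identifier(set(range(1, 64512))) == 65536
```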
Names starting with the prefix "Priv_" are reserved for private use
and are not considered for registration. The Name column entries are
further defined in Section 7.
The URI column will have a URL to each completed Registry Entry. The
Registry Entry text SHALL be HTMLized to aid the reader (similar to
the way that Internet-Drafts are HTMLized; the same tool can perform
this function), with links to the referenced section(s) of an RFC or
another immutable document.
The Reference column will include an RFC number, an approved
specification designator from another standards body, or some other
immutable document.
New assignments for the Performance Metrics Registry will be
administered by IANA through the Specification Required policy
[RFC8126] (which includes Expert Review, i.e., review by one of a
group of experts -- in the case of this document, the Performance
Metrics Experts, who are appointed by the IESG upon recommendation of
the Transport Area Directors) or by Standards Action. The experts
can be initially drawn from the Working Group Chairs, document
editors, and members of the Performance Metrics Directorate, among
other sources of experts.
Extensions to the Performance Metrics Registry require IETF Standards
Action. Only one form of Registry extension is envisaged:
* Adding columns, or both categories and columns, to accommodate
unanticipated aspects of new measurements and metric categories.
If the Performance Metrics Registry is extended in this way, the
version number of future entries complying with the extension SHALL
be incremented (in either the unit or the tenths digit, depending on
the degree of extension).
11. Blank Registry Template
This section provides a blank template to help IANA and Registry
Entry writers.
11.1. Summary
This category includes multiple indexes to the Registry Entry: the
element ID and Metric Name.
11.1.1. ID (Identifier)
<insert a numeric Identifier, an integer, TBD>
11.1.2. Name
<insert the Name, according to the metric naming convention>
11.1.3. URI
URL: https://www.iana.org/assignments/performance-metrics/ ... <Name>
11.1.4. Description
<provide a description>
11.1.5. Reference
<provide the RFC or other specification that contains the approved
candidate Registry Entry>
11.1.6. Change Controller
<provide information regarding the entity responsible for approving
revisions to the Registry Entry (including contact information for an
individual, where appropriate)>
11.1.7. Version (of Registry Format)
<provide the Registry Format Version, currently 1.0>
11.2. Metric Definition
This category includes columns to prompt the entry of all necessary
details related to the metric definition, including the immutable
document reference and values of input factors, called "Fixed
Parameters".
11.2.1. Reference Definition
<provide a full bibliographic reference to an immutable document>
<provide a specific section reference and additional clarifications,
if needed>
11.2.2. Fixed Parameters
<list and specify Fixed Parameters, input factors that must be
determined and embedded in the measurement system for use when
needed>
11.3. Method of Measurement
This category includes columns for references to relevant sections of
the immutable document(s) and any supplemental information needed to
ensure an unambiguous method for implementations.
11.3.1. Reference Method
<for the metric, insert relevant section references and supplemental
info>
11.3.2. Packet Stream Generation
<provide a list of generation Parameters and section/spec references
if needed>
11.3.3. Traffic Filtering (Observation) Details
This category provides the filter details (when present), which
qualify the set of packets that contribute to the measured results
from among all packets observed.
<provide a section reference>
11.3.4. Sampling Distribution
<insert time distribution details, or how this is different from the
filter>
11.3.5. Runtime Parameters and Data Format
Runtime Parameters are input factors that must be determined,
configured into the measurement system, and reported with the results
for the context to be complete.
<provide a list of Runtime Parameters and their data formats>
11.3.6. Roles
<list the names of the different Roles from the Measurement Method>
11.4. Output
This category specifies all details of the output of measurements
using the metric.
11.4.1. Type
<insert the name of the output type -- raw results or a selected
summary statistic>
11.4.2. Reference Definition
<describe the reference data format for each type of result>
11.4.3. Metric Units
<insert units for the measured results, and provide the reference
specification>
11.4.4. Calibration
<insert information on calibration>
11.5. Administrative Items
This category provides administrative information.
11.5.1. Status
<provide status: 'Current', 'Deprecated', or 'Obsolete'>
11.5.2. Requester
<provide a person's name, an RFC number, etc.>
11.5.3. Revision
<provide the revision number: starts at 0>
11.5.4. Revision Date
<provide the date, in YYYY-MM-DD format>
11.6. Comments and Remarks
<list any additional (informational) details for this entry>
12. References
12.1. Normative References
[RFC2026] Bradner, S., "The Internet Standards Process -- Revision
3", BCP 9, RFC 2026, DOI 10.17487/RFC2026, October 1996,
<https://www.rfc-editor.org/info/rfc2026>.
[RFC2119] Bradner, S., "Key words for use in RFCs to Indicate
Requirement Levels", BCP 14, RFC 2119,
DOI 10.17487/RFC2119, March 1997,
<https://www.rfc-editor.org/info/rfc2119>.
[RFC2330] Paxson, V., Almes, G., Mahdavi, J., and M. Mathis,
"Framework for IP Performance Metrics", RFC 2330,
DOI 10.17487/RFC2330, May 1998,
<https://www.rfc-editor.org/info/rfc2330>.
[RFC3986] Berners-Lee, T., Fielding, R., and L. Masinter, "Uniform
Resource Identifier (URI): Generic Syntax", STD 66,
RFC 3986, DOI 10.17487/RFC3986, January 2005,
<https://www.rfc-editor.org/info/rfc3986>.
[RFC5644] Stephan, E., Liang, L., and A. Morton, "IP Performance
Metrics (IPPM): Spatial and Multicast", RFC 5644,
DOI 10.17487/RFC5644, October 2009,
<https://www.rfc-editor.org/info/rfc5644>.
[RFC6390] Clark, A. and B. Claise, "Guidelines for Considering New
Performance Metric Development", BCP 170, RFC 6390,
DOI 10.17487/RFC6390, October 2011,
<https://www.rfc-editor.org/info/rfc6390>.
[RFC6576] Geib, R., Ed., Morton, A., Fardid, R., and A. Steinmitz,
"IP Performance Metrics (IPPM) Standard Advancement
Testing", BCP 176, RFC 6576, DOI 10.17487/RFC6576, March
2012, <https://www.rfc-editor.org/info/rfc6576>.
[RFC7799] Morton, A., "Active and Passive Metrics and Methods (with
Hybrid Types In-Between)", RFC 7799, DOI 10.17487/RFC7799,
May 2016, <https://www.rfc-editor.org/info/rfc7799>.
[RFC8126] Cotton, M., Leiba, B., and T. Narten, "Guidelines for
Writing an IANA Considerations Section in RFCs", BCP 26,
RFC 8126, DOI 10.17487/RFC8126, June 2017,
<https://www.rfc-editor.org/info/rfc8126>.
[RFC8174] Leiba, B., "Ambiguity of Uppercase vs Lowercase in RFC
2119 Key Words", BCP 14, RFC 8174, DOI 10.17487/RFC8174,
May 2017, <https://www.rfc-editor.org/info/rfc8174>.
12.2. Informative References
[RFC2681] Almes, G., Kalidindi, S., and M. Zekauskas, "A Round-trip
Delay Metric for IPPM", RFC 2681, DOI 10.17487/RFC2681,
September 1999, <https://www.rfc-editor.org/info/rfc2681>.
[RFC3432] Raisanen, V., Grotefeld, G., and A. Morton, "Network
performance measurement with periodic streams", RFC 3432,
DOI 10.17487/RFC3432, November 2002,
<https://www.rfc-editor.org/info/rfc3432>.
[RFC3550] Schulzrinne, H., Casner, S., Frederick, R., and V.
Jacobson, "RTP: A Transport Protocol for Real-Time
Applications", STD 64, RFC 3550, DOI 10.17487/RFC3550,
July 2003, <https://www.rfc-editor.org/info/rfc3550>.
[RFC3611] Friedman, T., Ed., Caceres, R., Ed., and A. Clark, Ed.,
"RTP Control Protocol Extended Reports (RTCP XR)",
RFC 3611, DOI 10.17487/RFC3611, November 2003,
<https://www.rfc-editor.org/info/rfc3611>.
[RFC4148] Stephan, E., "IP Performance Metrics (IPPM) Metrics
Registry", BCP 108, RFC 4148, DOI 10.17487/RFC4148, August
2005, <https://www.rfc-editor.org/info/rfc4148>.
[RFC5474] Duffield, N., Ed., Chiou, D., Claise, B., Greenberg, A.,
Grossglauser, M., and J. Rexford, "A Framework for Packet
Selection and Reporting", RFC 5474, DOI 10.17487/RFC5474,
March 2009, <https://www.rfc-editor.org/info/rfc5474>.
[RFC5475] Zseby, T., Molina, M., Duffield, N., Niccolini, S., and F.
Raspall, "Sampling and Filtering Techniques for IP Packet
Selection", RFC 5475, DOI 10.17487/RFC5475, March 2009,
<https://www.rfc-editor.org/info/rfc5475>.
[RFC5477] Dietz, T., Claise, B., Aitken, P., Dressler, F., and G.
Carle, "Information Model for Packet Sampling Exports",
RFC 5477, DOI 10.17487/RFC5477, March 2009,
<https://www.rfc-editor.org/info/rfc5477>.
[RFC6035] Pendleton, A., Clark, A., Johnston, A., and H. Sinnreich,
"Session Initiation Protocol Event Package for Voice
Quality Reporting", RFC 6035, DOI 10.17487/RFC6035,
November 2010, <https://www.rfc-editor.org/info/rfc6035>.
[RFC6248] Morton, A., "RFC 4148 and the IP Performance Metrics
(IPPM) Registry of Metrics Are Obsolete", RFC 6248,
DOI 10.17487/RFC6248, April 2011,
<https://www.rfc-editor.org/info/rfc6248>.
[RFC6991] Schoenwaelder, J., Ed., "Common YANG Data Types",
RFC 6991, DOI 10.17487/RFC6991, July 2013,
<https://www.rfc-editor.org/info/rfc6991>.
[RFC7012] Claise, B., Ed. and B. Trammell, Ed., "Information Model
for IP Flow Information Export (IPFIX)", RFC 7012,
DOI 10.17487/RFC7012, September 2013,
<https://www.rfc-editor.org/info/rfc7012>.
[RFC7014] D'Antonio, S., Zseby, T., Henke, C., and L. Peluso, "Flow
Selection Techniques", RFC 7014, DOI 10.17487/RFC7014,
September 2013, <https://www.rfc-editor.org/info/rfc7014>.
[RFC7594] Eardley, P., Morton, A., Bagnulo, M., Burbridge, T.,
Aitken, P., and A. Akhter, "A Framework for Large-Scale
Measurement of Broadband Performance (LMAP)", RFC 7594,
DOI 10.17487/RFC7594, September 2015,
<https://www.rfc-editor.org/info/rfc7594>.
[RFC7679] Almes, G., Kalidindi, S., Zekauskas, M., and A. Morton,
Ed., "A One-Way Delay Metric for IP Performance Metrics
(IPPM)", STD 81, RFC 7679, DOI 10.17487/RFC7679, January
2016, <https://www.rfc-editor.org/info/rfc7679>.
[RFC8912] Morton, A., Bagnulo, M., Eardley, P., and K. D'Souza,
"Initial Performance Metrics Registry Entries", RFC 8912,
DOI 10.17487/RFC8912, November 2021,
<https://www.rfc-editor.org/info/rfc8912>.
Acknowledgments
Thanks to Brian Trammell and Bill Cerveny, IPPM co-chairs during the
development of this memo, for leading several brainstorming sessions
on this topic. Thanks to Barbara Stark and Juergen Schoenwaelder for
the detailed feedback and suggestions. Thanks to Andrew McGregor for
suggestions on metric naming. Thanks to Michelle Cotton for her
early IANA review, and to Amanda Baber for answering questions
related to the presentation of the Registry and accessibility of the
complete template via URL. Thanks to Roni Even for his review and
suggestions to generalize the procedures. Thanks to all of the Area
Directors for their reviews.
Authors' Addresses
Marcelo Bagnulo
Universidad Carlos III de Madrid
Av. Universidad 30
28911 Leganes Madrid
Spain
Phone: 34 91 6249500
Email: marcelo@it.uc3m.es
URI: http://www.it.uc3m.es
Benoit Claise
Huawei
Email: benoit.claise@huawei.com
Philip Eardley
BT
Adastral Park, Martlesham Heath
Ipswich
United Kingdom
Email: philip.eardley@bt.com
Al Morton
AT&T Labs
200 Laurel Avenue South
Middletown, NJ 07748
United States of America
Email: acmorton@att.com
Aamer Akhter
Consultant
118 Timber Hitch
Cary, NC
United States of America
Email: aakhter@gmail.com